If OpenAI hopes to maintain its edge in the increasingly competitive global race for AI dominance, it may need to secure not just advanced computing hardware but a robust, independent source of energy to sustain it. The ChatGPT maker has entered a multibillion-dollar partnership with AMD under which it will deploy six gigawatts' worth of the chipmaker's most powerful AI-optimized chips. The agreement comes shortly after another major announcement: OpenAI and Nvidia revealed plans for an additional ten gigawatts of computing capacity, underscoring the company's aggressive expansion.

OpenAI has repeatedly acknowledged that its growth strategy hinges on access to immense computational resources, which it needs to train and operate its increasingly sophisticated machine learning models. But those rapidly growing compute requirements have produced a second, equally urgent challenge: acquiring the enormous quantities of electricity needed to keep the chips running consistently and sustainably. Surging demand for AI computation is already straining the U.S. electrical grid, pushing it close to its operational limits. Utilities nationwide have warned that nearly sixty gigawatts of additional generation capacity, roughly the power consumption of six large metropolitan areas, may be needed before the end of the decade. Yet expanding power infrastructure in the United States is no simple task: it typically takes years of planning, engineering assessments, environmental reviews, and regulatory hearings before new facilities come online.

Against this backdrop, the breakneck pace of new chip deployments has intensified debate among energy and technology experts who contend that traditional public utilities can no longer reliably meet the data industry's surging demand. Increasingly, analysts urge major data center operators to generate their own electricity through on-site or self-contained power systems. Sean Farney, vice president of data center strategy for the Americas at Jones Lang LaSalle, summarized the situation succinctly: the sector has long recognized that grid capacity is nearing its maximum, so the only path forward is innovation in self-contained power systems that free operators from dependence on an overstretched grid.

OpenAI has already begun moving in that direction. Its data center in Abilene, Texas, part of the company's larger Stargate project, is reportedly powered by its own on-site natural gas plant, an early example of AI companies building energy production into their infrastructure plans. The trend is not confined to OpenAI: Elon Musk is reportedly supplying electricity to xAI's Memphis data center with mobile natural gas turbines while also purchasing land in Mississippi that once housed a gas plant. Together, these moves hint at a broader shift in how the technology industry approaches energy, from passive consumption toward active generation and long-term self-reliance.

AMD confirmed in a statement that the six gigawatts of computing capacity outlined in the new partnership come on top of OpenAI's previously announced commitments. OpenAI declined to comment. The five-year deal, AMD explained, is structured to align incentives: OpenAI will receive warrants for up to 160 million AMD shares, roughly ten percent of the company, vesting incrementally as each gigawatt of computing capacity is successfully deployed.
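
To make the milestone structure concrete, here is a minimal sketch of how such a per-gigawatt vesting schedule could be modeled. It assumes, purely for illustration, an even split of the 160 million warrant shares across the six one-gigawatt milestones; the actual tranche sizes and vesting conditions were not disclosed in the article.

```python
# Hypothetical model of a milestone-based warrant schedule.
# ASSUMPTION: an even split of the 160M warrant shares across
# six 1 GW milestones; the real tranche sizes and vesting
# conditions were not disclosed in the article.

TOTAL_WARRANT_SHARES = 160_000_000
TOTAL_GIGAWATTS = 6
SHARES_PER_GIGAWATT = TOTAL_WARRANT_SHARES // TOTAL_GIGAWATTS  # ~26.7M


def vested_shares(gigawatts_deployed: float) -> int:
    """Warrant shares vested once a given number of whole gigawatts is online."""
    completed_milestones = min(int(gigawatts_deployed), TOTAL_GIGAWATTS)
    return completed_milestones * SHARES_PER_GIGAWATT


if __name__ == "__main__":
    for gw in range(1, TOTAL_GIGAWATTS + 1):
        print(f"{gw} GW deployed -> {vested_shares(gw):,} shares vested")
```

Under this even-split assumption, each completed gigawatt would vest roughly 26.7 million shares. In practice, milestone-based deals often weight tranches unevenly or attach additional conditions, so this is only an illustration of the incentive structure, not the deal's actual terms.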

According to AMD CEO Lisa Su, the first wave of energy-intensive hardware, representing one gigawatt of capacity, is expected to begin rolling out in the second half of 2026. She added that the timeline for deploying all six gigawatts will depend largely on how quickly the two companies can secure reliable access to adequate power. "The guiding intention," Su said, "is to move forward with deployment as rapidly as conditions permit." Her remark reflects both a determination to accelerate and an acknowledgment that energy availability has become one of the defining constraints, and battlegrounds, of the AI buildout.

Source: https://www.businessinsider.com/openai-amd-deal-highlights-ai-strain-on-power-grid-2025-10