Name one thing artificial intelligence companies seem to prize even more than acquiring additional computing power, a pursuit that often borders on obsession. I'll wait. OpenAI, true to form, has again demonstrated its appetite for scale with its latest move: a $38 billion agreement with Amazon's AWS, part of its ongoing campaign to lock up a commanding share of cloud capacity. These deals have come so frequently, and sound so much alike, that they have started to blur together. To make sense of it all, I reached out to my colleague Alistair Barr, whose coverage of this beat gives him a front-row perspective. (He also writes the excellent Tech Memo newsletter, which is well worth a subscription.)

So, let’s dive into the details.

**Dan:** I once joked that OpenAI seems to finalize a new compute deal on every day of the week that ends in a "y." It's starting to feel less like a joke and more like an accurate description of the company's strategy. What's your high-level assessment of its latest arrangement with AWS?

**Alistair:** Until recently, OpenAI's longstanding, deeply integrated partnership with Microsoft effectively kept Amazon from working directly with OpenAI's models. But Microsoft revised its sweeping agreement with OpenAI just last week, loosening the exclusivity provisions and giving the startup far more freedom to work with other cloud providers. Given that opening, one of the most logical first calls is Amazon Web Services, still the largest player in cloud computing, to secure additional infrastructure. AWS remains a central platform on which industrial-scale machine learning projects depend.

**Dan:** But just because OpenAI *can* strike these deals doesn't mean it *should*. The company already has future compute commitments estimated at more than $1 trillion, a staggering figure by any measure. Executives like to argue that the greater danger lies not in overspending but in underinvesting. Still, there must be some threshold where that argument loses force. Or perhaps my caution just betrays a lack of the grand financial vision that lets venture capitalists commit billions without hesitation.

**Alistair:** A comment from OpenAI's chief financial officer, Sarah Friar, earlier this year sheds light on that reasoning. She said the company has to make hard choices about which initiatives *not* to pursue simply because, even at its scale, its computing capacity is limited. For a leader with big ambitions, knowing that constrained infrastructure could stall progress must be deeply frustrating. Seen that way, aggressively pursuing as many cloud and AI-compute partnerships as possible isn't reckless; it's pragmatic, arguably necessary.

**Dan:** Fair enough. But I still wonder whether these enormous investments are translating into meaningful financial returns. You recently reported on a slowdown in enterprise adoption of AI, which suggests a gap between the sweeping promises AI executives make and the measured, sometimes cautious, interest shown by actual corporate customers.

**Alistair:** That's a legitimate concern. Remember, OpenAI is still a startup at its core, and its ability to generate sustainable profits is inherently uncertain. But this pattern is a familiar one in the technology sector. Many of today's largest and most successful firms spent years burning capital while they built user bases and worked out how to monetize them. Amazon endured long stretches of seemingly endless investment before profitability finally arrived. Meta initially looked overextended, yet it now generates roughly $70 billion in annual profit. It's a common trajectory: scale first, profits later.

**Dan:** That makes sense. Still, I can’t help feeling uneasy about the sheer magnitude of OpenAI’s current expenditures. It’s an audacious strategy—one that could either redefine the technology landscape or strain its finances in the long run. I suspect Sam Altman might not be thrilled to hear me say that.

**Alistair:** Admittedly, the long-term payoff may ultimately hinge on how successfully OpenAI commercializes its products. If it puts search-style advertisements directly into ChatGPT, for example, the implications could be enormous. Imagine a billion users voluntarily typing in their questions, desires, and ambitions, effectively producing the most detailed dataset of human intent since the advent of Google Search. For context, Google earns roughly $100 billion a year in profit from Search alone. If OpenAI captured even half that figure, ChatGPT advertising could yield $50 billion in profit. The same playbook could plausibly apply to Sora, its AI-driven video network: using Meta's roughly $70 billion in annual profit as the benchmark and again assuming a 50 percent capture, that's another $35 billion.

Layer on top of that the other potential revenue streams: consumer and enterprise subscriptions, a possible Apple-style device business, productivity-oriented AI tools for the workplace, and even a curated AI-powered app marketplace. If those ventures perform well, OpenAI could ultimately approach $100 billion in annual profit, returns that would justify its massive data center buildout and place it among the most financially powerful companies in the world. Still, none of this is preordained. As venture capitalist Marc Andreessen mused, "Is OpenAI the next Google?" That remains the trillion-dollar question. And if I already knew the answer, I'd probably be investing in that future rather than simply writing about it.
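(For the curious, here is the back-of-envelope math Alistair walks through above, written out as a quick sketch. The figures are the round numbers quoted in this conversation, not reported financials, and the 50 percent "capture" rate is purely hypothetical.)

```python
# Back-of-envelope sketch of the hypothetical profit scenario described above.
# All figures are the round numbers quoted in the conversation, not reported
# financials, and the 50% "capture" rate is purely illustrative.

google_search_profit_b = 100.0  # ~$100B/year in Search profit, as cited above
meta_profit_b = 70.0            # ~$70B/year in Meta profit, as cited above
capture_rate = 0.5              # hypothetical share OpenAI captures of each benchmark

chatgpt_ads_b = google_search_profit_b * capture_rate  # ~$50B from ChatGPT ads
sora_ads_b = meta_profit_b * capture_rate              # ~$35B from Sora ads

ads_total_b = chatgpt_ads_b + sora_ads_b               # ~$85B from advertising alone
remaining_b = 100.0 - ads_total_b                      # ~$15B left for subscriptions,
                                                       # devices, tools, app marketplace

print(f"ChatGPT ads: ~${chatgpt_ads_b:.0f}B, Sora ads: ~${sora_ads_b:.0f}B")
print(f"Other streams would need ~${remaining_b:.0f}B to reach the $100B milestone")
```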

**The Business Insider Today Team:**
Dan DeFrancesco, Deputy Executive Editor and Anchor, New York; Hallam Bullock, Senior Editor, London; Akin Oyedele, Deputy Editor, New York; Grace Lett, Editor, New York; and Amanda Yen, Associate Editor, New York.

Source: https://www.businessinsider.com/openai-amazon-aws-compute-deal-cloud-ai-adoption-customer-demand-2025-11