OpenAI's Data Center Shift Spooks Investors Before IPO
OpenAI's pivot to proprietary data centers raises Wall Street concerns about capital intensity, profitability, and competitive positioning ahead of its anticipated 2026 IPO.
OpenAI is spending $40 billion to build out its own data center infrastructure, a capital commitment that has rattled investors already nervous about the company's path to profitability ahead of its anticipated 2026 IPO.
The figure, confirmed by three people familiar with the company's planning, represents more than double what OpenAI spent on compute in all of 2024. And it comes at a delicate moment. The ChatGPT maker burned through $8.5 billion last year while generating roughly $3.7 billion in revenue, according to financial documents reviewed by The Information. Its projected 2025 losses now approach $14 billion before accounting for the new infrastructure buildout.
Why OpenAI is betting the farm on silicon
The company hasn't been subtle about its ambitions. CEO Sam Altman told reporters in January that OpenAI would need "an order of magnitude more compute" to train its next generation of models. What he didn't say then: OpenAI wants to own that compute, not rent it.
For years, OpenAI relied almost entirely on Microsoft's Azure infrastructure, a relationship cemented by the tech giant's $13 billion investment. That arrangement gave OpenAI priority access to Nvidia's most advanced GPUs without the headache of managing physical data centers. But the dependency came with constraints. Microsoft dictated deployment timelines, limited OpenAI's ability to negotiate with other cloud providers, and took a reported 20% cut of all API revenue.
The new strategy flips that model. OpenAI is now directly negotiating with chip manufacturers, signing preliminary agreements with Broadcom and Marvell for custom AI accelerators, and scouting locations in Texas, Arizona, and the Midwest for facility construction. The goal, according to one executive briefed on the plans: cut inference costs by 60% within three years while eliminating the "tax" paid to cloud middlemen.
"They're essentially becoming a vertically integrated semiconductor company. That's not what investors signed up for when they valued this as a software business."
— Gene Munster, managing partner at Deepwater Asset Management
The IPO math gets complicated
Here's where the tension lies. OpenAI's last private valuation hit $157 billion in October 2024. At that price, the company would need to command a public market cap rivaling Meta or Tesla to deliver reasonable returns to late-stage backers like Thrive Capital and SoftBank.
But infrastructure-heavy businesses trade differently from software companies. Cloud providers like AWS and Google Cloud operate at 25-30% operating margins after years of optimization. OpenAI's current margins, according to two people with knowledge of its finances, sit in the single digits for its API business and are negative for ChatGPT Plus subscriptions once customer acquisition costs are included.
The comparison to Meta is particularly uncomfortable. Mark Zuckerberg's company spent decades building data centers before its AI pivot. OpenAI is attempting a similar transformation in 18 months while still losing money on every dollar of revenue.
---
What does this mean for the IPO timeline?
OpenAI's board had initially targeted a public offering by mid-2026. That schedule now looks aspirational.
The infrastructure commitments create a cash flow problem that can't be solved by more venture funding. OpenAI has already tapped debt markets for $6.5 billion, and its revolving credit facility is largely drawn. The company needs either dramatically higher revenue — its $11.6 billion 2025 revenue target, which Altman has privately called "aggressive but achievable" — or a restructuring of its corporate form that would allow it to raise equity more freely.
Current investors are caught in a bind. Thrive Capital, which led the 2024 round at the $157 billion valuation, has mark-to-market provisions that could trigger if OpenAI doesn't file for an IPO by late 2026. SoftBank's $1.5 billion commitment includes similar terms. The data center splurge, while potentially rational long-term, compresses the window for a public debut that satisfies everyone's expectations.
"Every dollar they spend on concrete and cooling systems is a dollar not going to model research. The market will ask: are you a tech company or a utility?"
— Sarah Guo, founder of Conviction and former Greylock partner
The competitive pressure nobody's talking about
There's another factor driving the infrastructure rush. Anthropic, OpenAI's closest rival, secured a $4 billion commitment from Amazon for dedicated Trainium chip capacity in 2024. Google DeepMind has unfettered access to TPU clusters. Meta's Llama models train on infrastructure that doesn't appear on any external balance sheet.
OpenAI's Microsoft partnership once provided competitive parity. Now, Altman and his team see it as a straitjacket. The company has reportedly explored — and rejected — a $10 billion additional investment from Microsoft that would have extended the exclusivity arrangement through 2028. That rejection signals confidence, or desperation, depending on who's interpreting.
The risk is execution. OpenAI has never managed physical infrastructure at scale. Its competitors have decades of operational experience. One former Google data center executive, who spoke on condition of anonymity because they now work with OpenAI competitors, put it bluntly: "Training models is hard. Running a hyperscale data center is a different kind of hard. They don't have the scar tissue."
What happens next
OpenAI will likely attempt to thread the needle with a two-phase IPO: a 2026 listing of a restructured operating company that excludes the most capital-intensive infrastructure assets, followed by a later integration once those facilities prove they can deliver the promised cost savings. It's a structure that has precedent — CoreWeave, the GPU cloud provider, pursued something similar — but never at this scale with a consumer brand this prominent.
The company has one advantage that doesn't appear in any spreadsheet: ChatGPT's 300 million weekly active users give it distribution that no infrastructure buildout can replicate. The question is whether that user base will stick around long enough for the economics to work, or whether competitors with deeper pockets and simpler business models will eat OpenAI's lunch while it's busy pouring concrete in the Texas heat.
---
Related Reading
- How Companies Are Actually Using AI to Cut Costs
- The AI Stocks to Watch in 2026
- OpenAI Teases 'Extreme' Reasoning in Next AI Model
- OpenAI Signs Defense Deal After Anthropic Policy Clash