OpenAI closed a $110 billion funding round led by Amazon, NVIDIA, and SoftBank, putting its pre-money valuation at $730 billion. That's up from $500 billion in October, and the round size surpasses the previous record of $40 billion. The round is still open for additional investors, and an IPO is expected later in 2026.
Amazon: $50 Billion and an Infrastructure Deal
Amazon’s $50 billion breaks into two tranches: $15 billion upfront and another $35 billion contingent on milestones or an IPO by December 31, 2028. The capital comes alongside a technical partnership that goes well beyond just writing a check.
The centerpiece is co-development of a Stateful Runtime Environment on AWS Bedrock. The idea is persistent context for AI models across compute, memory, and identity layers, so applications and agents can maintain state between interactions rather than starting from scratch every time. AWS becomes the exclusive third-party cloud provider for OpenAI’s Frontier enterprise platform, which handles AI agents with shared context, governance, and security controls.
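Nothing about the programming interface has been published yet, so the mechanics are anyone's guess. But as a rough sketch of what "persistent state across compute, memory, and identity" could mean for application code, here's a minimal Python illustration. Every name in it, AgentSession, invoke, principal, is hypothetical, not a real Bedrock or OpenAI API.

```python
# Hypothetical sketch only -- this models the *shape* of a stateful
# runtime (a session whose identity and memory survive between calls),
# not any real AWS Bedrock or OpenAI Frontier API.
from dataclasses import dataclass, field


@dataclass
class AgentSession:
    """A persistent agent context: identity and memory outlive a single call."""
    principal: str                                    # identity layer
    memory: list[str] = field(default_factory=list)   # memory layer

    def invoke(self, prompt: str) -> str:
        # Compute layer: each invocation sees the accumulated context
        # instead of starting from scratch.
        self.memory.append(prompt)
        return f"[{self.principal}] answered with {len(self.memory)} turns of context"


session = AgentSession(principal="support-agent@example.com")
session.invoke("Summarize the customer's open tickets.")
# The second call builds on the first without any external bookkeeping.
print(session.invoke("Draft a follow-up email."))
```

The point of the sketch is the lifecycle: state lives with the session rather than in the application, which is the difference the "stateful" label is claiming.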
This expands a prior $38 billion AWS deal by $100 billion over eight years. As part of the agreement, OpenAI commits to consuming 2 gigawatts of Trainium capacity, specifically Trainium3 and Trainium4 chips starting in 2027. There are also plans for custom OpenAI models built for Amazon’s consumer applications in retail and logistics.
Andy Jassy called the stateful runtime environment work a shift in what AI apps and agents can do. Sam Altman’s framing was more grounded: AI should be practical and useful. That’s not exactly a bold thesis, but the infrastructure commitments behind it are concrete.
NVIDIA: Inference and Training at Scale
NVIDIA is putting in $30 billion. In return, OpenAI is committing to 3 gigawatts of dedicated inference capacity and 2 gigawatts of training on NVIDIA Vera Rubin systems. That is a significant compute lock-in. Next-generation inference at that scale requires purpose-built infrastructure, and Vera Rubin is NVIDIA’s answer to what comes after Blackwell.
The compute numbers across both partnerships are worth putting side by side: 2 gigawatts of Trainium capacity on the Amazon side, plus 3 gigawatts of inference and 2 gigawatts of training from NVIDIA, for 7 gigawatts of committed capacity in total. OpenAI is building out a very large, multi-vendor infrastructure footprint, which makes sense if the $280 billion revenue projection by 2030 is even in the right ballpark.
SoftBank: $30 Billion, Core Participant
SoftBank is contributing $30 billion as a core round participant. It also led OpenAI's previous record $40 billion round, so this continues an established relationship. The announcement doesn't detail specific technical agreements on SoftBank's end the way it does for Amazon and NVIDIA.
What the Numbers Actually Mean
OpenAI projects $280 billion in revenue by 2030, split evenly between consumer and enterprise. That is an aggressive number. The previous valuation was $500 billion from an October secondary share sale. Jumping to $730 billion pre-money in a few months reflects both the scale of capital being committed and the competitive pressure everyone is feeling to lock in positions with the leading frontier lab.
Microsoft’s partnership is unchanged. The exclusive AWS arrangement for Frontier is specifically scoped to that enterprise platform, so the two cloud relationships appear to coexist by design.
The Stateful Runtime Environment on Bedrock is probably the most technically interesting piece here. Persistent context across compute, memory, and identity for agents is a real engineering problem. Most current agent deployments handle this awkwardly with external databases and session management bolted on. A native stateful layer built into the runtime would change how agent applications are architected. Whether the co-development actually delivers that cleanly remains to be seen, but it is the right problem to be working on.
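For contrast, here is a minimal sketch of the bolted-on pattern described above, assuming a generic key-value store standing in for Redis, DynamoDB, or Postgres; call_model is a placeholder for a stateless inference call, not a real SDK function.

```python
# The status-quo pattern: agent state bolted on via an external store.
# `shelve` stands in for whatever database a real deployment would use.
import shelve


def call_model(history: list[dict]) -> str:
    # Placeholder for a real stateless inference API call.
    return f"(reply informed by {len(history)} prior messages)"


def run_agent_turn(session_id: str, user_msg: str) -> str:
    with shelve.open("agent_state") as db:
        history = db.get(session_id, [])              # 1. load prior state
        history.append({"role": "user", "content": user_msg})

        reply = call_model(history)                   # 2. stateless model call

        history.append({"role": "assistant", "content": reply})
        db[session_id] = history                      # 3. persist state again
    return reply


print(run_agent_turn("user-42", "What changed since yesterday?"))
```

A native stateful layer would make steps 1 and 3 the runtime's job rather than the application's, which is exactly what would make the co-development worth watching.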
For context on what these models are actually capable of at inference time, the model benchmarks post covering Claude Opus 4.6 and GPT-5.3-Codex gets into the performance details this infrastructure is being built to serve. And if you want a broader view of where the labs stack up, the 2026 LLM rankings post covers the full competitive field.
The round is still open. More investors may join before it closes.