Amazon is holding advanced discussions to invest at least $10 billion in OpenAI, a move that could push the artificial intelligence company’s valuation beyond $500 billion and reshape competitive dynamics across the global AI infrastructure market. The talks, which have not been finalised, centre on a strategic investment that would bind OpenAI more closely to Amazon Web Services through the use of Amazon’s custom Trainium chips for large-scale model training.

People familiar with the matter say the proposal reflects mounting pressure on leading AI developers to secure dependable, cost-effective computing power as model sizes grow and competition intensifies. OpenAI’s need for vast quantities of specialised chips has already driven it into deep partnerships with cloud providers, and the potential Amazon deal signals an effort to diversify beyond existing arrangements while reducing reliance on Nvidia’s dominant but expensive graphics processing units.
For Amazon, the investment would represent one of the largest external bets in its history and underline AWS’s ambition to be a central platform for frontier AI development. Trainium, designed in-house to lower the cost of training large models, has been promoted as a viable alternative to Nvidia’s GPUs. Securing OpenAI as a flagship user would provide a powerful endorsement, potentially encouraging other developers to consider Amazon’s silicon for workloads that have traditionally defaulted to Nvidia.
OpenAI, founded in 2015 and now one of the most influential players in generative AI, has seen its valuation climb rapidly as its models have been embedded across consumer and enterprise products. Market estimates earlier this year placed the company’s valuation near $300 billion following secondary share sales. A deal with Amazon at the scale under discussion would imply a substantial step up, reflecting investor confidence in long-term revenue from AI services despite rising operating costs.
The talks also highlight the increasingly complex web of alliances forming around AI infrastructure. OpenAI already relies heavily on Microsoft’s cloud for training and deployment, while also exploring additional capacity through other partners. An expanded relationship with Amazon would mark a notable broadening of its compute strategy, though it raises questions about how OpenAI would balance the overlapping interests of its largest backers and infrastructure providers.
Industry analysts note that the economics of AI training are forcing such partnerships. Training cutting-edge models can cost billions of dollars annually in compute and energy, with chip supply constraints adding further strain. By negotiating access to Trainium at scale, OpenAI could reduce per-unit training costs and gain leverage in future hardware procurement, while Amazon would benefit from higher utilisation of its data centres and greater visibility for its custom chips.
Yet scepticism persists over the sustainability of ever-larger investments and valuations. Critics argue that while demand for AI services is growing quickly, monetisation remains uneven and capital expenditure is rising faster than revenues for many providers. They warn that valuations north of $500 billion assume continued breakthroughs, strong enterprise adoption and stable regulatory conditions, all of which remain uncertain.
Regulatory scrutiny is another factor hovering over the discussions. Large technology investments that further entrench dominant platforms in AI infrastructure are likely to attract attention from competition authorities in multiple jurisdictions. Any agreement would need to be structured carefully to avoid exclusive arrangements that could be seen as limiting market access for rivals or reinforcing concentration in cloud services.
From Amazon’s perspective, the proposed investment aligns with a broader strategy of positioning AWS as an indispensable backbone for AI development rather than simply a commodity cloud provider. The company has already committed tens of billions of dollars to expanding data centre capacity and developing proprietary chips, betting that demand for AI compute will remain robust for years.