On Monday, Amazon announced a partnership that will allow OpenAI to use the company’s cloud computing services to run AI systems for products like the popular ChatGPT. OpenAI is paying $38 billion to access Amazon Web Services (AWS) servers and “hundreds of thousands of state-of-the-art NVIDIA GPUs.”
“OpenAI will immediately start utilizing AWS compute as part of this partnership, with all capacity targeted to be deployed before the end of 2026, and the ability to expand further into 2027 and beyond,” Amazon said in a statement.
Industry analysts and leaders say an expansion of OpenAI’s cloud partners was expected.
“We were kind of waiting for an OpenAI–AWS partnership,” Rick Villars, chief analyst and group vice president at IDC Research, told Newsweek. “They wouldn’t be making those statements unless they knew that they had some commitments to some pretty big capacity.”
Why It Matters
Villars added that companies already working with Amazon benefit from easier access to data already stored on AWS servers, which can support agentic tools or products. For OpenAI and the Seattle-based retail and cloud-computing giant, the move signals rising demand for their services, though whether that rise is a projection or a reality remains uncertain.
“For the sort of deep integration with their [data], you need to be on the cloud where the customer’s data is in their applications,” he said. “They were on Microsoft, which is also the number-two player. But to not be on Amazon…they need to capture it to address the whole enterprise.”
Less than a week before this deal was announced, OpenAI altered its agreement with Microsoft, which had previously been its exclusive cloud services provider. OpenAI is also in the process of changing its business structure to a for-profit model after being founded as a not-for-profit organization.
“Scaling frontier AI requires massive, reliable compute,” said OpenAI co-founder and CEO Sam Altman in the Amazon statement. “Our partnership with AWS strengthens the broad compute ecosystem that will power this next era and bring advanced AI to everyone.”
What To Know
Running AI systems depends on an abundance of resources, including computer chips, electricity, cooling, data storage and cybersecurity. OpenAI’s latest move should open up capacity for new and existing applications of its products, but it also raises some concerns.
“AWS now hosts compute for both OpenAI and Anthropic. These are direct competitors running frontier models on the same infrastructure provider,” Rob T. Lee, SANS Institute chief of research and chief AI officer, told Newsweek via email. “Amazon invested $4 billion in Anthropic and is building an $11 billion data center exclusively for their workloads. Now they’re also running OpenAI. Organizations need to understand what data residency actually means when your AI provider’s competitor shares the same cloud infrastructure.”
What People Are Saying
Matt Garman, CEO of AWS, said in a statement: “As OpenAI continues to push the boundaries of what’s possible, AWS’s best-in-class infrastructure will serve as a backbone for their AI ambitions. The breadth and immediate availability of optimized compute demonstrates why AWS is uniquely positioned to support OpenAI’s vast AI workloads.”
John Morris, CEO of software company Ocient, told Newsweek: “Recent data confirms that enterprise demand for AI is reshaping the market and driving unprecedented global demand for computing power. OpenAI and Amazon’s announcement highlights that CPUs, not just GPUs, are central to the next phase of AI innovation, driven by agentic workloads that rely heavily on traditional compute. We’re seeing the same trend among our customers, who are preparing for a world where deep business analytics are performed with or by AI agents. OpenAI’s move to broaden beyond Microsoft is a natural progression – innovation thrives on openness. This latest partnership underscores a broader industry shift already underway: access to diverse infrastructure, not cloud exclusivity, will define sustainable AI growth alongside advances in data efficiency at scale.”
Lee of the SANS Institute continued, to Newsweek: “It’s interesting that enterprises banned ChatGPT in 2023 over data privacy concerns, but two years later they’re rebuilding entire business processes on the same models. The difference? Now there’s a contract and a logo that says ‘enterprise ready.’ We went from shadow AI to sanctioned AI without fixing the underlying governance problem.”
What’s Next
Villars predicts that more tech titans will form partnerships to build AI capacity and improve service quality in AI-powered products, and to signal to the markets that they are open for business despite concerns about their computing capacity.
“I think you’re going to see more of these cooperative announcements, not just from OpenAI, but for any software company, to kind of reassure people who are investing in their agents that we’re going to have the capacity to meet the performance needs.”