More drama in OpenAI's alliances! A US$38 billion cloud computing agreement with AWS announced
OpenAI, the artificial intelligence startup currently valued at around US$500 billion, announced on the evening of the 3rd, Taipei time, that it had signed a major computing power purchase agreement worth up to US$38 billion with cloud infrastructure provider AWS. Under the agreement, OpenAI will immediately begin running workloads on computing power and infrastructure provided by AWS, and will go on to use hundreds of thousands of NVIDIA GPUs located in the United States. The market expects the partnership to give OpenAI the flexibility to expand its infrastructure through 2026 and beyond.
According to CNBC, OpenAI long maintained an exclusive cloud computing partnership with Microsoft, which has invested more than US$13 billion in OpenAI since 2019. Microsoft ceased to be OpenAI's exclusive cloud provider in early 2025, and last week the preferential status Microsoft had enjoyed officially expired under renegotiated commercial terms, allowing OpenAI, the creator of ChatGPT, to work more broadly with other hyperscale cloud providers. OpenAI had already reached cloud service agreements with Oracle and Google, but AWS remains the clear leader in this market.
Dave Brown, vice president of compute and machine learning services at AWS, said OpenAI has committed to purchasing computing power from AWS in what he described as a very direct customer relationship. While the first phase will use existing AWS data centers, Amazon will eventually build entirely separate data center capacity dedicated to OpenAI. OpenAI CEO Sam Altman emphasized in a statement that scaling advanced AI infrastructure requires massive, reliable computing power, and said he believes the partnership with AWS will strengthen the computing ecosystem, help drive the next era of AI, and bring advanced AI services to everyone.
Reports indicate the infrastructure partnership will serve OpenAI's two core needs: the inference workloads that deliver real-time responses to ChatGPT users, and the enormous computing power required to train the next generation of advanced models. The agreement also specifies that OpenAI will use two of NVIDIA's popular Blackwell-architecture chips. Although AWS has its own in-house Trainium AI chip, the current agreement with OpenAI discloses no details about any use of Trainium.
For AWS, the agreement is enormous and highly significant. AWS CEO Matt Garman said the breadth and immediate availability of AWS's optimized computing give it a unique advantage in supporting OpenAI's massive AI workloads. Notably, Amazon is also a close partner of OpenAI's main competitor Anthropic, having invested billions of dollars in the company and even building a data center campus dedicated to it.
As for the market reaction, Amazon's stock price rose about 5% after the news broke. Even as OpenAI broadens its partnerships, its relationship with Microsoft is not over: OpenAI reiterated last week that it has committed to purchasing an additional US$250 billion in Azure cloud services. For OpenAI, then, the AWS agreement marks a milestone in the maturing of its operations. By diversifying its cloud partners and locking in long-term computing power, OpenAI is signaling independence. Sam Altman recently noted that, given OpenAI's need for capital, an initial public offering (IPO) would be the most likely path.




