OpenAI and AWS logos /Courtesy of Yonhap News Agency

Bloomberg reported on the 17th that Amazon is in talks to invest more than $10 billion (about 14.8 trillion won) in OpenAI, the developer of ChatGPT, and to supply artificial intelligence (AI) chips.

A source familiar with the matter told Bloomberg that OpenAI is negotiating to adopt Amazon's in-house AI semiconductor, "Trainium," as a condition of receiving the investment from Amazon. OpenAI is also considering expanding its cloud contract with Amazon Web Services (AWS) to secure additional computing power to run AI models such as ChatGPT.

Bloomberg said that if this transaction goes through, OpenAI's valuation is likely to be assessed at more than $500 billion (about 740 trillion won).

In the industry, the view is that OpenAI, which has relied on Nvidia graphics processing units (GPUs), is joining hands with Amazon to diversify its chip suppliers and keep rival Google in check. Google's AI model "Gemini 3" recently drew market attention with performance surpassing OpenAI's ChatGPT, and Google developed Gemini 3 on its in-house AI chip, the tensor processing unit (TPU). Sensing a crisis, OpenAI declared an internal "code red" and is pouring its full capabilities into improving ChatGPT's performance.

Previously, after being released from a mandatory product-use agreement with its biggest backer, Microsoft (MS), OpenAI signed a seven-year, $38 billion (about 54 trillion won) server lease contract with AWS. The investment currently under discussion is said to be an extension of that contract.

OpenAI has already signed long-term contracts totaling $1.5 trillion with Nvidia, Oracle, AMD, and Broadcom for chips and data centers. Over several years, Nvidia will invest up to $100 billion in OpenAI, and in return OpenAI will purchase Nvidia AI chips.

Analysts say the deal would also be a boon for Amazon, which wants to cement its position in the AI industry. This month at "AWS re:Invent 2025" in Las Vegas, Amazon unveiled "Trainium 3," emphasizing that it consumes 40% less power than the previous generation and offers lower cost and higher computational efficiency for AI model training than Nvidia GPUs.
