Graphic: Jeong Seo-hee

Google is stepping up efforts to supply its self-developed artificial intelligence (AI) chips to external companies, directly challenging NVIDIA.

The IT media outlet The Information reported on the 3rd (local time) that Google is in talks to supply its AI chip, the Tensor Processing Unit (TPU), to small cloud providers for use in their data centers. Google has reportedly reached an agreement with the London-based cloud company Fluidstack to deploy TPUs at its New York data center.

Google is also in similar negotiations with companies such as Crusoe, which is building a dedicated data center for OpenAI, and CoreWeave, which has received investment from NVIDIA. The Information noted that Google is primarily targeting newer cloud service providers that rely heavily on NVIDIA chips.

In particular, Google offered Fluidstack an incentive in the form of an operating-cost guarantee: it would provide up to $3.2 billion in support if Fluidstack is unable to cover the operating costs of its New York data center.

By developing its own TPUs, Google has been expanding its revenue while reducing its reliance on NVIDIA. It has recently used TPUs in major projects, including its AI model 'Gemini', and has leased the chips to external companies through Google Cloud. Demand for the sixth-generation TPU 'Trillium', released at the end of last year, is surging, and demand for the seventh-generation 'Ironwood', designed for large-scale inference workloads, is also expected to grow.

※ This article has been translated by AI.