OpenAI has stated that it currently has no plans to use Google’s artificial intelligence chips at scale. The statement followed media reports suggesting that OpenAI might turn to Google’s chips to meet its growing computing needs.
A spokesperson for OpenAI explained that while the company is testing Google’s TPUs (Tensor Processing Units) at an early stage, there are no active plans to deploy them broadly.
At the moment, OpenAI mainly uses AI chips from Nvidia and AMD to power its systems. The company is also developing its own chip, which is expected to reach the “tape-out” phase in 2025, the point at which a chip’s design is finalized and sent for manufacturing.
In addition, it was reported in June that OpenAI had started using Google Cloud to help meet its expanding computing demands. However, most of OpenAI’s computing power still comes from CoreWeave, a company that provides GPU-based cloud services.
Google has recently begun offering its TPU chips, previously reserved for internal use, to external companies. This strategy has attracted major clients like Apple, as well as OpenAI rivals such as Anthropic and Safe Superintelligence, both founded by former OpenAI executives.
Overall, adopting a new chip platform at scale requires significant changes to software and surrounding infrastructure. That is why OpenAI is only testing Google’s chips for now and does not plan to rely on them heavily in the near future.