OpenAI Explores Developing AI Chips to Tackle Shortages

OpenAI is reportedly exploring the development of its own artificial intelligence (AI) chips or the acquisition of a chip company to tackle the scarcity of expensive AI chips it heavily relies on.

The company has been considering multiple options, including building its own chips, collaborating more closely with chipmakers like NVIDIA, and diversifying its suppliers beyond NVIDIA, Reuters reported Friday (Oct. 6), citing unnamed sources. However, no final decision has been made yet.

The shortage of graphics processing units (GPUs), essential for running AI applications, has been a significant concern for OpenAI, with NVIDIA currently controlling more than 80% of the global market, according to the report. OpenAI CEO Sam Altman has made acquiring more AI chips a top priority for the company behind ChatGPT.

Developing custom chips would place OpenAI among the elite group of tech giants, such as Google and Amazon, that design their own chips to meet specific needs, the report said. However, this endeavor would require a substantial investment, potentially costing hundreds of millions of dollars annually. 

Alternatively, OpenAI could expedite the process by acquiring a chip company, similar to Amazon’s acquisition of Annapurna Labs in 2015, per the report. While OpenAI has conducted due diligence on a potential acquisition target, the identity of the company remains undisclosed.

Even if OpenAI decides to pursue its own custom chips, the effort would likely take several years, leaving the company dependent on commercial providers like NVIDIA and Advanced Micro Devices (AMD) in the interim, according to the report. Other tech companies that have attempted to build their own processors have found the effort challenging, with limited success.

OpenAI’s motivation to secure more AI chips stems from two main concerns: the shortage of advanced processors required for its software and the high costs associated with running the necessary hardware to power its AI technologies, the report said. 

The company spends up to $700,000 a day on the infrastructure and servers underpinning its services, and recorded total losses of $540 million in 2022. Its GPT-4 model was likely trained on somewhere between 10,000 and 25,000 of NVIDIA’s A100 chips.

NVIDIA has become the go-to company for the computer chips used in AI applications, and its market value hit the trillion-dollar mark earlier this year thanks to high demand for those chips.
