The San Francisco A.I. start-up believes there is not enough computing power on Earth to build the artificial intelligence it wants to create.
Earlier this year, word spread that Sam Altman, OpenAI’s chief executive, was pitching a plan that he hoped would pump trillions of dollars into the construction of new silicon chip factories and computer data centers.
Mr. Altman’s advisers and potential partners have since walked back that figure, reported to be as much as $7 trillion, which would have been equal to about a quarter of the annual economic output of the United States. But OpenAI still hopes to raise hundreds of billions of dollars.
It is an extravagant plan, but there is an explanation for it.
Why does OpenAI care so much about chips and data centers?
Chatbots like OpenAI’s ChatGPT learn their skills by analyzing almost all the text on the internet, including books, Wikipedia pages, news articles, computer programs and countless other online sources. (The New York Times sued OpenAI and Microsoft in December for copyright infringement of news content related to A.I. systems.)
All this “machine learning” requires a tremendous amount of computing power. That comes from specialized silicon chips packed into warehouselike data centers in places including Silicon Valley, Washington State and Oklahoma.
OpenAI is trying to raise the money needed to build more chips and pack them into more data centers.
So, OpenAI wants to get into the hardware business?
Not exactly. It wants other companies to build this new infrastructure, largely the same companies that already build artificial intelligence chips and data centers today.