Nvidia’s push into overseas artificial intelligence (AI) infrastructure highlights growing worries about America’s outdated power grid.
The potential departure of AI investment spotlights a critical infrastructure challenge facing U.S. competitiveness: While foreign markets move quickly to build power systems for next-generation computing facilities, American utilities’ lengthy deployment timelines for new electrical capacity could redirect billions in tech investment abroad and reshape the global AI landscape.
“If the U.S. only relies on traditional approaches to expand the country’s power infrastructure to meet AI infrastructure’s voracious appetite for power, the U.S. will fall behind on its plans to lead in AI globally,” Allan Schurr, chief commercial officer at energy transition company Enchanted Rock, told PYMNTS. “Delays in adding new transmission and generation capacity, which can range from three to 10-plus years, will force companies to seek out alternative locations to support AI infrastructure.”
The power grid’s limitations threaten U.S. competitiveness beyond just tech firms. With retailers, banks and logistics companies all depending on AI for daily operations, infrastructure bottlenecks could slow economic growth across sectors.
Nvidia Chief Scientist William Dally warned recently that America’s slow power grid expansion threatens its AI dominance as companies seek overseas locations for new facilities.
At an industry conference, Dally said the massive electricity needed to run and cool AI data centers will drive development to countries that can rapidly deploy new power generation, Politico reported Thursday (Nov. 14). He noted such facilities are increasingly unlikely to be built in the United States, where grid capacity expansion lags behind that of global competitors.
Powering AI presents significant energy challenges due to the immense computational demands of training and running large models. Data centers hosting AI workloads require vast amounts of electricity, often straining local power grids. The rise of generative AI has driven demand for GPUs and TPUs, which are energy-intensive compared to traditional CPUs. Cooling systems for these high-performance processors further amplify energy consumption. The environmental impact is stark: AI training can emit as much carbon dioxide as dozens of flights.
Experts warn that America’s energy infrastructure is struggling to keep pace with AI’s rapid growth. Power shortages during peak periods, totaling about 500 hours annually, are creating roadblocks, the North American Electric Reliability Corporation reported. Data centers in AI hubs like Texas increasingly rely on renewables but face challenges balancing intermittent supply with round-the-clock demand. While federal and state grid modernization efforts are underway, delays leave AI companies struggling, according to an industry analysis by McKinsey & Company.
Contrary to popular belief, Schurr said there’s no shortage of baseload power to support large AI-driven energy demands. He said the real issue lies in managing the roughly 500 hours a year when the grid faces electricity shortages. For AI companies, the challenge is securing flexible power during these critical periods.
“For example, most regions have about 25% additional capacity if the data center can self-supply from onsite power generation during the top 5% of the annual hours,” he said. “Since data centers typically install onsite backup generation, shifting to cleaner microgrids for this backup power can also allow them to achieve this 5% self-supply.”
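A rough back-of-envelope sketch puts those figures in context. The calculation below is illustrative only, using the numbers quoted in this article rather than any utility's actual methodology; the 100 MW campus is hypothetical.

```python
# Illustrative arithmetic based on the figures quoted above; the 25%
# headroom and 5% window are Schurr's estimates, not measured data.

HOURS_PER_YEAR = 365 * 24  # 8,760

# "Top 5% of the annual hours" that a data center would self-supply
self_supply_hours = 0.05 * HOURS_PER_YEAR
print(f"Self-supply window: {self_supply_hours:.0f} hours/year")  # ~438

# NERC's roughly 500 constrained hours as a share of the year
constrained_share = 500 / HOURS_PER_YEAR
print(f"Constrained share of year: {constrained_share:.1%}")  # ~5.7%

# Hypothetical 100 MW campus: if flexibility during those peak hours
# unlocks ~25% more grid capacity, the same interconnection could
# notionally serve ~125 MW of load without new transmission.
campus_mw = 100
print(f"Effective capacity with flexibility: {campus_mw * 1.25:.0f} MW")
```

In other words, the self-supply window Schurr describes covers roughly 440 hours, closely matching the 500 or so constrained hours NERC identifies, which is why onsite backup generation sized for that sliver of the year can unlock so much grid headroom.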
As data centers strain under the growing energy demands of AI workloads, companies are exploring novel efficiency and cooling solutions. Vannadium, for example, combines phase-change materials with distributed ledger technology in an attempt to make data centers more efficient.
CEO Rick Gilchrist told PYMNTS: “The future of data centers lies in blending breakthrough technologies like advanced phase-change materials with digital infrastructure innovations such as distributed ledger technology. By addressing energy efficiency and heat management, we can not only mitigate grid constraints but also ensure the U.S. remains competitive in the AI race against nations with faster infrastructure deployment.”
While the industry is actively implementing energy efficiency measures and alternative cooling solutions to manage power demand, Schurr noted these improvements alone cannot address the critical “time to power” bottleneck facing data centers.
“Perhaps 10-20% improvements in data center energy use can be realized from these alternative cooling approaches, but we need much more than that to solve the broader grid challenge,” he said.
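A hedged illustration of Schurr's point follows; the 100 MW figure is hypothetical, while the 10-20% range and the roughly 500 constrained hours come from the article. Efficiency gains shrink a facility's load but leave the scarcity window itself untouched.

```python
# Hypothetical 100 MW data center; the cooling/efficiency figures are
# the 10-20% range Schurr cites, not measurements of a real facility.

baseline_mw = 100

for gain in (0.10, 0.20):
    reduced_mw = baseline_mw * (1 - gain)
    print(f"{gain:.0%} efficiency gain -> {reduced_mw:.0f} MW load")

# Even the best case still leaves an 80 MW facility exposed to the same
# ~500 constrained grid hours per year. Efficiency trims demand but does
# not shorten the 3- to 10-plus-year "time to power" for new transmission
# and generation, which is the bottleneck Schurr describes.
```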