China is going to win the AI race thanks to cheaper electricity, warns NVIDIA CEO
Satya Nadella admits Microsoft’s GPUs sit idle due to power shortages
The true AI bottleneck isn’t chips or data but electricity access
We’ve finally reached the stage of the AI arms race where securing compute is no longer the primary concern for the world’s leading tech powers. Access to gigawatt-scale electricity has emerged as the single biggest bottleneck to launching the latest and greatest AI datacenters – something that’s eventually going to tip the scales in China’s favour, according to NVIDIA CEO Jensen Huang.
Speaking on the sidelines of the Future of AI Summit, Huang wasn’t talking about algorithms or ambitions, but practicality. It’s much easier for AI companies to access cheap energy in China, he warned. While the West drowns in cynicism, red tape, and export bans, China is quietly building the future on abundant, affordable electricity.
For a man whose company is now the world’s most valuable by market cap, this is far from idle commentary. NVIDIA’s entire AI empire is built on chips like the H100 and the Blackwell-generation GB200 – the lifeblood of modern AI apps and services. But these chips are also insatiable energy hogs. And as Washington tightens export controls, Beijing is doubling down on energy access at a scale the US power grid simply can’t support.

Huang’s frustration mirrors that of another tech titan – Microsoft’s Satya Nadella – who recently offered his own sobering confession: “I’ve got GPUs sitting in inventory that I can’t plug in.” Imagine that. The CEO of the world’s second-most valuable company, with billions of dollars’ worth of state-of-the-art NVIDIA processors collecting dust because there’s no electricity to run them. Unprecedented!
Also read: Satya Nadella says Microsoft has GPUs, but no electricity for AI datacenters
For years, big tech’s scarcest resource was compute – how many NVIDIA chips you could hoard, rent, or smuggle through a supply chain bottleneck. Now, the great scarcity is power. Just for reference, a single hyperscale AI datacenter can draw up to 100 megawatts – enough to light 80,000 homes (in the US). Morgan Stanley projects they’ll gulp over a trillion liters of water annually by 2028 just to keep cool. Microsoft, Google, and OpenAI are all scrambling for alternatives – from nuclear microreactors to space-based solar – but none of these will arrive in time for the next AI boom cycle.
China, meanwhile, has turned energy access into an industrial weapon. In 2023, it added nearly 350 gigawatts of new generation capacity – more than the combined installed capacity of the UK and Germany. It accounts for roughly 80% of global solar manufacturing, builds new coal and hydropower plants faster than the US can approve a single one, and operates the world’s most advanced ultra-high-voltage transmission grid.
A statement from NVIDIA CEO Jensen Huang. pic.twitter.com/Exwx54OYJV
— NVIDIA Newsroom (@nvidianewsroom) November 5, 2025
The paradox Huang points out is brutal. The same US export controls designed to slow China’s AI ascent may instead accelerate it. Cut off from NVIDIA’s GPUs, Chinese chipmakers are scaling homegrown designs – and they’ve got the electricity to feed them. Meanwhile, America’s AI giants are fighting each other (and state regulators) for megawatts. “Fifty new regulations” is how Huang described the US landscape, in stark contrast to China’s decisiveness.
Because before the future of artificial intelligence gets written in code running on the latest chips, those datacenters still need to be switched on – and the electricity to do that is a commodity growing more precious with each passing day.

Also read: Project Suncatcher: Google’s crazy plan to host an AI datacenter in space explained
Jayesh Shinde
Executive Editor at Digit. Technology journalist since Jan 2008, with stints at Indiatimes.com and PCWorld.in. Enthusiastic dad, reluctant traveler, weekend gamer, LOTR nerd, pseudo bon vivant. View Full Profile