In the race to build smarter, faster, and more capable AI, another competition is playing out quietly in the background: energy efficiency. Each chatbot query may feel weightless when you type it out, but behind the screen, servers are buzzing, GPUs are firing, and datacenters are drawing power. And as AI becomes a daily tool for millions, those watt-hours add up to something much bigger than a single conversation.
A new wave of research and disclosures in 2025 has finally shed light on just how much electricity the world’s leading chatbots consume per query, and the differences are stark.
According to the latest estimates, the leading chatbots, from ChatGPT and Gemini to Claude, now draw roughly 0.3 Wh per short text query, while Grok stands out as a far hungrier outlier.
For context, a Google search consumes about 0.3 Wh, while streaming a minute of video on YouTube can use 1–3 Wh. That means most advanced chatbots are now on par with or only slightly above a search query, a remarkable leap in efficiency compared to just two years ago, when estimates for ChatGPT queries were as high as 3 Wh each.
But there’s a catch: these numbers generally reflect short, text-only prompts. Once you shift into image generation, video synthesis, or other multimodal tasks, energy usage spikes dramatically, sometimes into the tens of watt-hours for a single query.
The sharp differences reflect not just model size but also infrastructure, hardware optimization, and energy strategy.
A single text query costing a fraction of a watt-hour may not seem like much, but scale it up and the stakes become clear. Billions of daily prompts translate into megawatt-hours of electricity use. For instance, one million queries to Grok could consume as much energy as powering a U.S. household for a year.
And that’s before factoring in the heavy-duty side of AI: image generation, code compilation, or video outputs can multiply energy use by 10, 20, or even 100 times compared to a simple text exchange. The “per query” figures most companies share are therefore only part of the story.
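As a sanity check on that scaling, here is a quick back-of-envelope sketch in Python. The 0.3 Wh per-query figure and the 10–100x multimodal multipliers come from the estimates above; the annual U.S. household consumption of roughly 10,500 kWh is an assumed round average, so the implied Grok number is illustrative rather than a measured value.

```python
# Back-of-envelope math behind the scaling claims above.
# Figures are illustrative assumptions, not measured values.

WH_PER_TEXT_QUERY = 0.3              # converged estimate for leading chatbots
US_HOUSEHOLD_KWH_PER_YEAR = 10_500   # assumed average annual U.S. household use

# Implied per-query draw for Grok, if 1M queries ~ one household-year:
grok_wh_per_query = US_HOUSEHOLD_KWH_PER_YEAR * 1_000 / 1_000_000
print(f"Implied Grok draw: ~{grok_wh_per_query:.1f} Wh per query")   # ~10.5 Wh

# Aggregate draw for one billion text queries a day at 0.3 Wh each:
daily_mwh = 1_000_000_000 * WH_PER_TEXT_QUERY / 1_000_000            # Wh -> MWh
print(f"1B text queries/day: ~{daily_mwh:,.0f} MWh")                 # ~300 MWh

# Multimodal tasks can multiply per-query energy 10-100x:
for mult in (10, 20, 100):
    print(f"{mult}x multimodal load: ~{daily_mwh * mult:,.0f} MWh/day")
```

Even at the efficient 0.3 Wh figure, a billion daily prompts work out to about 300 MWh a day, and the multimodal multipliers push that into the thousands.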
The latest figures suggest a trend: the leading chatbots are converging around 0.3 Wh per text query, a massive improvement from just a couple of years ago. But the outliers show the risks: not every AI company is building with efficiency at the forefront.
As users, we may not feel the watt-hours behind our questions. But collectively, the power draw of generative AI is shaping climate debates, infrastructure investments, and even regulatory conversations.
The question now is simple: can AI’s energy appetite keep shrinking, even as its capabilities keep expanding into image, video, and beyond?
Because from ChatGPT to Gemini, Claude to Grok, every prompt we type or picture we generate is part of a much bigger power equation.