Billions Of AI Prompts, Billions In Energy Costs: The Planet Pays As
Data Centres Heat Up
The title of the world's most
powerful artificial intelligence model keeps changing hands as tech giants
continue to roll out new and more advanced models that understand user
needs better and perform tasks more efficiently; such is the nature of the AI
race. The benefits of the technology can't be overlooked: it has touched
almost every imaginable sector, from healthcare and agriculture to education,
governance, and finance, raising productivity across industries and playing
an irreplaceable role in advancing scientific research. It has also produced
better-than-ever generative AI, letting users create high-quality images and
realistic videos from a simple text prompt.
While debates around the ethical use
and implementation of AI continue, an equally urgent concern is its
environmental impact, particularly the rising electricity consumption required
to power these systems and the large volumes of water needed to keep them from
overheating. Organisations like the United Nations and the International Energy
Agency have documented the growing carbon footprint of tech giants, largely
driven by the energy demands of the data centres supporting AI development.
However, pinpointing the environmental
cost of consumer-level generative AI applications remained difficult, leaving
only speculation, until recently. We finally have official estimates of the
energy and water consumption of an average AI query, thanks to figures shared
by OpenAI and a detailed account published by Google.
Environmental cost of ChatGPT and
Gemini user queries
Back in June 2025, OpenAI CEO Sam
Altman penned a blog revealing the consumption of a single ChatGPT
query: 0.34 watt-hours of energy (about what an oven would use in a little over
one second, or a high-efficiency lightbulb in a couple of minutes)
and 0.32176 millilitres of water (roughly one-fifteenth of a teaspoon). He did
not specify which model the figures referred to, nor did he provide
documentation explaining how the estimates were calculated.
Google, however, recently published a
paper detailing how it measures the carbon emissions, energy usage, and water
consumption of its AI models. It says that the per-prompt energy impact of
Gemini is equivalent to watching TV for less than nine seconds.
According to the methodology, the
median Gemini Apps text prompt uses 0.24 watt-hours (Wh) of energy, emits 0.03
grams of carbon dioxide equivalent (gCO2e), and consumes 0.26 millilitres (or
about five drops) of water.
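Those everyday equivalences are easy to verify. Below is a minimal
back-of-the-envelope sketch in Python; the 100 W television and 10 W bulb
wattages are our illustrative assumptions, not figures published by either
company.

```python
# Back-of-the-envelope check of the per-prompt equivalences quoted above.
# The 100 W television and 10 W LED bulb are illustrative assumptions on
# our part, not figures published by OpenAI or Google.

GEMINI_WH = 0.24    # Google's median Gemini Apps text prompt
CHATGPT_WH = 0.34   # OpenAI's figure for an average ChatGPT query

TV_WATTS = 100      # assumed typical television draw
BULB_WATTS = 10     # assumed high-efficiency LED bulb

def seconds_of_use(energy_wh: float, device_watts: float) -> float:
    """Seconds a device could run on the given energy (Wh * 3600 / W)."""
    return energy_wh * 3600 / device_watts

print(f"Gemini prompt ~ {seconds_of_use(GEMINI_WH, TV_WATTS):.1f} s of TV")
print(f"ChatGPT query ~ {seconds_of_use(CHATGPT_WH, BULB_WATTS) / 60:.1f} min of LED light")
```

The arithmetic lines up: 0.24 Wh buys about 8.6 seconds of a 100 W
television, matching Google's "less than nine seconds" framing.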
Google calls this a comprehensive
estimate that accounts for all critical elements of serving AI globally, one
considerably higher than simplified estimates of just 0.10 Wh of
energy, 0.02 gCO2e, and 0.12 mL of water. It says that most current estimates
account only for active machine usage, leading to overly optimistic figures,
and that its methodology reveals the true energy, carbon, and water footprint
of AI to be significantly higher than simplified models suggest.
Google’s methodology considers not
just active computation but the full system dynamics across hardware, data
centres, and model operations, which include (a rough sketch of how these
factors compound follows the list):
Full system power usage, including
actual chip utilisation
Idle machine energy, required for
reliability and scalability
CPU and RAM consumption, beyond
just TPUs and GPUs
Data centre overhead, like cooling
and power distribution (measured via PUE)
Water usage, especially for
cooling systems
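As a rough illustration of how these factors compound, here is a minimal
sketch. The overhead fractions and PUE value below are assumptions chosen to
show the multiplication effect, not numbers from Google's paper; the actual
weightings in its methodology differ.

```python
# Illustrative decomposition: how a "simplified" active-chip figure grows
# once host, idle, and facility overheads are counted. All multipliers
# below are assumptions for this sketch, not Google's published internals.

ACTIVE_CHIP_WH = 0.10   # simplified estimate: active TPU/GPU energy only
HOST_OVERHEAD = 0.6     # assumed CPU/RAM energy, as a fraction of chip energy
IDLE_FRACTION = 0.3     # assumed idle capacity held for reliability/scaling
PUE = 1.09              # power usage effectiveness = facility / IT energy

it_energy = ACTIVE_CHIP_WH * (1 + HOST_OVERHEAD) * (1 + IDLE_FRACTION)
facility_energy = it_energy * PUE   # adds cooling and power distribution

print(f"IT equipment energy per prompt:   {it_energy:.2f} Wh")
print(f"Facility-level energy per prompt: {facility_energy:.2f} Wh")  # ~0.23
```

The point of the sketch is that each overhead multiplies the previous total,
which is why counting only active chips can understate the footprint by
more than half.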
Put side by side, the simplified estimates for the median Gemini Apps text
prompt come to 0.10 Wh, 0.02 gCO2e, and 0.12 mL of water, while the
comprehensive accounting, covering all critical elements of serving AI
globally, more than doubles the energy and water figures at 0.24 Wh,
0.03 gCO2e, and 0.26 mL.
In addition to providing Gemini’s
energy usage, Google advocates for industry-wide consistency in measuring AI’s
resource consumption to better reflect real-world operational efficiency. “By
sharing our findings and methodology, we aim to drive industry-wide progress
toward more efficient AI. This is essential for responsible AI development,”
Google said.
The cause for concern remains
While industry experts and
stakeholders praise Google's move, calling it a step towards transparency, they
also highlight that AI demands more resources than traditional computing, and
that the cumulative load of billions of user queries is worlds apart from that
of a single prompt.
“While today’s usage might seem under
control, the real challenge is what happens when billions of such queries are
made daily. That’s when the impact becomes huge,” Manoj Dhanda, Founder and
CEO, Utho Cloud, tells ETV Bharat.
Notably, the data obtained by Axios
suggests that OpenAI’s ChatGPT alone sees more than 2.5 billion prompts every
day, which is over 912.5 billion requests every year. If we bring in tools like
Gemini, Grok, Copilot, Character AI, and others into the mix, the number of
daily AI prompts worldwide will climb even higher.
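To see why scale is the worry, here is a quick sketch applying Altman's
per-query figures to the Axios number. Treating every prompt as identical to
the average is a crude simplification on our part.

```python
# Rough aggregate: OpenAI's per-query figures scaled to the reported volume.
# Assumes every prompt matches the 0.34 Wh / 0.32176 mL averages, which is
# a simplification; real queries vary widely in size and model used.

PROMPTS_PER_DAY = 2.5e9          # Axios-reported ChatGPT daily prompts
WH_PER_PROMPT = 0.34
ML_WATER_PER_PROMPT = 0.32176

daily_mwh = PROMPTS_PER_DAY * WH_PER_PROMPT / 1e6               # Wh -> MWh
yearly_gwh = daily_mwh * 365 / 1e3                              # MWh -> GWh
daily_megalitres = PROMPTS_PER_DAY * ML_WATER_PER_PROMPT / 1e9  # mL -> ML

print(f"~{daily_mwh:,.0f} MWh of electricity per day")          # ~850
print(f"~{yearly_gwh:,.0f} GWh per year")                       # ~310
print(f"~{daily_megalitres:.1f} megalitres of water per day")   # ~0.8
```

Under these assumptions, ChatGPT alone would draw roughly 850 MWh a day, or
around 310 GWh a year, before any other AI tool is counted.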
The environmental cost of training
these models is another story altogether. OpenAI revealed in 2018 that the
amount of "compute" required to train its largest AI models was doubling every
3-4 months. Researchers later estimated that training GPT-3 (with 175
billion parameters) consumed 1,287 megawatt-hours (MWh) of electricity and
generated 552 tons of carbon dioxide, equivalent to the annual emissions of 123
gasoline-powered passenger vehicles.
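The vehicle comparison can be sanity-checked directly; the division below
simply recovers the per-vehicle figure implied by the researchers' estimate.

```python
# Sanity check on the GPT-3 training estimate quoted above. Dividing the
# estimated emissions across the 123 vehicles recovers ~4.5 t CO2 per car
# per year, close to the commonly cited US average for a passenger vehicle.

TRAINING_MWH = 1_287
TRAINING_TONS_CO2 = 552
VEHICLES = 123

tons_per_vehicle = TRAINING_TONS_CO2 / VEHICLES            # ~4.5
kg_co2_per_mwh = TRAINING_TONS_CO2 * 1_000 / TRAINING_MWH  # implied grid mix

print(f"{tons_per_vehicle:.1f} t CO2 per vehicle per year")
print(f"~{kg_co2_per_mwh:.0f} kg CO2 per MWh of training electricity")
```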
Emissions are believed to have escalated
since then: according to an Epoch report, the training requirements of
recent frontier AI models have increased by 4-5 times annually over the
past four years. According to the National Institute of Communication Finance
(NICF), data centres are on track to account for 14 per cent of all global
emissions by 2040.
While Google showcased a low
emission level for a single text prompt, its overall carbon footprint has
risen sharply, primarily due to its AI efforts. Driven by the energy
demands of power-intensive data centres, indirect carbon emissions from the
operations of four leading AI-focused tech companies (Google, Microsoft, Meta,
and Amazon) increased by an average of 150 per cent between 2020 and 2023,
according to the Greening Digital Companies 2025 report, published by the
International Telecommunication Union (the UN agency for digital technologies)
and the World Benchmarking Alliance.
Environmental impact of AI data
centres
Data centres are used to train and run
the large language models, multimodal models, and deep learning models that
power scientific research, enterprise needs, and consumer-centric generative AI
tools that let users generate text, images, and videos. While data centres have
existed for years to support the internet, cloud computing services, and
other workloads, the AI race has prompted a sudden rise in their construction
and operation. For instance, Meta plans to spend hundreds of billions of
dollars to build several massive AI data centres for superintelligence, with
its first multi-gigawatt data centre, dubbed Prometheus, expected to come
online in 2026.
Zigment AI Founder and CEO Dikshan
Dave says, “Artificial intelligence has the power to transform industries, but
its environmental footprint is also significant and should not be ignored.
Training and running AI models do consume significant electricity and water,
since data centres need power to run servers and keep systems cool.”
According to the IEA, while traditional
data centres use between 10 and 25 megawatts (MW) of power, demand from a
hyperscale AI centre can exceed 100 MW, equivalent to the annual electricity
consumption of 100,000 households. The largest data centre announced so far is
set to consume as much electricity as 5 million households.
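The household equivalence follows from continuous operation. A minimal
sketch, assuming a household consumes about 8,760 kWh a year (roughly a US
average; our assumption, since the IEA comparison does not specify one):

```python
# Checking the IEA comparison: a 100 MW hyperscale AI data centre running
# around the clock versus annual household electricity use. The household
# figure below is our assumption; the IEA line does not specify one.

CENTRE_MW = 100
HOURS_PER_YEAR = 24 * 365
HOUSEHOLD_KWH_PER_YEAR = 8_760   # assumed average annual household use

centre_kwh_per_year = CENTRE_MW * 1_000 * HOURS_PER_YEAR
households = centre_kwh_per_year / HOUSEHOLD_KWH_PER_YEAR

print(f"~{households:,.0f} households")   # ~100,000
```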