I’ve written about Gen AI and climate change before, from the point of view of what the tech companies say about it. Now, I’m looking at it from the ground up…
I’m a big user of various generative AI tools – and a reader of all sorts of articles and papers and newsletters about the subject.
One thing that comes up a lot in all that reading is the climate impact of the world’s growing use of AI tools.
I’ve long been aware that there’s a cost to my use of AI, but it wasn’t until I was training a group of people that I had to think about it in any detail.
The trainees wanted to know more about the climate impact than I had at my fingertips – which pushed me to do some research. Going down the rabbit hole taught me one thing: this is a very complicated topic.
But because I needed to condense the information for the training course, I was forced to drill down to the simplest way of looking at it. This post is a distillation of what I learned, in the hope that it is useful to you.
First up, some basic concepts
Just as we plug in electrical appliances without thinking about how the power is made, the digital world we inhabit is created out of our sight, by infrastructure most of us never see.
These are some of the things that are out of our sight:
Data centres (also called server farms) – buildings that house many computers (servers) which are connected to each other, and all of which need power. That power is drawn from electrical grids, and generating it produces carbon emissions. Servers generate heat (think about how your laptop gets hot), so they need cooling systems – and water is currently the easiest and cheapest way to cool them.
The lifecycle of AI – this looks beyond data centres to take into account the environmental impact of AI throughout the entire life of its physical resources. That includes the raw materials needed to make physical devices like computer chips, the transport of hardware, and the impact of collecting, shipping, recycling, or disposing of electronic waste.
And importantly – this is not just about Gen AI, the new kid on the block. Data centres power most of the services we use online, including websites, cloud storage, email, and video streaming.
What are the numbers?
Trying to understand the impact of AI on a particular country or on the globe proved to be the most difficult part of the research. There are many moving parts here, a lot of technical concepts and many mind-boggling statistics. In the end, I put all the articles I had found into NotebookLM and asked it to extract just one thing from the research: the climate impact of one ChatGPT query (the link takes you to the notebook, where you can see all the sources for the information below). Here’s what I found:
Energy use: Older estimates suggest that one ChatGPT query uses about 3 watt-hours (Wh) of electricity. That’s roughly ten times more energy than a typical Google search. But newer and possibly more accurate estimates suggest a query might use closer to 0.3 Wh – ten times less, and roughly the same as a single Google search. That 0.3 Wh is the amount of energy needed to run a space heater for less than one second, or leave a single incandescent light bulb on for 18 seconds. (A watt-hour is a measure of a small amount of electricity or energy consumption – it scales up into bigger measures like the kilowatt-hour, which is 1,000 watt-hours: South Africa generated an average of 3,046 kilowatt-hours (kWh) per capita in 2024.)
Carbon emissions: When you account for all emissions, including the energy used to cool the massive server farms and the cost of training the AI model, a single query causes the emission of about 0.28 grams of CO2, or possibly 2 to 3 grams of CO2 (using higher estimates). That’s comparable to the carbon emitted by driving a petrol car for just over a metre. Watching one hour of Netflix has the same climate impact as asking ChatGPT about 100 questions.
Water consumption: A single query is estimated to use roughly 30 millilitres of water. A short conversation (20 to 50 questions and answers) with ChatGPT uses about half a litre of fresh water. (Note, though, that most of this water footprint relates to the water used to generate the electricity for the data centre and to train the model, not the water used directly for cooling in the centre itself.)
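If you want to sanity-check those everyday comparisons yourself, here’s a rough back-of-envelope sketch in Python. The per-query figures are the ones quoted above; the space-heater and light-bulb wattages and the car’s emissions per kilometre are my own assumptions, so treat the output as illustrative rather than definitive.

```python
# Back-of-envelope check of the per-query figures quoted above.
# The query figures come from the sources in the post; the appliance
# wattages and the car emissions factor are assumptions for illustration.

ENERGY_PER_QUERY_WH = 0.3   # watt-hours per query (the newer, lower estimate)
CO2_PER_QUERY_G = 0.28      # grams of CO2 per query (lower estimate)

HEATER_WATTS = 1500         # assumption: a typical plug-in space heater
BULB_WATTS = 60             # assumption: one incandescent light bulb
CAR_CO2_G_PER_KM = 250      # assumption: an average petrol car

# Convert one query's energy into seconds of appliance run time:
# watt-hours -> watt-seconds (joules) by multiplying by 3,600,
# then divide by the appliance's power draw in watts.
heater_seconds = ENERGY_PER_QUERY_WH * 3600 / HEATER_WATTS
bulb_seconds = ENERGY_PER_QUERY_WH * 3600 / BULB_WATTS

# Convert one query's CO2 into metres driven in a petrol car.
car_metres = CO2_PER_QUERY_G / CAR_CO2_G_PER_KM * 1000

print(f"One query runs a space heater for {heater_seconds:.2f} seconds")
print(f"One query runs a light bulb for {bulb_seconds:.0f} seconds")
print(f"One query's CO2 equals driving about {car_metres:.1f} metres")
```

Run it and you get roughly 0.7 seconds of heater time, 18 seconds of bulb time and just over a metre of driving per query – which lines up with the estimates above.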
Looked at from an individual point of view, these are all really small amounts. But how does it look when you pan out and take in the bigger picture?
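Before panning out, here’s what those per-query figures add up to over a year for one person. The 20-queries-a-day usage level is purely my assumption for illustration; the per-capita electricity figure is the South African one quoted above.

```python
# Rough annual footprint for one fairly heavy chatbot user, built from the
# per-query figures above. The daily query count is an assumption.

ENERGY_PER_QUERY_WH = 0.3    # watt-hours per query (lower estimate)
CO2_PER_QUERY_G = 0.28       # grams of CO2 per query (lower estimate)
WATER_PER_QUERY_ML = 30      # millilitres of water per query

QUERIES_PER_DAY = 20         # assumption: a fairly heavy individual user
DAYS_PER_YEAR = 365

queries_per_year = QUERIES_PER_DAY * DAYS_PER_YEAR

energy_kwh = queries_per_year * ENERGY_PER_QUERY_WH / 1000    # ~2.2 kWh
co2_kg = queries_per_year * CO2_PER_QUERY_G / 1000            # ~2.0 kg
water_litres = queries_per_year * WATER_PER_QUERY_ML / 1000   # ~219 litres

# For scale: South Africa generated about 3,046 kWh per person in 2024.
SA_KWH_PER_CAPITA = 3046
share_of_per_capita = energy_kwh / SA_KWH_PER_CAPITA * 100

print(f"Energy: {energy_kwh:.1f} kWh a year "
      f"({share_of_per_capita:.2f}% of SA per-capita generation)")
print(f"CO2: {co2_kg:.1f} kg a year")
print(f"Water: {water_litres:.0f} litres a year")
```

On those assumptions, a year of steady chatbot use comes to roughly 2 kWh of electricity, 2 kg of CO2 and 220 litres of water.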
The broader climate impacts of AI
It’s at this point that I really started running into contrary and confusing estimates. I’ve tried to boil it down to the broad parameters and stay away from eye-crossing numbers. These are some of the things that worry researchers:
1. The development and widespread use of Generative AI tools are causing a massive spike in demand for data centres, which means that demand for electricity is increasing worldwide, which in turn has a knock-on effect on carbon emissions. This high energy demand means that, globally, data centre emissions are predicted to triple by 2030 compared to pre-AI boom projections.
2. The highly concentrated, constant demand from large data centres puts strain on local electricity grids. This pressure sometimes forces utilities to rely on older, less efficient, fossil-fuel powered plants to ensure reliable power.
3. In areas facing water stress, the clustering of data centres can put significant pressure on local water supplies. For example, the initial training of a massive model like GPT-3 consumed 700,000 litres of fresh water.
4. The physical equipment needed to run AI systems eventually becomes waste. Generative AI is expected to contribute to a rapid rise in e-waste, with an estimated 16 million tons of cumulative waste generated by 2030.
On the other hand…
AI can also improve efficiency and help manage complex systems. A 2025 study estimates that AI could reduce global emissions by 3.2 to 5.4 billion tonnes of carbon-dioxide-equivalent a year by 2035. That’s because the computing power of AI, if applied wisely, can make things better by helping to design and implement policies, improve insights, and monitor systems. (Grantham Research Institute on Climate Change, 2025).
So what are we to do?
If you are anything like me, you now have a headache. Trying to assess our own personal use of AI in the context of global, interlocking systems is hard to do. I did however find one way of looking at things that made sense to me.
The perspective comes from data scientist Hannah Ritchie, whom I have always found to be a sensible and reliable guide to the world of science. She notes that many climate-conscious people feel guilty about using ChatGPT (and that people judge others for using it, because of its perceived environmental impact).
But she says she has looked at the data and stopped worrying about it.
You can read her full reasoning here: What’s the carbon footprint of using ChatGPT? Here’s her conclusion in a nutshell:
1. Regular (and even relatively high) users of text-based LLMs should stop stressing about the energy and carbon footprint. “It’s not a big deal, and restraining yourself from making 5 searches a day is not going to make a difference. In fact, it might have a net negative impact because you’re losing out on some of the benefits and efficiencies that come from these models.”
2. Power users who generate videos and audio (not pictures) should be aware that their climate footprint may be significantly larger.
3. In general, AI energy demand is an issue for wider society (the same as other sectors that we need to electrify, such as cars, heating, or parts of industry). “It’s just that individuals querying chatbots is a relatively small part of AI’s total energy consumption. That’s how both of these facts can be true at the same time.”
Andy Masley, who Ritchie quotes, takes this further. He says there are many troubling aspects of Gen AI use – but that worrying about your personal use of ChatGPT is wasted time.
In a related post, this is his thinking:
One of the most important shifts in talking about climate has been the collective realization that individual actions like recycling pale in comparison to the urgent need to transition the energy sector to clean energy. The current AI debate feels like we’ve forgotten that lesson. After years of progress in addressing systemic issues over personal lifestyle changes, it’s as if everyone suddenly started obsessing over whether the digital clocks in our bedrooms use too much energy and began condemning them as a major problem.
What am I doing?
In this, as in everything, I take the view that starting small is the only recourse I have; that we need to treat what we have as resources to be husbanded, to be maintained for as long as they last.
Just as I will mend an item of clothing rather than buying a new one, or not use chemical fertilisers in my vegetable beds, I will use Gen AI thoughtfully. That means two things:
1. Being intentional: Do I need to use AI for this? Could I do it myself? If so, just do that. (Which also means being intentional about how the use of AI affects my thinking and my creativity.)
2. Being less digital: Cutting the time I spend on digital technologies in general makes a lot of sense, in a lot of ways. AI usage fits in there too.
Final word: shiny new toys like Gen AI need to be viewed in the same way as everything else we do: carefully.
Main picture: Data centre in Coleraine, Northern Ireland, Geoffrey Moffett, Unsplash
Other things I have written
Nine things you need to know about electricity – We don’t think about electricity until we don’t have it any more. A primer for the technically clueless.
Make do and mend – the real revolution – In which I share a small money-saving tip – and some radical thoughts on how to save the world. (Hint: no grand actions required).
Is Gen AI making us lazy? And other 3am questions – We’re living through deep changes in the way we interact with technology. If you are worried that Gen AI is making you lazy, here are some ways to approach that…
Navigating AI hype: Five newsletters that will help – Everywhere you look, there’s an article about artificial intelligence. Here’s a list of the people I follow to help in navigating AI hype.
Figuring out which AI tools to use – it’s not a pretty picture – Doing the right thing is complicated. Figuring out which AI tools to use is not just about being cool; it’s a consumer decision too.
How can I help you make order from chaos?
Join the Safe Hands AI Explorer learning circle!
Sign up for my Sensible Woman’s Guide to AI and Content Creation, which comes out fortnightly.
Or sign up for my personal fortnightly email newsletter In Safe Hands (it’s a place to pause and think about things).
Book a free half hour of my time here. Find out how I can help with writing, editing, project management, general clear thinking, technical hand holding, or an introduction to AI.
Contact me via email.
