Even ChatGPT Will (Almost) Tell You About AI’s Growing Carbon Footprint

AI chatbots like ChatGPT raise environmental concerns, but the transparency around these issues is often lacking.

Anoushka Rajesh
Climate Change

There’s been plenty of discussion about the environmental impact and resource-heavy nature of AI chatbots, but it remains true that we don’t yet have a clear picture of the extent of it.

(photo: The Quint)



Take a shot for every time you've heard the term "AI" over the past year. Actually, don't do that — chances are, you'd end up with alcohol poisoning. Stale jokes aside, AI is now everywhere. Especially generative AI.

From your workplace to family WhatsApp groups, generative AI chatbots are there to provide answers, create presentations, and help refine your thoughts and ideas.

But as with all "good things", there’s a cost — and it turns out, the price of using AI chatbots is paid by the planet.

This isn't exactly new information. There’s been plenty of discussion about the environmental impact and resource-heavy nature of AI chatbots, but it remains true that we don’t yet have a clear picture of the extent of it.

How do advanced AI chatbots like ChatGPT impact the environment? Let’s hear it straight from the source.

ChatGPT Says...

Since OpenAI's ChatGPT is one of the most advanced and widely used generative AI tools, it's the chatbot we will be interacting with. It's important to note that the chatbot's responses may vary depending on the version, the prompts used, and the model it is trained on.

These excerpts are from the answers provided by ChatGPT in response to specific prompts.

Initial Prompt: You are an environmental researcher and journalist. You are working on a story to find out the environmental impact of using artificial intelligence for day-to-day work. You will have to answer a few questions based on available research but also cite sources for each of the responses given by you.

What is ChatGPT's carbon footprint?

ChatGPT: "The carbon footprint of ChatGPT (and similar large language models) primarily arises from the energy consumed during the training phase and the inference phase (when users interact with the model). The impact can be quite significant due to the high computational resources required for both training and running these models.

One study from University of Massachusetts Amherst (2020) estimated that training a large AI model (similar in size to GPT-3, which powers ChatGPT) can emit over 284 tons of CO2. This is roughly equivalent to the emissions of five cars over their entire lifetimes.

After the model is trained, each interaction (like a conversation with ChatGPT) still uses computational resources. The energy consumed during inference is much smaller compared to training but can still add up significantly due to the large number of queries being processed.

AI companies are increasingly aware of these environmental impacts and are exploring ways to mitigate them..."

So, while ChatGPT explains that training and running AI models, along with maintaining data centres, requires large amounts of energy, and that training models is a huge contributor to carbon emissions, it remains vague about actual energy consumption and doesn't provide granular data or dive into specifics.

For instance, while the answer mentions that the inference phase consumes energy, it doesn't break down how much energy ChatGPT uses per query or how this varies based on factors like the complexity of the request or the model's deployment (for example, scaling for multiple users). In practice, large models like ChatGPT require significant resources per query, and as the use of generative AI continues to rise, so does the energy consumption of data centers.

The 2020 study that ChatGPT cites when talking about the 284 tons of CO2 emissions linked to training a large AI model is a study referenced in a report by UNESCO.

A more recent study from 2023 reported that training GPT-3 with 175 billion parameters resulted in carbon emissions of 502 metric tons of carbon, equivalent to driving 112 cars on petrol for a year.

According to a study conducted by scientists at MIT and published in 2024, the power requirements of data centers in North America alone grew from 2,688 megawatts at the end of 2022 to 5,341 megawatts by the end of 2023, with part of this increase attributed to the demands of generative AI. And the global electricity consumption of data centers rose to 460 TWh in 2022.

A generative AI training cluster can consume about seven or eight times more energy than a typical computing workload, according to Noman Bashir, Computing and Climate Impact Fellow at MIT Climate and Sustainability Consortium (MCSC), and the lead author of the study.

Moreover, what is not mentioned here is that resources (water and electricity) are also required to maintain idle servers even when they are not actively being used.


Is using ChatGPT bad for the environment?

ChatGPT: "The environmental impact of using ChatGPT depends on several factors, and whether it can be considered "bad" for the environment is a complex question...In isolation, the use of ChatGPT does contribute to environmental costs in terms of energy consumption and carbon emissions, especially during the training phase and in regions where fossil fuels power the data centers. However, it is not inherently "bad" for the environment if companies and users take steps toward mitigating these impacts. Many AI companies are committed to reducing their environmental footprint by adopting renewable energy, improving energy efficiency, and offsetting carbon emissions."

When asked a simple, single-line question like this, ChatGPT again highlights and breaks down the key points, but it sticks to providing broad answers without going into detail. With a bit of prodding, though, it's possible to get more specific data as well.

What is ChatGPT's carbon footprint (quantify it)?

ChatGPT: "Training large AI models like GPT-3 (which powers ChatGPT) requires massive computational resources. Here's an estimate based on available research:

Training a model like GPT-3 can consume over 1,287 MWh of energy. To put this into perspective, this is roughly the amount of energy consumed by 120 US homes for a year (according to the US Energy Information Administration).

The carbon emissions from training a large model like GPT-3 depend on the energy mix of the data centers. If we assume an average carbon intensity of 0.4 kg CO2 per kWh (this can vary by region and energy mix), the emissions from training GPT-3 would be approximately: 1,287MWh×1,000kWh/MWh×0.4kg CO2/kWh=514,800kg CO2 or about 515 metric tons of CO2."

This number is much closer to the estimate from the 2023 study cited above.
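ChatGPT's arithmetic here can be checked with a quick back-of-the-envelope calculation. The sketch below uses the figures from the chatbot's own answer (1,287 MWh of training energy and an assumed grid carbon intensity of 0.4 kg CO2 per kWh, which varies by region and energy mix):

```python
# Back-of-the-envelope check of the training-emissions arithmetic.
# Both figures come from ChatGPT's answer, not from a verified source;
# carbon intensity in particular depends heavily on the local grid.
training_energy_mwh = 1287   # estimated energy to train GPT-3
carbon_intensity = 0.4       # assumed kg CO2 per kWh

emissions_kg = training_energy_mwh * 1000 * carbon_intensity
emissions_tonnes = emissions_kg / 1000

print(f"{emissions_kg:,.0f} kg CO2, or about {emissions_tonnes:.0f} metric tons")
```

The multiplication does come out to roughly 515 metric tons, so the quoted figure is at least internally consistent.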

"While individual queries have a low carbon footprint, the overall environmental impact of ChatGPT will grow with usage...Each query generates approximately 0.12 grams of CO2. With millions of queries, this could add up to tens of metric tons of CO2 annually."

"For context, the carbon footprint of a single Google search is estimated to be about 0.3 grams of CO2, which is far lower because Google's models are optimized for efficiency. ChatGPT queries may be higher in comparison because of the model's size and complexity."
ChatGPT

A little more prodding — and ChatGPT tells us that a single prompt can be estimated to consume about 0.0003 kWh of energy. To put this into context, this is roughly equivalent to the energy consumed by a low-power LED light bulb (around 1 watt) running for 18 minutes.

It's worth noting that ChatGPT does underscore that while the energy consumption of a single query might be low, with millions of queries being made every day, it can add up.

According to OpenAI CEO Sam Altman, ChatGPT had 300 million weekly users as of December 2024.
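How quickly the per-query figures add up can be sketched with the numbers above. The per-query energy (0.0003 kWh) and per-query emissions (0.12 g CO2) are ChatGPT's own estimates, and the queries-per-user figure below is a purely hypothetical assumption for illustration:

```python
# Rough scaling of per-query footprint to aggregate weekly usage.
# Per-query figures are ChatGPT's own estimates; queries_per_user_per_week
# is a hypothetical assumption, not a reported statistic.
energy_per_query_kwh = 0.0003     # ~ a 1 W LED bulb running for 18 minutes
co2_per_query_g = 0.12
weekly_users = 300_000_000        # per Sam Altman, December 2024
queries_per_user_per_week = 10    # hypothetical assumption

weekly_queries = weekly_users * queries_per_user_per_week
weekly_energy_mwh = weekly_queries * energy_per_query_kwh / 1000
weekly_co2_tonnes = weekly_queries * co2_per_query_g / 1_000_000

print(f"~{weekly_energy_mwh:,.0f} MWh and ~{weekly_co2_tonnes:,.0f} t CO2 per week")
```

Even under this conservative assumption, the "low" per-query footprint scales to hundreds of megawatt-hours a week, which is the point ChatGPT gestures at but never quantifies.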

How does ChatGPT use water?

ChatGPT: "ChatGPT, like other large AI models, relies on extensive computing resources for training and inference, and water plays a role in cooling the infrastructure that supports these processes...The overall environmental impact of AI models, including the water used, is highly dependent on the geographical location of the data centers and the methods of cooling employed. While AI companies are increasingly transitioning to more sustainable energy sources and water-saving practices (like closed-loop cooling systems), water consumption remains a significant consideration for the environmental impact of these technologies."

Again, the response doesn't provide specific figures or even general estimates of how much water is consumed by data centers hosting AI models like ChatGPT.

This answer, like the ones before it, briefly touches on geographical location as a caveat but doesn't elaborate on how water usage varies with geography, or on the subsequent impact on the local environment. For example, data centers in arid regions may need more water for cooling than those in cooler climates. And in areas where fresh potable water is already scarce, this can create trouble for low-income communities.

The matter is further complicated by the fact that developing economies like India are prioritising the expansion of data centres "by giving incentives like subsidies in electricity, exemption from power cuts, and exemption from pollution control norms, to states to set up data centres," explains Dona Mathew, Senior Research Associate, Digital Futures Lab.

Broad Brushing Doesn’t Capture the Full Picture

In all the searches, ChatGPT was largely correct in broad strokes but not in details — and it made errors of omission that reflect a larger societal oversimplification of climate change.

It discusses the "indirect" environmental impact but overlooks the magnitude of this effect.

Interestingly, ChatGPT was also quick to point to the efforts being made by big tech companies to address the issue, even if in vague terms. Some version of this line shows up after pretty much every prompt:

Screenshot of ChatGPT conversation (as on 15 February 2025).

(Photo: Anoushka Rajesh/The Quint)


Each concern was also often followed with reassurances that, if properly managed, the negative effects wouldn't be so severe. In doing so, it seemed to downplay the overall impact, suggesting that with the right controls, the consequences could be minimised or even avoided entirely.

While transitioning to renewable energy is a crucial step in mitigating this impact, the growing energy demands of data centers are rapidly increasing — much faster than clean energy can currently accommodate — and are projected to become a major source of electricity demand in the next decade, as highlighted in the MIT study.

In the US, this has already led to delays in the retirement of fossil fuel power plants, and proposals to bring some retired fossil fuel and nuclear plants back online are being tabled.

According to the authors of the MIT study,

"The pace at which companies are building new data centers means the bulk of the electricity to power them must come from fossil fuel-based power plants."

Dona Mathew wasn't surprised that ChatGPT's responses are lacking — she tells The Quint, "Whatever data it is being trained would have elements of recent conversations online and general information around it. I'm not sure how accurate this information is."

A study published in December 2024 found bias in chatbot outputs while characterising the issues, causes, consequences, and solutions to environmental challenges.

The authors of the study, titled 'Does AI bias perceptions of environmental challenges?', found that chatbots tend to suggest small, traditional solutions to environmental problems, relying on past experiences, instead of recommending more drastic changes to current economic, social, and political systems.

"Chatbots avoid mentioning radical solutions to environmental challenges and instead suggest incremental approaches rooted in past experience. This last bias is particularly consequential given the incommensurability of incremental action with a narrowing timeframe for action on many environmental challenges," the study authors noted.

"There is definitely a bias against hard environmental action in the chatbot's responses," adds Mathew.

The bottom line, she says, is "we need more transparency," especially as generative AI becomes the primary tool for fetching information used by individuals, websites, and even entire web browsers.

