How Much Energy Does AI Really Use? Understanding ChatGPT’s Carbon Footprint


You know those moments when life backs you into a corner and you have to reconsider something you swore you’d never do?
That was me with AI.

I was vehemently against it — loudly, proudly, and with a touch more dramatic flair than necessary.
The ethics. The unknowns. The “are we building our robot overlords?” vibes.
We can (and will) get into that in another post.

But then real life happened.

The work on my plate got bigger.
The hours I had available got smaller.
The pressure to know everything instantly? Persistent.

Somewhere between parenting, researching, and trying to keep all the balls in the air, I caved. 
I opened ChatGPT — very reluctantly.

And because I’m me, my first question was: is using ChatGPT secretly destroying the planet? Read on — you can decide for yourself and let me know.

First: What the Real Data Says (Not the Headlines)

Let’s ground this in facts, because the internet is awash with misinformation right now.

Three of the most reliable, politically neutral sources say this:

International Energy Agency (IEA)

AI accounts for a small share of global data center electricity use, and data centers as a whole consume about 1–1.5% of global electricity.
(That 1–1.5% includes Google searches, Netflix, email servers, Dropbox, iCloud, and every app you’ve ever forgotten to delete.)
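For the fellow spreadsheet people: here’s a tiny back-of-envelope sketch of what that percentage means in absolute terms. The global consumption figure is a rough assumption I’m plugging in for scale, not an official IEA number.

```python
# Rough scale check: global electricity use is on the order of
# 27,000 TWh per year (an approximation used here for illustration).
GLOBAL_ELECTRICITY_TWH = 27_000

low = GLOBAL_ELECTRICITY_TWH * 0.010   # 1.0% share
high = GLOBAL_ELECTRICITY_TWH * 0.015  # 1.5% share

print(f"All data centers combined: roughly {low:.0f}-{high:.0f} TWh per year")
```

And remember, AI is only a slice of that slice.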

Stanford AI Index Report (2024)

The big energy cost happens during training.
Normal use, called inference, requires orders of magnitude less energy.

Allen Institute for AI

Most news stories dramatically inflate AI energy use by confusing the energy needed for training with the energy for day-to-day use.

If this sounds like comparing “baking a birthday cake” to “slicing a piece of birthday cake,” you're not wrong.

The Part Nobody Talks About: Training vs. Daily Use

Training (the big energy event)

This happens once.
It uses significant energy, yes — similar to other large-scale scientific computing projects like weather modeling or pharmaceutical research.

But your everyday prompts are not retraining anything.

Inference (your everyday use)

This is what happens when you:

  • draft an email

  • ask for a recipe

  • research a new car

  • help with math homework

  • write a polite text when you are anything but polite

And its energy use is small — similar to:

  • 30–60 seconds of HD streaming

  • a few dozen emails

  • scrolling for a minute or two

In other words:
If you’re comfortable streaming, emailing, or using GPS, you are already engaging with vastly larger digital energy use.
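If you want to sanity-check that streaming comparison yourself, here’s a quick sketch. Both numbers are hedged assumptions: published per-prompt estimates vary widely (very roughly 0.3 to 3 watt-hours), and commonly cited figures put HD streaming around 80 watt-hours per hour.

```python
# Assumed figures for scale only; real values depend on model,
# provider, device, and network.
PROMPT_WH_LOW, PROMPT_WH_HIGH = 0.3, 3.0  # energy per prompt, in Wh
STREAMING_WH_PER_HOUR = 80.0              # HD streaming, in Wh per hour

# How many seconds of HD streaming match one prompt's energy?
secs_low = PROMPT_WH_LOW / STREAMING_WH_PER_HOUR * 3600
secs_high = PROMPT_WH_HIGH / STREAMING_WH_PER_HOUR * 3600
print(f"One prompt ~ {secs_low:.0f}-{secs_high:.0f} seconds of HD streaming")
```

That works out to roughly 14–135 seconds of streaming per prompt, which is the same ballpark as the 30–60 second figure above.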


Who’s Actually Responsible for the Energy Footprint?

Let’s make this very clear — and very fair.

1. Training Phase (big labs)

Who’s responsible:
OpenAI, Google, Meta, Microsoft, Amazon, Anthropic, and other AI labs that train large models.

This is where the one-time energy and water use happens.

Users’ role:
None. You did not ask anyone to train anything. You’re using what already exists.

2. Infrastructure (data centers + cooling)

Who’s responsible:
Cloud providers:

  • Amazon AWS

  • Microsoft Azure

  • Google Cloud

They choose where data centers are built, how they’re cooled, and whether they run on renewables or fossil fuels.

Users’ role: minimal. You don’t control how servers are powered.

3. Inference (daily use)

Who’s responsible:
The AI provider + data center operators.

Energy use: extremely small per prompt.

Users’ role: a drop in the ocean.

4. System-Level Responsibility

Governments, regulators, and energy providers set the rules:

  • renewable adoption

  • efficiency standards

  • reporting requirements

  • carbon targets

These choices outweigh anything an individual user could do.

Am I Still Morally Connected to the Original Training Cost?

Short answer:
Yes — philosophically.
The model exists because training happened.

But practically?
My per-use footprint is tiny, because that energy was spent once for the benefit of millions of users.

It’s like feeling morally responsible for the carbon footprint of an airplane you didn’t manufacture — but you did, eventually, sit in seat 14B.

Also — and this is the part that gave me peace — the world was moving ahead with AI whether I personally participated or not.
5G rolled out.
Cloud infrastructure expanded.
Governments and institutions were already two steps ahead.

My individual moral stance wasn’t pausing global development.
It was just slowing down my workflow.

So Let’s Make This Tangible:

How Much Energy Does a Week of Regular AI Use Actually Consume?

Let’s imagine a very normal, very human week of AI use:

  • 2–3 prompts to draft a work email

  • 10–15 prompts researching a new car

  • 1–2 prompts finding a dinner recipe

  • 4–6 prompts helping a middle-schooler with homework

  • 1 “can dogs eat basil?” question (important)

  • 3–5 prompts planning meals

Total: 25–35 prompts per week.

Here’s what that translates to:

= One load of laundry

Yes. A single load.

= A 5–7 mile drive

Basically: the distance you drive because you forgot your wallet.

= Less than one seat-mile of energy on a commercial flight

You would need to use AI thousands of times in a week to equal one cross-country flight per person.

= Leaving an LED bulb on for 2–4 hours

That’s it.

Not dramatic. Not infrastructural. Not planet-ending.
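If you like to see the math, here’s the week tallied up. I’m using the same hedged per-prompt range as above (roughly 0.3 to 3 watt-hours, an assumption, not a vendor-published figure) and a typical 10-watt LED bulb.

```python
# A week of "very normal, very human" AI use, from the list above.
prompts_low, prompts_high = 25, 35

# Assumed per-prompt energy range, in watt-hours.
wh_per_prompt_low, wh_per_prompt_high = 0.3, 3.0

week_low = prompts_low * wh_per_prompt_low     # best case for the week
week_high = prompts_high * wh_per_prompt_high  # worst case for the week

LED_WATTS = 10  # a typical LED bulb
print(f"Weekly total: ~{week_low:.0f}-{week_high:.0f} Wh")
print(f"= an LED bulb left on for "
      f"{week_low / LED_WATTS:.1f}-{week_high / LED_WATTS:.1f} hours")
```

Even at the high end, that’s about 100 watt-hours for the whole week, the kind of number that hides in the rounding error of a household electric bill.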

Where This Leaves Me (And Maybe You)

I’m still cautious about AI.
I still have ethical concerns, questions about guardrails, and thoughts about long-term cultural impact.
But I’m also realistic.

My time is finite.
My work is real.
And AI helps me get to better work faster — research, editing, clarity — not replacing thought, but supporting it.

So yes, I use AI.
I use it intentionally.
I ask hard questions about it.

Now you get to decide where you land.
If anything here made you pause, send it to a friend and start the conversation.
I’d love to hear what you decide.

What We’re Loving Right Now

We love transparency — especially when it cuts through sensationalism.
We love the growing research making AI’s energy footprint easier to understand.
And we love tools that actually help households lower energy use elsewhere, like:

  • smart thermostats

  • high-efficiency washers

  • induction cooktops

  • heat pumps

These are areas where energy savings are actually meaningful — and where small decisions add up quickly.

FAQ

Is AI worse for the environment than streaming video?

No. Streaming—especially HD and 4K—uses significantly more energy than AI prompts.

Does using AI contribute meaningfully to climate change?

Not at normal household usage levels.

Why do headlines make AI sound catastrophic?

Because they usually report training energy figures as if they were daily-use figures.

Should I feel guilty about using AI?

If you’re comfortable using cloud storage, GPS, social media, email, or streaming platforms, your AI usage fits squarely in the same category.

TL;DR

  • AI uses energy — but not remotely at the level headlines suggest.

  • Training large models uses the most energy; you aren’t retraining anything when you ask AI to help draft an email.

  • A week of typical AI usage equals roughly:

    • one load of laundry, or

    • a 5–7 mile drive, or

    • less than one mile of commercial flight energy per seat.

  • Daily AI prompts use about the same energy as a few minutes of video streaming.

The biggest responsibility lies with AI labs, cloud providers, governments, and infrastructure, not individual users.

Affiliate Disclaimer

As always, What We’re Loving Right Now only recommends products we genuinely use, adore, and would gift to our own best friends. Some of the links in this post are affiliate links, which means we may earn a small commission — the kind that helps us keep sharing the things we truly love (and never the ones we don’t).


Stefanie Bales