Can AI be green?

AI’s electricity appetite is rising fast.

North American data-centre demand doubled between late 2022 and late 2023, hitting roughly 5 GW (MIT News, 2024).

Training the original GPT-3 consumed about 1,287 MWh and emitted ~552 t CO₂ – emissions that would have fallen by 85% on Canada’s hydro grid (Solar Impulse Foundation, 2025). If the sector stays on its current trajectory, analysts warn global data-centre use could double again by 2030.
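A quick back-of-envelope check of those figures, as a sketch: the cited energy and emissions imply an average grid intensity of roughly 429 g CO₂/kWh, and applying the cited 85% reduction directly gives the hydro-grid number.

```python
# Back-of-envelope check of the GPT-3 figures cited above.
energy_mwh = 1287   # reported training energy
emissions_t = 552   # reported training emissions, tonnes CO2

# Implied average grid intensity: grams CO2 per kWh.
grid_intensity_g_per_kwh = emissions_t * 1_000_000 / (energy_mwh * 1_000)
print(round(grid_intensity_g_per_kwh))  # ~429 g CO2/kWh

# The cited 85% reduction on a hydro-heavy grid leaves roughly 83 tonnes.
hydro_emissions_t = emissions_t * (1 - 0.85)
print(round(hydro_emissions_t))  # ~83 t CO2
```

The same arithmetic works for any workload: energy used multiplied by the grid’s carbon intensity gives the emissions.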

The scale is real, but so are the levers

Clients often feel powerless: as a business or brand, you don’t control where major cloud providers (Amazon Web Services, Microsoft Azure or Google Cloud) host or run your AI models. You can, however, decide which model you use and who runs it. A compact model such as GPT-4o mini or Claude 3 Haiku handles many day-to-day tasks at a fraction of the energy of flagship models.
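In practice, "right-sizing" can be as simple as routing routine requests to a small model and reserving the flagship for genuinely complex work. A minimal sketch, assuming a keyword heuristic and illustrative model names (the tiers and keywords here are placeholders, not any provider’s API):

```python
# Route routine tasks to a small model; reserve the flagship for complex ones.
# Keywords and model names are illustrative assumptions.
ROUTINE_KEYWORDS = {"summarize", "classify", "extract", "translate"}

def pick_model(task_description: str) -> str:
    """Return a model tier based on a simple keyword heuristic."""
    words = set(task_description.lower().split())
    if words & ROUTINE_KEYWORDS:
        return "small-model"      # e.g. a "mini"/"Haiku"-class model
    return "flagship-model"

print(pick_model("summarize this meeting transcript"))   # small-model
print(pick_model("draft a negotiation strategy"))        # flagship-model
```

Real routers use classifiers or confidence scores rather than keywords, but even a crude rule like this captures the principle: the default should be the smallest model that does the job.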

Providers don’t all operate the same way. The carbon intensity of an AI output can vary dramatically depending on where the model is hosted and run. For example, Google’s data centre in Hamina, Finland runs on 97% carbon-free energy thanks to the region’s clean electricity mix. But the same workload running on a coal-dependent grid elsewhere could generate up to 20 times more emissions. That means the environmental impact of AI isn’t just about what you use, it’s also about where it happens.
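The "20 times" gap is just grid arithmetic. A sketch with illustrative intensity figures (the numbers below are assumptions for the comparison, not measured provider data):

```python
# Rough comparison of one workload's emissions on different grids.
# Intensities are illustrative assumptions, not measured provider data.
GRID_INTENSITY_G_PER_KWH = {
    "clean_grid": 30,   # assumed: ~97% carbon-free mix
    "coal_heavy": 600,  # assumed: coal-dependent grid
}

def workload_emissions_g(energy_kwh: float, region: str) -> float:
    """Emissions in grams of CO2 for a workload run in a given region."""
    return energy_kwh * GRID_INTENSITY_G_PER_KWH[region]

clean = workload_emissions_g(10, "clean_grid")  # 300 g
dirty = workload_emissions_g(10, "coal_heavy")  # 6000 g
print(dirty / clean)  # 20x difference for the identical workload
```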

Three questions to ask before your next AI task or project:

1. Is the model the right size for the job?
Larger doesn’t always mean better. Choose the smallest model that meets your needs – especially for routine or repeatable tasks. It’ll cost less, run faster, and consume far less energy.

2. How transparent is the provider?
Check what your vendor publishes about its infrastructure. Look for carbon intensity metrics, renewable energy targets, and water usage disclosures. If they’re silent on all three, that’s a red flag.

3. Can the workload be made more efficient?
Small changes can make a big difference: batch tasks, reuse outputs where possible, and use vector search or embeddings instead of regenerating full responses every time. Efficiency isn’t just about speed – it’s about impact.

Intent and impact

Net-zero pledges ring hollow if AI growth cancels the savings. Boards should demand lifecycle audits that account for electricity, water and hardware, and build environmental safeguards into every AI procurement and deployment.

You can’t control where an AI provider runs its models, but you can control which model you use, who provides it, and how efficiently you run it.

Making smart, low-impact choices isn’t always easy, but the questions above offer practical starting points.

Useful, lower-carbon AI is possible. But only if we design for efficiency from the start.
