Leaving the Lights On: The Hidden Carbon Cost of Lazy AI

Dec 20, 2025

We spend a lot of time talking about AI ethics, bias, hallucinations, job displacement, who controls what, and where all of this might lead humanity. Those conversations matter. But there’s another dimension of responsibility that gets far less attention and feels much more immediate to me: energy efficiency.

Not efficiency in a policy-paper sense, and not in some distant future where regulations catch up, but in the everyday way we use AI right now. Every prompt, every regeneration, every “analyze this” comes with a real energy cost. A single ChatGPT-style query can consume several times more energy than a traditional Google search, with some estimates putting it close to an order of magnitude higher. More compute means more servers running, more electricity flowing, and more emissions embedded in what looks like a harmless question box on a screen.
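To make that multiplier concrete, here is a minimal back-of-envelope sketch in Python. The per-query energy figure and the daily query volume are illustrative assumptions, not measurements; only the order-of-magnitude multiplier comes from the estimate above.

```python
# Back-of-envelope: what an order-of-magnitude multiplier means at scale.
# All constants below are illustrative placeholders, not measured values.

SEARCH_WH_PER_QUERY = 0.3        # assumed energy per traditional search (Wh)
AI_MULTIPLIER = 10               # "close to an order of magnitude higher"
QUERIES_PER_DAY = 1_000_000_000  # hypothetical daily query volume

search_mwh = SEARCH_WH_PER_QUERY * QUERIES_PER_DAY / 1e6  # Wh -> MWh
ai_mwh = search_mwh * AI_MULTIPLIER

print(f"Search workload: {search_mwh:,.0f} MWh/day")
print(f"AI workload:     {ai_mwh:,.0f} MWh/day")
print(f"Extra demand:    {ai_mwh - search_mwh:,.0f} MWh/day")
```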

What really makes this uncomfortable is watching how casually AI is used in practice. Vague prompts that force models to overthink. Long responses generated with no real intention to read them. The same question asked five different ways just to see what comes out. Analysis requested for problems the user doesn’t actually care to act on. At scale, this behavior isn’t clever or curious; it’s wasteful.

It’s not very different from leaving the lights on in an empty room. If someone ran their AC with the windows open and justified it by saying the grid was getting cleaner, we’d still recognize the inefficiency. With AI, though, we’ve convinced ourselves that compute is somehow infinite. It isn’t.

Data centers are already one of the fastest-growing energy loads in the world, and AI acts as a powerful multiplier. This isn’t just another layer of software on top of the internet; it’s an accelerator of demand, and the curve is steep. The problem isn’t that AI uses energy; everything valuable does. The problem is how little intention we bring to its use.

So far, “responsible AI” has been framed almost entirely around ethics and governance. That framing is necessary, but it’s incomplete. Responsibility also means efficiency. It means writing clearer prompts, asking for the minimum useful output, actually reading before regenerating, and not outsourcing thinking you don’t plan to use. These aren’t productivity hacks—they’re energy decisions.
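As one concrete example of asking for the minimum useful output, here is a minimal sketch using the OpenAI Python SDK; the model name, the token cap, and the prompt are illustrative assumptions, not recommendations.

```python
# A tight prompt with an explicit output budget, so the model
# doesn't generate pages nobody intends to read.
# Assumes the openai package is installed and OPENAI_API_KEY is set;
# the model choice and token limit are placeholders.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o-mini",  # hypothetical: a smaller model where it suffices
    messages=[
        {
            "role": "user",
            # A specific question with a stated output budget,
            # instead of "analyze this" followed by a wall of text.
            "content": "In 3 bullet points, list the main trade-offs of "
                       "caching API responses. No preamble.",
        }
    ],
    max_tokens=150,  # hard cap on generated output
)

print(response.choices[0].message.content)
```

The same habit works with any provider: a specific question in, a bounded answer out, and fewer regenerations because the first response was scoped to what you actually needed.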

Efficient prompts don’t just produce better answers. They reduce compute. They lower demand. They quietly cut emissions. Bad prompts do the opposite, scaling waste at machine speed.

Every major technology eventually forces a new kind of literacy. Electricity did. Cars did. The internet did. AI is doing it now. If AI is becoming core infrastructure, closer to energy, water, or transportation than to a novelty app, then how we use it matters just as much as what it can do.

Waste is still waste. It just runs on GPUs now.

So here’s the question I keep coming back to: if AI is on track to become one of the largest energy consumers on Earth, shouldn’t the way we prompt it be part of climate literacy?