AI's Hidden Environmental Cost: Why 'Please' in Prompts Isn't the Real Issue

A curious idea has been circulating online: that removing words like 'please' and 'thank you' from your ChatGPT prompts could help save the planet. The logic seems sound at first glance. AI systems process text incrementally, so longer prompts require slightly more computation and therefore consume a bit more energy. OpenAI's chief executive, Sam Altman, has himself noted that such small increments add up to significant operating costs at the scale of billions of queries.

The Negligible Impact of Polite Language

However, experts are clear that the environmental cost of politeness is effectively a myth. While the underlying premise is technically accurate, the energy difference from a few extra words is utterly negligible compared with the vast power required to run the underlying data centre infrastructure. The persistence of this belief is perhaps more telling than the claim itself. It indicates a public intuition that artificial intelligence is not the immaterial, cloud-based service it often appears to be, but a technology with a tangible physical and environmental footprint.
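The negligibility argument can be made concrete with back-of-envelope arithmetic. The figures below are purely illustrative assumptions, not measured values: a hypothetical marginal energy cost per processed token, a handful of tokens for the polite words, and an assumed typical prompt-plus-response length.

```python
# Back-of-envelope comparison: the energy attributable to a few polite
# extra tokens versus the token cost of the whole query.
# ALL numbers here are illustrative assumptions, not measurements.

ENERGY_PER_TOKEN_WH = 0.0003   # assumed marginal energy per token (Wh)
POLITE_TOKENS = 4              # "please" + "thank you" is only a few tokens
TYPICAL_QUERY_TOKENS = 1000    # assumed prompt + response length in tokens

extra_wh = POLITE_TOKENS * ENERGY_PER_TOKEN_WH        # cost of politeness
query_wh = TYPICAL_QUERY_TOKENS * ENERGY_PER_TOKEN_WH # cost of the query
share = extra_wh / query_wh                           # fraction of the total

print(f"Polite words add {extra_wh:.4f} Wh, "
      f"{share:.1%} of the query's own token cost")
```

Whatever per-token figure is assumed, it cancels out of the ratio: a few tokens against a thousand is a fraction of a percent of a single query, before the far larger fixed costs of the data centre are even counted.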

This instinct is worth serious consideration. AI depends on large, high-density data centres that draw substantial electricity and require continuous, water-intensive cooling. These facilities are embedded in complex systems of energy supply, water management, and land use. As adoption of AI expands globally, so too does this underlying resource footprint. The critical environmental question, therefore, shifts from how individual prompts are phrased to how frequently and intensively these powerful systems are used.

Why Every AI Query Carries a Fresh Energy Bill

A fundamental structural difference explains why AI consumption is so resource-intensive. When you stream a video or open a document, the system is primarily retrieving existing data; the major energy cost has already been incurred. In contrast, each query to a large AI model like ChatGPT triggers a fresh computation, known as an 'inference'. This requires a full computational pass through the model, generating energy demand anew every single time.
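The structural difference described above can be sketched in a few lines. The energy figures are hypothetical placeholders; the point is the shape of the two cost curves, not the specific values: retrieval pays a small per-request serving cost, while inference pays a full computation on every request.

```python
# Illustrative contrast between serving stored content and AI inference.
# Both energy constants are assumed values chosen only to show the scaling.

RETRIEVAL_WH = 0.001  # assumed energy to serve an already-stored item
INFERENCE_WH = 0.3    # assumed energy for one full pass through a model

def retrieval_cost(n_requests: int) -> float:
    # Content exists already; each request only retrieves and transmits it.
    return n_requests * RETRIEVAL_WH

def inference_cost(n_requests: int) -> float:
    # Each request triggers a fresh computation through the entire model.
    return n_requests * INFERENCE_WH

for n in (1, 1_000, 1_000_000):
    print(f"{n:>9} requests: retrieval {retrieval_cost(n):>10.3f} Wh, "
          f"inference {inference_cost(n):>12.3f} Wh")
```

Under these assumptions both costs grow linearly with use, but inference grows along a far steeper line, which is why aggregate usage, rather than prompt wording, dominates the footprint.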

This characteristic makes AI behave less like conventional software and more like heavy infrastructure, where use translates directly into energy demand. The scale is already significant. Research published in the journal Science estimates data centres account for a considerable share of global electricity use, with demand rising rapidly due to AI. The International Energy Agency (IEA) has warned that electricity demand from data centres could double by 2030 if current growth trajectories continue.

Beyond Electricity: A Systemic Footprint

Electricity is only one part of the environmental equation. Data centres also require large volumes of water for cooling, and their construction consumes land and materials and creates long-lived physical assets. These impacts are felt locally, even when the services provided are global.

The situation in New Zealand, highlighted by Richard Morris, a Postdoctoral Fellow at Lincoln University, offers a clear case study. While the country's high share of renewable electricity attracts data centre operators, this demand is not impact-free. Large facilities can pressure local grids, and claims of renewable supply don't always equate to new green generation being built. Electricity powering servers is electricity unavailable for other critical uses, especially during dry years when hydroelectric generation is constrained.

Viewed systemically, AI introduces a new metabolic load into regions already straining under climate change, population growth, and competing resource demands. Energy, water, land, and infrastructure are tightly linked; a change in one area propagates through the entire system.

Moving Beyond the Myth to Mature Policy

This interconnected reality matters profoundly for climate adaptation and long-term planning in the UK and worldwide. While adaptation work often focuses on land management and resilient infrastructure, AI development is frequently planned in isolation, as if it were a purely digital concern rather than a physical one with persistent resource needs.

The popularity of the 'please' myth is less a factual error and more a signal. The public senses AI has a substantial footprint, even if the precise language to discuss it is still evolving. Taking this intuition seriously opens the door to a more grounded conversation. The consequential questions are no longer about politeness in prompts, but about how AI infrastructure is integrated into national energy planning, how its water use is managed, and how its growing demand competes with other social and environmental priorities.

This is not an argument to reject AI, which delivers immense value across healthcare, research, and logistics. But like any major infrastructure, it carries significant costs alongside its benefits. Treating AI as immaterial software obscures these costs. Acknowledging its physical nature is the first step towards managing its environmental impact responsibly as its role in our society expands.