
AI datacentres need huge amounts of power — what UK readers should ask next

[Illustration: a glowing AI datacentre connected to wind turbines, pylons and UK homes, with people studying an electricity meter.]

The UK government wants Britain to become a serious AI economy. That ambition now has a very practical question attached to it: where will all the electricity come from?

The Guardian has reported that two parts of government appear to be working with very different assumptions about how much power AI datacentres may need by 2030. The Department for Science, Innovation and Technology’s own compute roadmap says the UK may need at least 6GW of AI-capable datacentre capacity by the end of the decade. The Department for Energy Security and Net Zero, meanwhile, has reportedly pointed to wider commercial-services energy forecasts that imply a much smaller increase.
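To get a feel for what 6GW could mean, here is a rough back-of-envelope sketch. The capacity figure comes from the roadmap cited above; everything else (full-time utilisation, a typical household using around 2,700kWh of electricity a year) is an illustrative assumption, not an official forecast.

```python
# Back-of-envelope: what might 6GW of AI datacentre capacity mean in
# household terms? Illustrative assumptions only, not official statistics.

CAPACITY_GW = 6.0               # DSIT compute roadmap figure cited above
HOURS_PER_YEAR = 8760
UTILISATION = 1.0               # upper bound: running flat out all year (assumption)
HOUSEHOLD_KWH_PER_YEAR = 2700   # rough typical UK household electricity use (assumption)

# Annual energy in terawatt-hours: GW x hours / 1000
annual_twh = CAPACITY_GW * HOURS_PER_YEAR * UTILISATION / 1000

# How many typical households would use that much electricity in a year?
equivalent_households = annual_twh * 1e9 / HOUSEHOLD_KWH_PER_YEAR

print(f"{annual_twh:.1f} TWh/year, roughly {equivalent_households / 1e6:.0f} million households")
```

Real datacentres do not run at 100% utilisation, and household figures vary, so treat this as an order-of-magnitude comparison rather than a prediction. Even so, it shows why a disagreement between departments over the scale of demand is not a trivial one.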

That may sound like a Whitehall spreadsheet problem. It is not. AI depends on datacentres: buildings full of specialist chips, cooling systems, backup power and network equipment. If the UK builds many more of them, the impact will not be limited to technology firms. It could affect planning decisions, local electricity infrastructure, water use, climate targets, and the cost and reliability of the digital services people use every day.

Why AI needs so much power

When you ask a chatbot to summarise a document, generate an image, write code or plan a holiday, the work does not happen inside your phone. Most of it happens in a datacentre, often using powerful graphics chips designed to run huge numbers of calculations in parallel.

Those chips need electricity. So do the cooling systems that keep them working. The biggest AI systems also need large amounts of power while they are being trained, and more again when millions of people use them afterwards. That is why the government’s UK Compute Roadmap treats compute as national infrastructure rather than just another technology service.

The same roadmap says the government wants AI Growth Zones, more public compute, new supercomputing capacity and private investment in AI datacentres. It also says energy needs will have to be addressed through the AI Energy Council and options such as renewables, advanced nuclear and grid improvements.

In plain English: the AI boom is not only about clever software. It is also about substations, planning permission, cables, cooling, land, water and long-term energy choices.

Why ordinary UK readers should care

Most people will never visit an AI datacentre. But they may feel the effects indirectly.

First, there is the question of public priorities. If the UK wants faster AI tools in hospitals, schools, councils and businesses, it needs enough infrastructure to run them. That could be useful. AI can help with research, admin, accessibility, fraud checks, customer support and many other tasks. But infrastructure choices involve trade-offs, especially when the electricity grid is already under pressure from electric vehicles, heat pumps, homes and industry.

Second, local communities may be asked to host large datacentres. These sites can bring jobs and investment, but they can also raise concerns about noise, energy demand, water use, visual impact and whether local people see much benefit. The best time to ask those questions is before a project is waved through, not after construction begins.

Third, there is a climate point. The UK’s AI plans are being developed alongside net-zero commitments. If departments disagree about the likely energy demand, it becomes harder for the public to judge whether the plan is realistic. Clean power can reduce the emissions impact, but it does not remove the need for honest accounting.

This is not a reason to reject AI

It would be too simple to say “AI uses electricity, so AI is bad”. Almost every modern technology has an energy cost, including streaming video, online banking, cloud storage and mobile networks. The useful question is whether the benefits are clear enough, the costs are honestly measured, and the people affected have a say.

Some uses of AI may be worth the extra compute. A system that helps researchers discover new medicines, improves public services or cuts waste in a supply chain could deliver real value. Other uses may be harder to justify, especially if they involve low-value novelty features, endless generated spam, or tools that simply shift costs onto workers and consumers.

This is a good moment to remember a wider ManyHands theme: practical AI is not magic. Whether you are using AI at work, giving an assistant more permissions, or relying on AI search, the same habit helps. Ask what the tool is for, what it costs, what data it uses, and who is accountable when it goes wrong.

What to watch next

The immediate issue is transparency. If one public document suggests a large AI datacentre build-out and another part of government appears to model much lower energy growth, ministers need to explain how the numbers fit together.

For readers, three questions are worth following:

  • Will local planning decisions clearly state the energy and water needs of AI datacentres?
  • Will the government publish consistent forecasts for AI compute demand, grid upgrades and emissions?
  • Will communities hosting AI infrastructure get visible benefits, not just construction disruption and higher demand on local networks?

The UK’s AI ambitions may still prove useful and worthwhile. But if AI is going to become part of everyday life, the infrastructure behind it needs to be discussed in everyday language too. People should not have to understand a compute roadmap to ask a simple question: who pays for the power behind the bot?