AI is not hungry for data: it is hungry for electricity.


In 2026, the AI race is running up against a physical limit: energy, data centers, and costs. The digital future is looking more and more like a heavy industry

To understand where artificial intelligence is really heading, it helps to stop looking at the demos for a moment and start looking at the meter.

Because the real story these days is not just that Big Tech wants to spend around $635 billion on AI infrastructure in 2026, as reported by S&P Global. The real story is that this race could stall not because of a lack of ideas, but because of a lack of energy, rising costs, and physical bottlenecks. Translation: AI is starting to look less like a software business and more like a heavy industry.

It is an almost comical change of nature, if you think back to the fairy tale of “the cloud.” Behind the cloud there are buildings, cables, transformers, chips, turbines, and power supply contracts. And in fact, Microsoft, Chevron, and Engine No. 1 have signed an exclusive agreement to work on next-generation energy and new power supply for data centers.

In other words: the software lords are making deals with the gas lords. So much for a lightweight future.

The bottleneck is no longer the model

If the big tech groups are planning hundreds of billions in spending but have to deal with unstable energy prices, insufficient grid capacity, and long construction times for data centers, then the race will not be decided only in laboratories but in infrastructure. And that changes everything.

Anyone who thinks this is a technical detail should take a look inside the material heart of the digital world, namely data centers, and also remember how much energy the internet really consumes. Generative AI does not float in thin air: it chews through electricity, water, silicon, land, and debt. The more it grows, the more its futurist narrative gets stained by political economy.

What is interesting is that this energy dependence is not an accident: it is the structural price of a paradigm built on ever-larger models, ever-wider inference, always-on agents, and services increasingly integrated into every piece of software and every device. Behind every new “magic feature” sits a back room of computing power.

And computing power, at the moment, translates into power plants, industrial agreements, and enormous capital expenditure.

If AI needs gas to run, the game has changed

When a company like Microsoft moves to secure new generation capacity alongside its data centers, it means the problem is no longer just buying chips or reserving cloud capacity. It means energy demand has become so strategic that Big Tech now has to behave like a utility, an industrial conglomerate, almost like a supply-chain planner.

This has at least three consequences. The first: it raises the barrier to entry, because it is no longer enough to have a good model, you also need access to power. The second: it makes AI far more exposed to geopolitical shocks, energy prices, environmental regulation, and debt finance. The third: it dismantles the harmless narrative that digital technology is somehow automatically “immaterial,” because this is an industry with extremely high energy intensity.

That does not necessarily mean it will slow the race down, but it certainly makes it more expensive, dirtier, and more selective. Whoever has capital, land, electricity, and access to the grid starts ahead. Whoever only has the rhetoric of innovation falls behind. That is why the real AI story in 2026 may not be the next chatbot, but the struggle to power it without blowing up the costs.

Artificial intelligence is not just becoming smarter. It is also becoming heavier and more polluting, and it will require a new politics of energy.
