Phantom data centers: What they are (or aren’t) and why they’re hampering the true promise of AI




In the age of AI, public utilities are now facing a new, unexpected problem: Phantom data centers. On the surface, it may seem absurd: Why (and how) would anyone fabricate something as complex as a data center? But as AI demand skyrockets along with the need for more compute power, speculation around data center development is creating chaos, particularly in areas like Northern Virginia, the data center capital of the world. In this evolving landscape, utilities are being bombarded with power requests from real estate developers who may or may not actually build the infrastructure they claim.

Fake data centers represent an urgent bottleneck in scaling data infrastructure to keep up with compute demand. This emerging phenomenon is preventing capital from flowing where it's actually needed. Any enterprise that can help solve this problem — perhaps leveraging AI to solve a problem created by AI — will have a significant edge.

The mirage of gigawatt demands

Dominion Energy, Northern Virginia’s largest utility, has received aggregate requests for 50 gigawatts of power from data center projects. That’s more power than the entire country of Iceland consumes. 

But many of these requests are either speculative or outright false. Developers are eyeing potential sites and staking claims to power capacity long before they have the capital or a concrete plan to break ground. In fact, estimates suggest that as much as 90% of these requests are entirely bogus.

In the early days of the data center boom, utilities never had to worry about fake demand. Companies like Amazon, Google and Microsoft — dubbed “hyperscalers” because they operate data centers with hundreds of thousands of servers — submitted straightforward power requests, and utilities simply delivered. But now, the frenzy to secure power capacity has led to an influx of requests from lesser-known developers or speculators with dubious track records. Utilities, which traditionally deal with only a handful of power-hungry customers, are suddenly swamped with orders for power capacity that would dwarf their entire grid.

Utilities struggle to sort fact from fiction

The challenge for utilities isn’t just technical — it’s existential. They’re tasked with determining what’s real and what’s not. And they’re not well-equipped to handle this. Historically, utilities have been slow-moving, risk-averse institutions. Now they’re being asked to vet speculators, many of whom are simply playing the real estate game, hoping to flip their power allotments once the market heats up.

Utilities have groups tasked with economic development, but these teams are not used to dealing with dozens of speculative requests at once. It’s akin to a land rush, where only a fraction of those claiming stakes actually plan to build something tangible. The result? Paralysis. Utilities hesitate to allocate power when they don’t know which projects will materialize, slowing down the entire development cycle.

A wall of capital

There’s no shortage of capital flowing into the data center space, but that abundance is part of the problem. When capital is easy to access, it leads to speculation. In a way, this is similar to the better mousetrap problem: Too many players chasing an oversupplied market. This influx of speculators creates indecision not just within utilities but also in local communities, which must decide whether to grant permits for land use and infrastructure development.

Adding to the complexity is that data centers aren’t just for AI. Sure, AI is driving a surge in demand, but there’s also a persistent need for cloud computing. Developers are building data centers to accommodate both, but differentiating between the two is increasingly difficult, especially when projects blend AI hype with traditional cloud infrastructure.

What’s real?

The legitimate players — the aforementioned Amazons, Googles and Microsofts — are building genuine data centers, and many are adopting strategies like “behind-the-meter” deals with renewable energy providers or constructing microgrids to avoid the bottlenecks of grid interconnection. But as real projects proliferate, so too do the fake ones. Developers with little experience in the space are trying to cash in, leading to an increasingly chaotic environment for utilities.

The problem isn’t just financial risk — although the capital required to build a single gigawatt-scale campus can easily exceed several billion dollars — it’s the sheer complexity of developing infrastructure at this scale. A 6-gigawatt campus sounds impressive, but the financial and engineering realities make it almost impossible to build in a reasonable timeframe. Yet speculators throw these massive numbers around, hoping to secure power capacity they can flip later.

Why the grid can’t keep up with data center demands

As utilities struggle to sort fact from fiction, the grid itself becomes a bottleneck. McKinsey recently estimated that global data center demand could reach up to 152 gigawatts by 2030, adding 250 terawatt-hours of new electricity demand. In the U.S., data centers alone could account for 8% of total power demand by 2030, a staggering figure considering how little demand has grown in the last two decades.
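To make sense of figures like these, it helps to keep capacity (gigawatts) and annual energy (terawatt-hours) distinct. A back-of-the-envelope conversion — assuming, hypothetically, an average utilization of around 80%, since data centers run near-constant loads — shows what an installed gigawatt figure implies over a year:

```python
# Back-of-the-envelope: convert installed data center capacity (GW)
# into annual electricity consumption (TWh) under an assumed
# average utilization (load factor). The 80% figure below is an
# illustrative assumption, not a number from the article.

HOURS_PER_YEAR = 8760  # 24 hours * 365 days

def annual_twh(capacity_gw: float, load_factor: float) -> float:
    """Annual energy in TWh for a given capacity and utilization."""
    # GW * hours = GWh; divide by 1,000 to get TWh
    return capacity_gw * load_factor * HOURS_PER_YEAR / 1000

# McKinsey's 152 GW global estimate, at an assumed 80% utilization:
print(round(annual_twh(152, 0.80)))  # on the order of 1,065 TWh per year
```

The point of the sketch is scale: even a single gigawatt of near-constant load draws several terawatt-hours a year, which is why request queues measured in tens of gigawatts dwarf what utilities have historically planned for.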

Yet the grid is not ready for this influx. Interconnection and transmission issues are rampant, with estimates suggesting the U.S. could run out of power capacity between 2027 and 2029 if alternative solutions aren’t found. Developers are increasingly turning to on-site generation like gas turbines or microgrids to avoid the interconnection bottleneck, but these stopgaps only serve to highlight the grid’s limitations.

Conclusion: Utilities as gatekeepers

The real bottleneck isn’t a lack of capital (trust me, there’s plenty of capital here) or even technology — it’s the ability of utilities to act as gatekeepers, determining who is real and who is just playing the speculation game. Without a robust process to vet developers, the grid risks being overwhelmed by projects that will never materialize. The age of fake data centers is here, and until utilities adapt, the entire industry may struggle to keep pace with the real demand.

In this chaotic environment, it’s not just about power allocation; it’s about utilities learning to navigate a new, speculative frontier so that enterprises (and AI) can thrive.

Sophie Bakalar is a partner at Collaborative Fund.
