In the era of AI, utilities now face an unexpected new problem: ghost data centers. At first glance, this may seem absurd: why (and how) would anyone fake something as complex as a data center? But as demand for AI skyrockets, along with the need for more computing power, speculation around data center development is creating chaos, particularly in regions like Northern Virginia, the data center capital of the world. In this evolving landscape, utilities are bombarded with requests for electricity from real estate developers who may or may not actually build the infrastructure they claim.
Fake data centers represent an urgent bottleneck in scaling data infrastructure to meet compute demand. This emerging phenomenon prevents capital from flowing where it is needed most. Any company that can help resolve this issue – perhaps by using AI to solve a problem created by AI – will have a significant advantage.
The mirage of gigawatt demands
Dominion Energy, Northern Virginia’s largest utility, has received aggregate requests for 50 gigawatts of power from data center projects. That’s more power than the entire nation of Iceland consumes.
But many of these claims are either speculative or downright false. Developers eye potential sites and claim power capacity long before they have the capital or the expertise to deliver. In fact, estimates suggest that up to 90% of these claims will never materialize.
In the early days of the data center boom, utilities never had to worry about false demand. Companies like Amazon, Google and Microsoft – nicknamed “hyperscalers” because they operate data centers with hundreds of thousands of servers – submitted straightforward power requests, and utilities simply delivered. But today, the frenzy to secure power capacity has led to an influx of applications from lesser-known developers and speculators with questionable track records. Utilities, which traditionally served only a handful of energy-hungry customers, are suddenly inundated with requests for electrical capacity that would dwarf their entire grids.
Utilities struggle to separate fact from fiction
The challenge for utilities is not just technical; it is existential. Their job now is to determine what is real and what is not, and they are not well equipped to do it. Historically, utilities have been slow, risk-averse institutions. Now they are being asked to vet speculators, many of whom are simply playing a real estate game, hoping to flip their power allocations once the market heats up.
Utilities have economic development groups, but those teams aren’t used to handling dozens of speculative requests at once. This is akin to a land rush, where only a fraction of those claiming stakes actually plan to build something tangible. The result? Paralysis. Utilities are reluctant to allocate electricity when they don’t know which projects will come to fruition, slowing down the entire development cycle.
A wall of capital
There’s no shortage of capital flowing into the data center industry, but that abundance is part of the problem. When capital is easily accessible, it invites speculation. In some ways, this is the classic better-mousetrap problem: too many players chasing an oversupplied market. This influx of speculators creates indecision not only within utilities but also within local communities, which must decide whether to grant permits for land use and infrastructure development.
What adds to the complexity is that data centers are not just for AI. AI is certainly driving the surge in demand, but there is also a persistent need for conventional cloud computing. Developers are building data centers to serve both, and it’s increasingly difficult to tell the difference between the two, especially when projects mix AI hype with traditional cloud infrastructure.
What is real?
Legitimate players – the aforementioned Amazon, Google and Microsoft – are building real data centers, and many of them are adopting strategies such as “behind-the-meter” deals with renewable energy providers or building microgrids to avoid grid interconnection bottlenecks. But as real projects proliferate, so do fake ones. Developers with little experience in the field are trying to cash in, creating an increasingly chaotic environment for utilities.
The problem is not just financial risk – although the capital required to build a gigawatt-scale campus can easily exceed several billion dollars – it is also the sheer complexity of developing infrastructure at this scale. A 6-gigawatt campus sounds impressive, but financial and technical realities make it nearly impossible to build in any reasonable time frame. Yet speculators throw around these massive numbers, locking up power capacity in hopes of flipping the project later.
Why the grid can’t meet data center demand
As utilities struggle to separate fact from fiction, the grid itself is becoming a bottleneck. McKinsey recently estimated that global demand for data centers could reach up to 152 gigawatts by 2030, adding 250 terawatt-hours of new electricity demand. In the United States, data centers alone could account for 8% of total electricity demand by 2030, a staggering figure considering how little demand has grown over the past two decades.
The grid, however, is not ready for this influx. Interconnection and transmission problems are endemic, and estimates suggest that the United States could run out of electrical capacity between 2027 and 2029 if alternative solutions are not found. Developers are increasingly turning to on-site generation, such as gas turbines or microgrids, to sidestep the interconnection bottleneck, but these stopgaps only underscore the grid’s limitations.
Conclusion: utilities as gatekeepers
The real bottleneck isn’t a lack of capital (trust me, there’s plenty of that) or even technology – it’s the ability of utilities to act as gatekeepers, determining who is real and who is merely playing the speculation game. Without a robust process for vetting developers, the grid risks being clogged with projects that never come to fruition. The age of fake data centers is here, and until utilities adapt, the entire industry may struggle to keep pace with real demand.
In this chaotic environment, it’s not just about allocating power; it’s about utilities learning to navigate a new speculative frontier so that business (and AI) can thrive.
Sophie Bakalar is a partner at Collaborative Fund.