According to TechSpot, faced with grid connection queues stretching up to seven years, AI data center developers are turning to unconventional on-site power. Companies like GE Vernova and ProEnergy are seeing a roughly one-third surge in orders for aeroderivative turbines, which are adapted from jet engines such as CF6-80C2 cores taken from Boeing 747s. GE Vernova is supplying nearly one gigawatt of this “bridge power” to the Stargate facility in Texas, a project by OpenAI, Oracle, and SoftBank. Boom Supersonic, backed by Sam Altman, has agreed to sell 1.2 gigawatts’ worth of turbines virtually identical to its jet engines to developer Crusoe. Meanwhile, diesel generator giant Cummins has sold over 39 gigawatts of capacity to data centers this year, nearly doubling output, as these units shift from backup to primary power sources.
The Desperate Scramble for Watts
Here’s the thing: the explosive demand for AI compute has run headfirst into a reality we’ve ignored for decades. Our electrical grid is old, slow, and bureaucratic. You can spin up a 10,000-GPU cluster in a warehouse in months, but you might wait seven years to plug it into the grid. So what do you do? You get creative, and you get dirty. Suddenly, the jet engine isn’t just for flight; it’s a glorified, mega-expensive gas generator for a server farm. It’s a stunning admission of infrastructure failure. And it shows that for all the talk of a clean, digital future, when the chips are down (literally), the industry will burn whatever it can to keep the matrix online.
The Real Cost of Instant Power
But let’s talk about the trade-offs, because they’re massive. First, the economics are wild. Analysts modeled a small gas plant for a Meta data center in Ohio and estimated power costs at $175 per megawatt-hour, about double what industrial users normally pay. So the “AI revolution” is getting funded, in part, by radically more expensive energy inputs. Where does that cost eventually land? Probably on the bills for the AI services themselves. Then there’s the efficiency problem. These small, simple-cycle on-site units are typically far less efficient than the large combined-cycle plants that anchor the grid, and they emit carbon that renewables never would. So we’re burning more fuel to generate the same amount of useful electricity. It’s a step backwards, environmentally and technically.
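The efficiency gap can be made concrete with some back-of-the-envelope arithmetic. The sketch below is illustrative only: the heat rates and gas price are typical industry ballpark figures I’m assuming, not numbers from the analysts’ model of the Ohio plant.

```python
# Back-of-the-envelope fuel cost per MWh for on-site vs. grid gas generation.
# All inputs are illustrative assumptions, not figures from the cited analysis.

GAS_PRICE = 4.0  # $/MMBtu, assumed delivered natural gas price

# Heat rate = fuel energy burned per kWh of electricity (lower is better).
# Rough ballparks: simple-cycle aeroderivative ~9,500 Btu/kWh (~36% efficient),
# modern grid combined-cycle ~6,400 Btu/kWh (~53% efficient).
HEAT_RATE_AERO = 9500  # Btu/kWh
HEAT_RATE_CCGT = 6400  # Btu/kWh

def fuel_cost_per_mwh(heat_rate_btu_per_kwh: float, gas_price_per_mmbtu: float) -> float:
    """Fuel cost in $/MWh: (Btu/kWh * 1000 kWh/MWh) / (1e6 Btu/MMBtu) * $/MMBtu."""
    return heat_rate_btu_per_kwh * 1000 / 1_000_000 * gas_price_per_mmbtu

aero = fuel_cost_per_mwh(HEAT_RATE_AERO, GAS_PRICE)  # 38.0 $/MWh
ccgt = fuel_cost_per_mwh(HEAT_RATE_CCGT, GAS_PRICE)  # 25.6 $/MWh
print(f"Aeroderivative fuel cost: ${aero:.2f}/MWh")
print(f"Combined-cycle fuel cost: ${ccgt:.2f}/MWh")
print(f"Extra fuel burned on-site: {aero / ccgt - 1:.0%}")
```

Even under these friendly assumptions, the on-site unit burns roughly half again as much gas per megawatt-hour. And note that fuel is only one slice of that $175/MWh estimate; capital costs, maintenance, and the premium for expedited delivery make up the rest.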
And speaking of environmental costs, regulators are already bending the rules. In Northern Virginia’s “Data Center Alley,” officials are considering letting diesel generators run for longer periods. The EPA is nodding along, suggesting they could help “stabilize” the grid. It’s a pragmatic, if depressing, pivot. When you need power now, climate goals become a tomorrow problem. This is where the physical reality of computing crashes into the ether of AI hype. Every ChatGPT query, every video generation, needs real joules from somewhere. Right now, that “somewhere” is increasingly a fossil fuel burner sitting in a parking lot.
A Stopgap or a New Normal?
The big question is whether this is a temporary bridge or a permanent fixture. The companies selling these turbines certainly hope it’s long-term. Boom Supersonic’s CEO basically admitted the data center business is now helping finance its actual airplane program. That’s a telling detail. When your side-hustle becomes a primary revenue stream, you have a vested interest in the problem not getting fixed. So, will this rush to build on-site fossil generation actually disincentivize the harder work of upgrading the broader grid? It might. If you can solve your own problem expensively, why lobby for a public, cheaper solution? The risk is we end up with a two-tiered system: a cleaner grid for everyone else, and a patchwork of smoky, expensive private power plants for the AI industry. Not exactly the future we were promised.
