It’s been three years since ChatGPT kicked off a frantic sprint to build the physical infrastructure for a new species of intelligent machines.
Seeing the race as zero-sum, the top tech companies have collectively teed up more than a trillion dollars in capital expenditures to expand computing power. Utilities, sensing both threat and opportunity, are planning to spend another trillion dollars to upgrade the grid over the next three decades, much of it to serve the hyperscale electricity appetite of AI data centers.
The narrative of the last few years, driven by figures like OpenAI’s Sam Altman, was intoxicatingly simple: Large language models are the on-ramp to artificial general intelligence, and the only bottleneck is scale. Just feed the machine more data, more chips, and more electricity, and it gets smarter.
This logic triggered an unprecedented construction boom, and cemented the belief that we were entering a decade-long electricity supercycle.
When we launched the first Transition-AI conference in 2023, a few months after ChatGPT blew up, the questions were big and optimistic: How could these tools help operators manage a more complex grid? Could AI speed up clean energy deployment?
At this year’s installment, the tone was very different. The conversation shifted from the solutions AI might unlock to the fundamentals of the market: financing risk, project sequencing, and the urgency of grid planning.
And as 2025 ends, another massive vibe shift is underway. As Latitude CEO Scott Clavenna detailed this week in the AI-Energy Nexus newsletter, something feels off about the current moment. There’s fresh skepticism over AI’s potential, a growing resistance to data centers, and a sense that we’re headed into precarious financial territory.
And with the energy and digital economies increasingly intertwined, the evolving market is raising important questions about risk and the long-term outlook for electricity demand.
The limits of scale?
First, the macro question: What happens if the AI bubble stops expanding?
The U.S. economy now leans heavily on AI-driven capital spending. There’s a push to make as many big deals as possible while the capital is flowing. But the circular deals between hyperscalers and developers, increasingly baroque financial instruments, and the exposure of data centers in pension funds and REITs are starting to worry analysts. If AI revenues don’t catch up to AI capex, the whole ecosystem gets wobbly. And that is not a good thing for long-term energy investments.
Then there’s the technical side of the vibe shift.
Gary Marcus, an AI skeptic who has spent years warning about the limits of LLMs, argues that the core flaws of ChatGPT — brittle reasoning, hallucinations, the absence of real understanding — haven’t meaningfully improved, despite nearly a trillion dollars poured into training runs and compute. “Thousands of people (literally) have tried to tell me that scaling would solve all these concerns — but it hasn’t. Not even close,” he wrote in his newsletter this week.
For more on the growing tension at the AI-energy nexus, listen to the latest episode of the Open Circuit podcast.
Three years after promising a rapid march to AGI, Sam Altman has softened his tone and repositioned ChatGPT as more of an engagement product than an emergent mind. The pivot raises an uncomfortable question: If scale alone doesn’t get us to this promised intelligence, what happens to the energy and infrastructure projections built on that assumption?
And then there’s Ilya Sutskever, OpenAI’s co-founder and former chief scientist, who recently told the podcaster Dwarkesh Patel that the scaling era may be ending, because the returns on compute infrastructure for LLMs are flattening. Sutskever believes the next breakthroughs won’t come from supersizing models; they will come from new algorithms, new training methods, and more efficient use of the data centers we already have.
“From 2020 to 2025, it was the age of scaling,” Sutskever said. “But now the scale is so big… is the belief really that if you had 100x more, everything would be so different? I don’t think that’s true. So it’s back to the stage of research again, just with big computers.”
This matters because expectations about future electricity demand are built on the belief that data center scaling will continue on its current path. If the returns are flattening, though, the long-term curve of load growth becomes a little harder to read.
Growing demand, uncertainty, and risk
Of course, for now the demand is very real.
Even if AI turns out to be more like a platform shift — think desktop to mobile — and less like a transformative technology on the order of railroads or steam engines, it will still require a staggering amount of compute. There’s already a lot of growth baked in.
Grid Strategies’ newest estimate puts U.S. peak load growth at 166 gigawatts by 2030, four times what the firm projected just two years ago. More than half of that growth (roughly 90 GW) comes from data centers. That’s an unprecedented concentration of demand in a single customer class.
But even the forecasters are uneasy. “We think the data center number is too high,” Grid Strategies’ John Wilson told Latitude Media. The problem is not whether demand is growing; it’s how poorly the utility industry is equipped to model it. Traditional utility econometric forecasting, which assumes demand grows gradually with population, income, and normal economic activity, has a hard time capturing a world where a hyperscaler shows up and asks for 1,000 megawatts in two years.
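To see why those models break, here’s a minimal sketch with entirely hypothetical numbers: a traditional econometric forecast compounds peak load by a percent or two a year, while a single hyperscaler request lands as a step change that the trend line simply cannot anticipate.

```python
# Minimal sketch (made-up numbers): trend-based forecasting vs. a
# hyperscaler step load. Organic growth compounds slowly; one campus
# request arrives as a discontinuity the trend model never sees coming.

BASE_PEAK_MW = 10_000    # hypothetical utility peak load today
TREND_GROWTH = 0.015     # assumed 1.5%/yr growth from population and income
HYPERSCALER_MW = 1_000   # one assumed data center campus, online in year 2

for year in range(1, 6):
    trend = BASE_PEAK_MW * (1 + TREND_GROWTH) ** year
    actual = trend + (HYPERSCALER_MW if year >= 2 else 0)
    print(f"Year {year}: trend {trend:,.0f} MW, "
          f"with campus {actual:,.0f} MW, gap {actual - trend:,.0f} MW")
```

By year two, the single campus accounts for a gap larger than several years of organic growth combined — which is roughly the forecasting problem Wilson describes.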
The challenges in interconnection queues are not helping. Developers are overwhelming utilities with speculative, “phantom” load requests, making load forecasting even more difficult. Some developers are effectively staking claims before they’ve secured customers, capital, or even real business plans. This was reflected in the latest BloombergNEF numbers on data centers, which showed high attrition in the development pipeline.
“There’s so much noise in those numbers that makes policy makers and regulators very uncomfortable,” said Mike Kramer, the VP of data economy strategy at Constellation, speaking in June at the Transition-AI conference.
A signed interconnection agreement no longer guarantees that power will show up when promised. “Will-serve letters don’t mean anything anymore,” said Allison Clements, a FERC commissioner, also at Transition-AI. Utilities are overwhelmed, supply chains are constrained, and local opposition can still derail critical upgrades.
And those interconnection uncertainties spill directly into the next problem: financing. If developers, utilities, and regulators can’t trust the load signals coming through the queue, then what about investors? As Peter Nulsen, managing director at Generate Capital, explained at Transition-AI, financing these projects now means navigating load risk, contract risk, credit risk, and sequencing risk — all at the same time.
Today, many projects require energy infrastructure and data centers to be built simultaneously. That multiplies the number of entities whose timelines must stay in lockstep: the data center developer, the tenant, the energy developer, the EPC, the utility, the landowner, and the capital providers. If one of them slips, the entire project slips.
“It’s one thing to do a 500-megawatt energy project. It’s another thing to build the load at the same time you’re building the energy,” said Nulsen.
That creates a new set of financing risks. Financiers want executed tenant agreements before they invest, but hyperscalers prefer short-term, often five-year, contracts, while power assets need 20-year financing. As Nulsen explained, “Maybe I can underwrite one renewal. But two or three? Now you have to think a lot harder.”
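The arithmetic behind that quote is simple, as this illustrative sketch shows — the tenors come from the paragraph above, and everything else is assumption:

```python
# Back-of-the-envelope sketch: how much of a 20-year financing term an
# initial five-year tenant contract covers, and how many renewals a
# lender is implicitly underwriting. Numbers are illustrative only.

FINANCING_YEARS = 20  # typical tenor cited for a power asset
CONTRACT_YEARS = 5    # typical hyperscaler contract length

covered = CONTRACT_YEARS / FINANCING_YEARS
renewals = FINANCING_YEARS // CONTRACT_YEARS - 1

print(f"Initial contract covers {covered:.0%} of the financing term.")
print(f"Lenders must underwrite {renewals} renewals to reach year {FINANCING_YEARS}.")
```

A quarter of the term is contracted; the remaining three-quarters rests on renewals — the "two or three" that Nulsen says make underwriters think a lot harder.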
Beyond vibes in the AI-energy nexus
Layered on top of those risks is a more existential question: Will the customer even exist in a decade? Traditional infrastructure investors are comfortable underwriting the credit of major, long-standing companies like Walmart or regulated utilities. But in the AI era, they are suddenly being asked to take long-term positions on the financial durability of young companies like OpenAI or Anthropic.
The recent “code red” inside OpenAI underscored those risks. According to reporting in the Wall Street Journal, the company is bleeding users and fretting about losing market share to Google’s Gemini. Altman has warned of “economic headwinds,” while OpenAI’s CFO hinted at — and then walked back — the option of a government bailout. As Gary Marcus put it, OpenAI has “massively overextended” itself, and future fundraising could come with harsher terms and a lower valuation.
That’s the tension headed into the fourth year of this race. We are preparing to build hundreds of billions of dollars of long-lived energy infrastructure for an industry that is still in its earliest phase: figuring out its business models and contending with technical limits.
AI has already reshaped energy planning, supercharged investment cycles, and changed the scale of data centers. But a lot of this activity is based on vibes. And eventually, vibes collide with reality.
Which is why, when we convene Transition-AI again in San Francisco this April, we’ll stay focused on the mechanics that actually determine what gets built: the risks, the timelines, the planning tools, and the capital structures this era demands.