Half Of US Data Centers Scheduled To Start In 2026, Will Be Canceled Or Delayed
Just over two years ago, we first penned our views on "The Next AI Trade", which looked beyond the hyperscalers and the data centers supporting the AI revolution, and instead focused on the energy and logistical needs that would be so very critical in allowing the US to dominate China in the existential race to first reach Artificial General Intelligence (which many have dubbed the next nuclear arms race due to its profound civilizational implications). It was here that we defined the "Power Up America" basket as the next AI trade.
Yet as one can see in the chart below, after outperforming the AI Data Center and TMT AI baskets in 2024 and much of 2025, the Power Up America trade has lagged and clearly underperformed, as some investors have started to doubt that the US will ever be able to "grow" into its massive AI computing needs... with dire consequences for record AI capex budgets, something the market has yet to grasp.
And unfortunately, with every passing day, the outlook for the US AI revolution looks increasingly dim.
That's because, as Canaccord Genuity analyst George Gianarikas writes, "the American data center boom is hitting a formidable wall of logistical friction." He is referring to the latest outlook by Sightline Climate, which is also reinforced by recent articles from Bloomberg and others, and reveals a sobering reality for 2026: nearly half of the nation's planned 16-gigawatt capacity faces cancellation or delay, with only 5 gigawatts currently under construction.
This inertia stems from a volatile mix of local permitting hurdles, community resistance, and a desperate reliance on overextended global supply chains for critical components like transformers and helium.
That's right: half. Despite $700BN+ of expected 2026 hyperscaler capex, nearly half of the data centers scheduled to begin operations in the US in 2026 "will either face delays or outright cancellations." The data, which comes from Sightline Climate's 2026 Data Center Outlook, suggests that 30-50% of the ~16 GW of planned US capacity for the year is at risk, with only ~5 GW currently under construction!
And the horizon only grows darker in the coming years. By 2027, the gap between ambition and reality widens further, as a mere fraction of the announced 21.5 gigawatts has actually broken ground. Worse, according to Futurism, data centers slated to open in 2027 are progressing far more slowly than anticipated. "Only about 6.3 gigawatts worth of computing infrastructure are actually under construction, compared to 21.5 announced gigawatts."
And then visibility drops to virtually nothing beyond 2028 as uncertainty increases materially in the outer years. According to the article, "things get even dodgier in the coming years, with the vast majority of data centers planned for launch between 2028 and 2032 having yet to even break ground. There are a further 37 gigawatts of planned infrastructure which haven’t even received a firm completion date, only 4.5 [gigawatts] of which have actually begun work."
This trend suggests an increasingly uncertain future for the industry, where power constraints and grid instability cast long shadows over projects slated through 2032.
But while one can pretend the future is irrelevant, the same limitations are visible in the here and now: according to the Sightline report, "at least 16GW of data center capacity is slated to come online this year across 140 projects. 53% will be grid connected, 3% will be powered solely by on-site power, and 25% have not disclosed their powering strategies. We expect 30-50% of these projects to be delayed. Only 5GW is currently in construction."
And the punchline:
"We expect 30-50% of 2026 projects to be delayed, driven by power constraints (25% of projects have not disclosed powering strategies), increasingly effective community opposition, and potential grid equipment shortages. 11GW of 2026 capacity remains in the announced stage with no signs of construction, despite typical build times of 12 to 18 months. It's still possible for this capacity to come online, but it would need to dramatically accelerate."
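The arithmetic behind Sightline's warning is simple enough to check on the back of an envelope. The sketch below uses only the gigawatt figures quoted above (16 GW planned, 5 GW under construction, a 30-50% delay estimate); it is an illustration, not part of the report itself.

```python
# Figures as quoted from Sightline Climate's 2026 Data Center Outlook
planned_2026 = 16.0       # GW slated to come online in 2026
under_construction = 5.0  # GW actually being built today

# Capacity still stuck in the "announced" stage with no construction
announced_only = planned_2026 - under_construction  # 11 GW, matching the report

# Sightline's 30-50% delay estimate, applied to the planned capacity
at_risk_low = 0.30 * planned_2026   # ~4.8 GW
at_risk_high = 0.50 * planned_2026  # ~8.0 GW

print(f"Announced but not started: {announced_only:.1f} GW")
print(f"Capacity at risk of delay: {at_risk_low:.1f}-{at_risk_high:.1f} GW")
```

In other words, with typical build times of 12-18 months, the 11 GW that has yet to break ground would have to start almost immediately to have any chance of hitting a 2026 online date.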
Which brings us to the question we raised more than two years ago: how will the US modernize its ancient power grid and build out the huge energy supply needed to power the AI revolution? Here, too, it appears there has been little progress:
"On-site and hybrid power punch above their weight when measured by capacity. Grid-connected projects still lead at 40% of total capacity, but on-site generation and hybrid approaches together account for close to half of announced capacity, far exceeding their share by project count. A small number of gigascale, grid independent campuses account for this capacity, including New Era Energy & Digital's 7GW project in Lea County, Homer City's 4.5 GW coal-to-gas redevelopment in Pennsylvania, and Crusoe's 1.8GW natural gas and renewables project in Cheyenne, Wyoming. These projects are large enough to require their own generation plant, and have the capital to fund it. Waiting for the grid to supply this level of capacity could take a decade."
The problem, as Canaccord warns, is that "without a radical acceleration in domestic manufacturing and grid integration, the digital expansion of the late 2020s risks stalling into a series of unfulfilled promises."
Others agree: in a note published over the weekend, Goldman executive director Shreeti Kapa wrote that at a recent dinner with investors, the overwhelming consensus was that "there is simply not enough compute and every player is acutely compute constrained – bottlenecks from fabs to permitting for data-centers to power to memory to labor are real and are here to stay for some time to come. I wasn't sure what to make of it – if it's consensus, is it peak, or is the imagination for the scale of AI demand so great among a very small sub-segment of investors & technologists here in the valley that the rest of the world is yet to catch up?"
While imaginations may indeed be running wild, the hard limitations of the real world are starting to catch up: we recently highlighted OpenAI's decision to pause its UK Stargate project - a partnership with Nvidia and Nscale to deploy up to 31k GPUs - citing the UK's prohibitive energy costs and regulatory hurdles. The project was to be based across several sites including Cobalt Park and a dedicated "AI Growth Zone", enabling OpenAI's models to provide local compute for critical public services and highly regulated industries including finance and national security.
UK energy prices represent a key bottleneck to AI infrastructure development. According to the report, UK's industrial prices "are among the highest in the world" and have been a key gating factor delaying companies from building AI infrastructure. According to a spokesperson from OpenAI, “we continue to explore Stargate U.K. and will move forward when the right conditions such as regulation and the cost of energy enable long-term infrastructure investment.”
OpenAI and Nscale maintain plans to develop the project in the future. According to the OpenAI spokesperson, “We see huge potential for the U.K.’s AI future... London is home to our largest international research hub, and we support the Government’s ambition to be an AI leader. In the meantime, we are investing in talent and expanding our local presence, while also delivering on the commitments under our MOU with the government to adopt frontier AI in UK public services.”
Bloomberg also chimed in earlier this month, writing that "as the global AI race heats up, there is a huge rush to build data centers fast. There’s no lack of money chasing these projects, with tech giants Alphabet Inc., Amazon.com, Meta Platforms Inc. and Microsoft Corp. committed to spending more than $650 billion this year alone. Yet neither ambition nor capital is enough to materialize all the necessary components."
Here Bloomberg again quotes the Sightline data, noting that "almost half of the US data centers planned for this year are expected to be delayed or canceled" and as one big reason for the delay Bloomberg cites the shortage of electrical equipment, such as transformers, switchgear and batteries: "They are needed not just for powering AI, but also for building out the grid that is seeing increased consumption from electric cars and heat pumps. US manufacturing capacity for these devices cannot keep up with demand, and the scarcity has caused data center builders to rely on imports."
At its core, the problem is the lack of domestic manufacturing, hardly surprising for a country that has outsourced much of its industrial base to China over the past century; despite loud promises of reshoring, there are few tangible results.
Indeed, while over the past 10 years the US government has tried a series of policies to reshore manufacturing, they haven't yet yielded a significant boost to domestic capacity, forcing businesses to look to China regardless of the tariffs or the alleged national security risk. As a result, the US now finds itself in an absurd Catch-22: it needs crucial parts from China to dominate China in the AI race, while China needs advanced chips from American companies to stay in that race.
The biggest bottlenecks, understandably, have been observed in the power space - the same space we aggressively pitched two years ago as enabling the AI revolution, hoping that whoever was in charge of the US would take America's chronic energy deficiency seriously. It appears we may have been overly optimistic. One thing is clear: data centers have rapidly grown in size and now consume more electricity than their predecessors a decade ago. That demands bigger transformers, which safely pull electricity from the high-voltage grid to feed to tiny computer chips. Without the right transformers, there’s no way to make the data center work.
Before 2020, these high-power transformers typically arrived 24 to 30 months after an order was placed. Those timelines were "totally manageable in the old world" when data centers didn't need such large transformers, or on such short notice, says Philippe Piron, chief executive officer of GE Vernova's electrification division. But AI companies "want something typically in less than 18 months."
The spike in demand from data centers and grid expansion have pushed up prices and extended delivery times to as much as five years. That is why some, like Crusoe, have even resorted to refurbishing old transformers from shuttered power plants as a stopgap measure.
Meanwhile, a far greater looming problem is where the US will source the dozens of gigawatts needed to power the AI revolution. So far, Trump's promises of a nuclear renaissance have remained just that, with virtually no new nuclear power plants breaking ground, while the push for small modular reactors - a ray of hope in an otherwise dreary landscape - is still years away from practical results, let alone scale.
Oh, and there is the question of who pays for all this: by now everyone knows about the hundreds of billions in capex the hyperscalers will spend over the next few years.
What fewer people know is that this money won't be enough. According to an analysis by JPMorgan, it will take no less than $5 trillion to fund the AI cycle, and even with the massive capex - and debt outlays - the US government will still be on the hook for over a trillion to close the funding gap.
It's not just power: as Canaccord writes, beyond the power-related technicalities "lies a fraught sociopolitical reality".
Consider the following: The Maine House of Representatives approved a moratorium on large-scale data centers until 2027. This pause allows a newly formed coordination council to weigh innovation against environmental and resource stewardship. The House passed the bill 82-62, advancing it to the Senate. The goal of the bill, according to state representatives, is not to fight innovation, but as a pause for planning to improve stewardship of the state's resources and limit financial and environmental impacts on the state's citizens. In addition to the moratorium, "the bill also creates the Maine Data Center Coordination Council, and instructs the council to provide strategic input, facilitate planning considerations and evaluate policy tools to address data center opportunities."
Simultaneously, OpenAI faces mounting scrutiny as Florida’s Attorney General launched an investigation into the company following the release of safety-critical chat logs. And then there was last week's firebomb attack on Sam Altman's home: while the police are still investigating, and there are many reasons why someone may want to express their "displeasure" with the man behind ChatGPT, the reality is that, as we warned last August, "between exploding electricity bills and lack of jobs for grads, a new luddite revolution is coming - they will be burning down data centers within a year."
between exploding electricity bills and lack of jobs for grads, a new luddite revolution is coming - they will be burning down data centers within a year
— zerohedge (@zerohedge) August 25, 2025
Sure enough, these institutional shifts arrive as a recent Quinnipiac University poll - which looked at AI use and its impacts on daily life, education and healthcare - confirmed the public is growing increasingly wary of AI's deepening integration into healthcare, education, and daily life, with the findings showing just how rapidly public sentiment has turned against AI.
The bottom line is that the time for talk has long passed, and yet for all the posturing, the US government continues to act as if a victory against China in the AI race is a given. It is anything but, especially with America's own society rapidly turning against the next industrial revolution.
As Canaccord concludes, "Not only are the energy constraints mounting, but so are the sociopolitical ones. Something's got to give."
Tyler Durden
Sun, 04/12/2026 - 22:38