凤凰科技 2026-04-08

Dell warns of a 625× surge in AI accelerator memory demand by 2028 — supply shortfall likely to persist

Big projection, simple drivers

Dell Technologies CEO Michael Dell has publicly warned that total memory demand for AI accelerators could rise 625-fold between 2023 and 2028. The arithmetic rests on two factors: per-accelerator memory capacity is expected to climb from the 80 GB of NVIDIA's 2023-era H100-class cards to roughly 2 TB, a 25× jump, while the number of accelerators deployed in data centers worldwide is forecast to grow another 25×. Multiplying the two factors yields the 625× figure Dell cites for the coming five years.
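The multiplication behind the headline number can be sketched in a few lines. The figures below are the ones cited above (80 GB per card in 2023, roughly 2 TB per card and a 25× larger fleet by 2028); they are Dell's projections, not independent estimates.

```python
# Sketch of the arithmetic behind the projected 625x memory-demand growth.
# All inputs are the article's cited figures, treated here as assumptions.

GB_PER_ACCELERATOR_2023 = 80     # NVIDIA H100-class card, 2023
GB_PER_ACCELERATOR_2028 = 2000   # ~2 TB projected per accelerator
FLEET_GROWTH = 25                # projected 25x growth in deployed accelerators

per_device_growth = GB_PER_ACCELERATOR_2028 / GB_PER_ACCELERATOR_2023
total_growth = per_device_growth * FLEET_GROWTH

print(f"Per-device memory growth: {per_device_growth:.0f}x")  # 25x
print(f"Total memory demand growth: {total_growth:.0f}x")     # 625x
```

The point of the exercise is that the 625× figure is not an independent forecast: it is simply the product of two separate 25× projections, so any error in either input compounds directly.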

Tight supply, long lead times

This surge collides with a fragile supply side. The global memory industry was reportedly at a cyclical low in 2023: the three major DRAM manufacturers cut or delayed expansion after posting losses, leaving thin capacity buffers. New DRAM fabs typically take about four years from planning to volume production, and front-end process expansion is proceeding cautiously. The reported industry consensus is that the supply shortfall could persist through 2028, with no quick fix on the horizon.

Demand beyond hyperscalers — and geopolitical friction

Why the rush for memory? Governments and enterprises are pouring money into "sovereign AI" stacks and large language model infrastructure — many of the world's top 25 economies are reportedly pursuing national AI projects — while hyperscalers and enterprises keep raising AI capital expenditure to boost productivity. That demand surge comes amid geopolitical headwinds: export controls, trade frictions and the geographic concentration of advanced semiconductor capacity (notably in South Korea and Taiwan) complicate capacity planning and technology transfers. Will policy and investment keep pace with hardware realities? Not without time and significant capital.

What this means for cloud users and governments

Higher prices, procurement delays and tougher competition for AI memory are likely near‑term outcomes. For China and other countries pushing sovereign AI, the message is stark: local capacity ambitions will help strategic autonomy but cannot be realized overnight. The industry faces a multi‑year mismatch between explosive memory needs and the slow, capital‑intensive rhythm of DRAM expansion.
