Interesting Economic Phenomena Amid Rising Memory Prices: What Is AI Pushing Out?
The short answer: AI is gobbling memory, and everyone else is paying
A sharp, AI-driven surge in DRAM prices has reshuffled markets across Asia. The memory rally has been most visible in South Korea — where Samsung and SK Hynix dominate — helping the KOSPI surge while other markets lag behind. Why the squeeze? Training and serving large language models require vast amounts of DRAM, especially high-bandwidth memory (HBM). It has been reported that ChatGPT-class services now serve hundreds of millions of users, and that extraordinary demand for live inference and model state is translating directly into elevated memory consumption.
Who gets kicked out — the "crowding‑out" of consumer devices
Economists would call this a crowding-out effect. Big cloud and AI firms treat DRAM and HBM as strategic inputs and will pay higher prices to secure capacity. Consumer-facing makers — think Nintendo, Xiaomi, and PC and phone vendors — face price-elastic customer demand and cannot simply pass on costs, so they get squeezed. It has been reported that MediaTek is preparing for weaker smartphone chip shipments in 2026, and gaming hardware such as the next Switch has reportedly been hit by parts shortages. The upshot: ordinary buyers are effectively bearing an “AI tax” through higher prices and scarcer inventory — a textbook case of tax incidence.
Why HBM changes the calculus — and why China matters
HBM is not just a pricier SKU; it consumes far more wafer and advanced-packaging capacity per useful bit than commodity DRAM, so vendors divert production toward higher-margin HBM for AI customers. That reallocation tightens supply of standard memory and pushes spot prices even higher. China is an important wild card. Domestic supplier CXMT (Changxin Memory Technologies) has reportedly been pricing aggressively, undercutting global rates, and Beijing’s push for on-shore capacity — amid U.S. export controls and broader tech geopolitics — could alter the balance over time. But for now, South Korean and U.S. suppliers still account for the lion’s share of global DRAM output.
Bigger picture: a longer, AI‑stretched memory cycle
This episode may mark a lengthened memory supercycle. Where past DRAM booms typically peaked within roughly 15–18 months, analysts now warn the cycle could extend toward 2027 as AI demand persists and new fabs take years to come online. The ripple effects are broad: from electricity grids strained by data centres, to commodity suppliers and industrial manufacturers seeing renewed orders. So is AI merely a software revolution? The market’s answer is no — it is remaking hardware priorities, supply chains, and who ultimately pays the bill.
