IT之家 2026-03-25

National Supercomputing Internet single‑user free token (词元) quota reportedly raised to 30 million

What changed

It has been reported that the National Supercomputing Internet (国家超级算力互联网) has increased its single‑user free token (词元) quota to 30 million. The move, reported by IT之家 (ithome), appears aimed at giving developers, researchers and smaller companies greater access to large‑language‑model inference and other AI workloads without immediate cost barriers. In Chinese tech discussions, 词元 ("tokens") refers to the subword units that models process; more tokens mean longer prompts, larger experiments and more scope for practical application testing.
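The subword idea can be illustrated with a toy greedy longest‑match tokenizer. The vocabulary below is invented for illustration only; real tokenizers (BPE, unigram) learn their subword vocabularies from data, and the report does not specify which tokenizer the platform meters against:

```python
# Toy greedy longest-match subword tokenizer.
# VOCAB is a hypothetical vocabulary chosen for this example;
# production tokenizers learn merges/pieces from a training corpus.
VOCAB = {"super", "comput", "ing", "token", "s", "net", "work", " "}

def tokenize(text: str) -> list[str]:
    tokens, i = [], 0
    while i < len(text):
        # Try the longest substring starting at i that is in the vocabulary.
        for length in range(len(text) - i, 0, -1):
            piece = text[i:i + length]
            if piece in VOCAB:
                tokens.append(piece)
                i += length
                break
        else:
            # Unknown character: fall back to a single-character token.
            tokens.append(text[i])
            i += 1
    return tokens

print(tokenize("supercomputing tokens"))
# → ['super', 'comput', 'ing', ' ', 'token', 's']
```

Note that one English word often becomes several tokens, which is why quotas are denominated in tokens rather than words.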

Why it matters

Why does a token quota matter? Because tokens are the direct meter of how much a model can be used. A 30‑million free‑token allowance substantially expands what an individual researcher or startup can try: fine‑tuning experiments, extended chat logs, batch inference for prototypes. For Western readers unfamiliar with China’s compute landscape, the National Supercomputing Internet functions as a centralized, state‑backed pool of high‑end compute resources intended to broaden access beyond the biggest cloud incumbents.
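For a rough sense of scale, a common heuristic for English text is about 4 characters (or roughly 0.75 words) per token; the ratio for Chinese text differs and depends on the tokenizer. A minimal back‑of‑the‑envelope sketch, with those ratios as stated assumptions:

```python
# Back-of-the-envelope estimate of what a 30-million-token quota covers.
# The ratios below are rough English-text heuristics, not properties of
# any specific tokenizer or of the National Supercomputing Internet platform.
QUOTA_TOKENS = 30_000_000
CHARS_PER_TOKEN = 4.0    # assumed average for English text
WORDS_PER_TOKEN = 0.75   # assumed average for English text

approx_chars = QUOTA_TOKENS * CHARS_PER_TOKEN   # ~120 million characters
approx_words = QUOTA_TOKENS * WORDS_PER_TOKEN   # ~22.5 million words

print(f"~{approx_chars / 1e6:.0f} million characters")
print(f"~{approx_words / 1e6:.1f} million words")
```

Under those assumptions the quota corresponds to tens of millions of words of prompt and completion text, which is why it is meaningful for fine‑tuning runs and batch inference, not just chat.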

Geopolitical and industry context

The upgrade comes as China accelerates domestic AI capabilities amid Western export controls and sanctions on advanced chips. The reported increase in free access to national compute resources can be read as both industrial policy and a practical response to constrained supply chains: more domestic compute circulation helps firms and labs iterate without relying on foreign infrastructure. It also matters commercially: cloud providers and model vendors in China will watch closely as free national quotas reshape early‑stage development and competitive dynamics.

Uncertainties and next steps

Details remain thin. It has been reported that the new quota applies on a per‑user basis, but eligibility rules, duration and how usage is measured were not described in the report. Developers and enterprises should expect further announcements from platform operators. In the meantime, 30 million tokens, if sustained, could materially lower the cost of experimentation for China's AI ecosystem.
