‘GEO’ Services Are Flooding the Chinese Internet With Misinformation
Chinese companies are reportedly using a new tactic, generative engine optimization (GEO), to manipulate AI-driven feeds and search rankings and to push promotional content disguised as organic information. The reported result is a flood of AI-generated or AI-optimized material across domestic online spaces that amplifies commercial messages and distorts what ordinary users see. Who decides what counts as "organic" when models and ranking systems can be gamed at scale?
What is GEO?
Generative engine optimization is essentially SEO for the age of large language models and recommendation algorithms: firms craft prompts, content networks, and synthetic accounts to steer model outputs and platform rankings toward desired outcomes. These tactics target everything from domestic search engines to social apps, including Baidu (百度), WeChat (微信), and Douyin (抖音), and can take the form of tailored prompts, coordinated posting, or content farms designed to teach models to prioritize certain claims or products. Some operators reportedly combine automated generation with human moderation to keep outputs plausible and to evade platform penalties.
Why it matters
The surge in GEO activity arrives amid intensifying tech competition and regulatory scrutiny. Against the backdrop of export controls, sanctions, and a push for domestic AI capacity, Chinese firms are under commercial pressure to monetize attention quickly, even if that risks eroding information quality. The phenomenon raises questions for users, platforms, and regulators: how to detect coordinated GEO campaigns? What tools can platforms deploy without stifling innovation? And could these tactics, if left unchecked, alter public discourse beyond China's borders? Chinese regulators and some platforms are reportedly beginning to explore detection and deterrence measures, but the arms race between generative tools and moderation systems appears likely to continue.
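The article does not describe how detection might work, but one common starting point for spotting coordinated posting is near-duplicate text analysis: many GEO-style campaigns recycle the same promotional copy with minor rewording across accounts. The sketch below is illustrative only; the sample posts, function names, and similarity threshold are assumptions, not details from any reported detection system.

```python
# Illustrative sketch: flag near-duplicate posts across accounts using
# word-shingle Jaccard similarity. Sample data and threshold are hypothetical.

def shingles(text, k=3):
    """Return the set of k-word shingles (overlapping word n-grams) in a text."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity between two shingle sets (0.0 = disjoint, 1.0 = identical)."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

def flag_coordinated(posts, threshold=0.6):
    """Return index pairs of posts whose text overlap exceeds the threshold."""
    sets = [shingles(p) for p in posts]
    pairs = []
    for i in range(len(sets)):
        for j in range(i + 1, len(sets)):
            if jaccard(sets[i], sets[j]) >= threshold:
                pairs.append((i, j))
    return pairs

# Hypothetical example: two lightly reworded promotional posts and one unrelated post.
posts = [
    "Brand X water filter removes 99 percent of impurities say experts",
    "Experts say Brand X water filter removes 99 percent of impurities",
    "I repotted my plants this weekend and the weather was lovely",
]
print(flag_coordinated(posts))  # → [(0, 1)]
```

Real moderation systems would combine signals like this with account metadata and posting-time patterns, since text similarity alone is easy to evade with heavier paraphrasing.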
