Why a Chinese scholar's predictions about Iran went viral: The war bill behind the AI bubble
Viral forecast, uncomfortable logic
Jiang Xueqin (江学勤), a Canadian scholar of Chinese origin, made three stark predictions in a 2024 video that has recently circulated widely across global social media: Donald Trump would win the election, the United States would go to war with Iran, and the U.S. would lose that war. It has been reported that his clip and a longer appearance on the Breaking Points show drew millions of views within days. The first two outcomes have already occurred; the third remains a live, incendiary claim, and not because Jiang is primarily a military analyst, but because his argument ties battlefield outcomes to the financial arteries of AI.
The money, the servers, the leverage
Jiang’s central point is structural: the current U.S. AI boom is heavily financed by, and physically anchored in, the Gulf. It has been reported that Amazon Web Services, Microsoft Azure and Google Cloud have all expanded major data‑center projects across the UAE, Saudi Arabia, Qatar and Bahrain, and that Gulf sovereign funds, notably Saudi Arabia’s PIF and Abu Dhabi’s ADIA and Mubadala, have poured large sums into U.S. tech and chip companies. Why does this matter? Training large models requires vast, cheap power and capital. The oil‑dollar cycle, born of a 1970s U.S.–Saudi security‑for‑petrodollars compact, recycles Gulf liquidity into Wall Street and Silicon Valley. Reportedly, some Gulf investment commitments in AI and infrastructure reached the hundreds of billions of dollars in 2024–25. And it has been reported that at least one Amazon data center in the UAE suffered an attack, underscoring that compute infrastructure can become a target as easily as oil tankers or refineries.
What happens if the Gulf stops pumping?
Jiang sketches a domino scenario: Iranian asymmetric strikes and proxy attacks make Gulf energy and data hubs unsafe; disruption at oil chokepoints spikes prices; sovereign funds pause overseas commitments and repatriate capital; Gulf data centers shutter or become too costly to run. The result is a twin shock to U.S. AI: a squeeze on both compute capacity and the steady capital flows that underwrite high valuations and sprawling infrastructure. His thesis also includes contentious, reportedly sourced claims that political donations and private ties have influenced wartime decisions; these should be treated cautiously, but they add a domestic‑political dimension: war can reconfigure both foreign policy and domestic emergency powers. Is the AI boom a technological inevitability, or a financial edifice propped up by a fragile geopolitical bargain? For Western users and investors, Jiang’s viral argument is a reminder that software and services rest on ships, pipelines, servers and sovereign balance sheets, and that geostrategic shocks can quickly turn an innovation story into a systemic risk.
