Huxiu (虎嗅) · 2026-03-29

AI Can Write Poems, but Why Can't It Produce "Comfort That's Just Right"?

Test and finding

Huxiu reports that a simple test, simulating a user saying "my cat died today," exposed a persistent weakness in contemporary chatbots. Multiple models were tried, from general-purpose large models to specially tuned emotional-companion bots. Some quoted philosophy. Some offered practical advice. One produced a Rainbow Bridge poem. None, the author argues, reliably produced the elusive "comfort that's just right." Why? Because sympathy is not a task you can fully reduce to metrics.

Where models fall short

The piece lays out concrete reasons: AI lacks timing, nuanced boundaries, relationship memory, and the awkward, earnest fumbling that often signals real care. Training regimes reduce comfort to labels and targets (decrease negative keywords, increase positive tokens), producing textbook replies that can feel cold. Can a model learn to be silent at the right moment? Reportedly, one model did surface a hybrid pattern offering both an outlet and a pause, hinting that non-intrusive companionship might be teachable. But the core gap remains: context and shared history are not just data points; they are living traces of relationships.
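The "labels and targets" critique can be made concrete with a toy example. The sketch below is purely illustrative and assumes nothing about any real model's training: the word lists, the scoring rule, and the sample replies are all invented here. It shows how a naive keyword metric rewards a formulaic, keyword-stuffed reply over a quieter, more attentive one.

```python
import re

# Hypothetical word lists for a naive "comfort" metric. These are invented
# for illustration, not drawn from any actual training pipeline.
NEGATIVE = {"died", "dead", "loss", "grief", "alone"}
POSITIVE = {"sorry", "here", "support", "love", "peace"}

def comfort_score(reply: str) -> int:
    """Reward positive keywords and penalize negative ones, one point each."""
    words = re.findall(r"[a-z']+", reply.lower())
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

# A textbook, keyword-stuffed reply outscores a quiet, attentive one,
# even though the latter may feel more genuinely caring.
textbook = "I am so sorry for your loss. I am here to support you with love and peace."
quiet = "That must hurt. Do you want to tell me about her?"

print(comfort_score(textbook))  # → 4
print(comfort_score(quiet))     # → 0
```

The metric is blind to timing, restraint, and relationship context, which is exactly the author's point: optimizing such a score pushes a model toward replies that read well on paper and feel cold in practice.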

Broader implications

The lesson matters beyond a thought experiment. Chinese firms and startups are racing to commercialize companion AIs even as the global AI ecosystem contends with geopolitics, from U.S.-led export controls on advanced chips to debates over cross-border data flows, that shapes how models are built and deployed. The Huxiu commentary suggests a shift for trainers: not to make AI mimic an idealized, flawless human, but to make it a clearer mirror that foregrounds what only humans can offer. Some forms of care, the author concludes, should remain human: AI can accompany, but it shouldn't pretend to replace the messy, fallible warmth that binds us.
