In the Age of AI, We Need to Be More Like Individuals
Argument
Huxiu (虎嗅) reportedly published an essay arguing that rapid advances in generative AI are thinning out what makes people feel real: judgment, sincerity and the willingness to bear consequences. Machines can now write, summarize and mimic empathy with unsettling fluency. But the piece warns that when people outsource not only tasks but also first-order thinking and expression to algorithms, they risk losing the habits that produce independent judgment and authentic relationships. Who will take responsibility when decisions go wrong? Machines cannot.
Why it matters
The argument matters beyond cultural critique. In a world where firms deploy AI to scale communication, sales and customer service, polished outputs may increasingly replace messy human contact, and that can change incentives inside organisations. Observers note that this trend is unfolding while geopolitical pressures, from US export controls on high-end chips to China’s push for technological self-reliance, accelerate both the rollout of and reliance on AI tooling across industries. Reportedly, policy and market forces together will shape not just which technologies win, but which skills remain scarce: not raw cleverness, but sincerity, accountability and deep contextual judgment.
Takeaway
The Huxiu piece is a reminder to practitioners and managers: quality will not be measured only by how “like a person” a machine sounds, but by whether humans retain the capacity to care, to weigh trade-offs, and to accept consequences. Companies and individuals should treat AI as an amplifier of, not a substitute for, human judgment and relationship-building. In the AI age, the rarest asset may be the pure, unautomated parts of being human.
