Now it's science students' turn to be anxious
Nvidia CEO flips the script
It has been reported that Nvidia (英伟达) CEO Jensen Huang (黄仁勋) told a podcast audience something striking: the most valuable future skill may not be writing code, but writing language — even suggesting that English is becoming a new kind of programming language. The claim is short and provocative, and it reverses a two‑year narrative that placed humanities graduates at the front line of AI displacement. Why does this matter? Because the person saying it sits at the top of the AI infrastructure stack and sees how models, tokens and GPUs are actually used.
From execution to definition
The shift Huang describes is not about rescuing liberal‑arts majors; it's about a deeper production change. In the large‑model era, machines can execute many tasks — generate text, translate, draft code — but they execute what they are told. The bottleneck moves from “Can you code?” to “Can you specify what you want?” Clear specification, structured goals and persuasive expression become the levers that guide AI. Prompt engineering, in this framing, is less a craft of tricks and more the externalization of thinking: good prompts reveal clear thought, and that is closer to classical humanities training than to pure engineering.
Talent market and geopolitical context
That reweighting is already visible in hiring and career paths. It has been reported that top AI firms are recruiting product managers and communicators with strong language skills; examples cited include Anthropic president Daniela Amodei, who has a humanities background, and Lin Junyang (林俊旸), who has risen within Chinese AI teams such as Alibaba's (阿里巴巴) Tongyi Qianwen project. At the same time, this debate sits against geopolitical headwinds: Nvidia GPUs and other high‑end chips are central to AI development, and export controls and US‑China tensions complicate access and strategy. So technical scale, token budgets and supply constraints interact with the new premium on specification skills — firms that can define and orchestrate AI effectively will extract disproportionate value.
What students and employers should take from this
The takeaway is stark: AI will compress the value of shallow, routinized tasks and amplify whoever can turn fuzzy problems into crisp directives. Science and engineering students are not doomed — but they should ask themselves whether they can articulate why something matters, not just how to build it. For employers, the lesson is similar: the future split may not be between arts and sciences, but between executors and definers. Who commands the AI army? The one who can say, distinctly, what it should do.
