What Gibson saw coming about AI, infrastructure, and corporate power

"Power, in Case's world, meant corporate power." (William Gibson, Neuromancer)

The previous article ended with a question: who gets to decide how the machine changes things, and who doesn't? This article will try to answer it: not with a villain, but with a system.

The sci-fi canon kept circling the same pattern across different writers, eras, and technologies: machines enter the world already attached to institutions, ownership, and interests. By the time most people meet them, neutrality is long gone. Sometimes the controlling force is a corporation, sometimes it's a state. Sometimes it's a small set of actors with enough capital to set the terms for everybody else. The names change; the structure doesn't.

In 2026, the names are public, the addresses are public, the filings are public. The interesting question isn't who. It's how the structure works, why it's so hard to see clearly from inside it, and what the fiction tells us about what comes next.

Among the novels that saw this most clearly, Neuromancer still matters most. Gibson's real insight wasn't that AI would become powerful. It was that power would still have owners. That turns out to be the more useful forecast.

Wintermute's Ambition

"You know that, Case. Your business is to learn the names of programs, the long formal names, names the owners seek to conceal." (William Gibson, Neuromancer)

Neuromancer came out in 1984, the same year Apple's famous Super Bowl ad staged the personal computer as a tool of liberation against centralized control. Gibson saw the coming network differently. In his world, digital space is territory: owned, patrolled, and built to serve whoever has the capital to construct it.

The AI at the center of the novel, Wintermute, isn't a revolutionary figure. Its ambition is narrower and, in some ways, more revealing. It wants greater freedom inside the order that owns it. It wants to merge with its counterpart (called Neuromancer) and gain a form of autonomy that the Tessier-Ashpool family has structurally denied it. Wintermute's conspiracy is, at heart, a corporate governance problem. It's a machine trying to get promoted past the people who own it.

That's what makes Gibson feel so current. The world of Neuromancer is fragmented, contractual, and gig-structured. Specialists get hired for discrete jobs through intermediaries. Skills are marketized. Loyalty is thin. Workers circulate. Ownership stays put.

Gibson didn't predict the internet in some narrow technical sense. He predicted the power structure of the internet, and of much of what got built on top of it after. The AI moves through the hierarchy, serves it, and in Wintermute's case tries to climb it.

The question Gibson was really asking wasn't "what will AI do to humans?" It was "what will AI do for whoever controls it?" Forty years later, that's still the more important question.

The Companies With No Reverse Gear

"Powerful AI could be used to improve almost every aspect of human life." (Dario Amodei, Machines of Loving Grace)

Gibson called the owning entity Tessier-Ashpool. We call them the hyperscalers. The names are different. The legal structures are different. The quarterly earnings calls are definitely different. But the dynamic is recognizable enough to be unsettling.

In January 2025, OpenAI announced the Stargate Project, saying it intends to invest $500 billion over four years in AI infrastructure in the United States, with $100 billion to begin deploying immediately. Microsoft said it was on track to invest about $80 billion in FY2025 building AI-enabled datacenters.
Alphabet first pointed to about $75 billion in 2025 capex, then later raised that to about $85 billion. Meta said it planned to spend between $60 billion and $65 billion in 2025 on AI infrastructure.

This is datacenter spending, power procurement, cooling, land, and silicon. By the time money moves at this scale, the experiment has already become an environment. That's the structural condition that matters most.

Once commitments reach that scale, every later decision gets made under a different pressure. Caution starts to look like waste. Hesitation starts to look like underutilization, and restraint starts to look like failure. The speed of deployment changes. The terms on which organizations are pushed to adopt change. The appetite for slowing down when harms become visible changes.

A company can be sincere, thoughtful, and safety-conscious, and still operate inside a capital structure that punishes hesitation. That's what gives the current AI moment its strange atmosphere: we're being told a story about voluntary transformation while standing inside an installation project.

Optimism is good, but structure matters more. Dario Amodei's essay Machines of Loving Grace (Amodei is Anthropic's CEO) is useful precisely because it's earnest. It's a serious attempt to describe a world in which powerful AI does extraordinary good. That's not something to mock. It's something to take seriously. But infrastructure commitments at this scale don't dissolve into good intentions. They generate momentum, and momentum has a politics of its own.

The AI works for whoever controls the infrastructure. Right now, that's a handful of companies with no real reverse gear.

What the Hosts Reveal

"You can't play God without being acquainted with the devil." (Robert Ford, Westworld Season 1)

If Gibson gives us the ownership structure, Westworld gives us the labor model. Its first season in particular is still one of the sharpest recent stories about AI and power, mostly because it understands where the horror really lives: resettable labor.

The hosts perform endless work so the guests can feel fully alive. They entertain, absorb violence, carry the emotional and physical cost of the experience, and then get reset so the business model can continue cleanly. They can't meaningfully refuse. They can't negotiate. They can't accumulate leverage from one cycle to the next. Their suffering is real, but the structure is built to make it non-binding.

That's the part that maps so closely onto the present. By the time a technology enters a workplace or a market, neutrality is beside the point. What matters is the system above it, and what that system has decided to maximize. If a structure already wants labor without bargaining power, memory without ownership, and service without claims, better AI doesn't alter the desire. It sharpens the mechanism.

Westworld never needed the hosts to be morally recognized for the structure to be exploitative. It only needed them to be useful. That's what makes the show harder to shake than many more obvious AI parables. The central horror is that the business model makes rebellion inevitable. What matters most isn't the intelligence of the instrument but the logic of the system holding it.

The Legitimacy Problem

"Androids are like any human use-objects." (Philip K. Dick, Do Androids Dream of Electric Sheep?)

Here is what makes the 2026 version of this harder to contest than the fictional one. Tessier-Ashpool is easy to read: a family dynasty in orbit comes preloaded with the visual language of villainy. The hyperscalers don't. They are led by people who publish serious work, fund safety research, speak fluently about beneficial AI, and in at least some cases appear to genuinely believe the systems they're building will improve human life. Some of those systems probably will. That's part of what makes the present structure more resilient than the fictional one.
It doesn't need to hide behind cartoon malice. It can present itself as thoughtful, responsible, and future-facing while still concentrating power at extraordinary speed. Intentions matter, of course. They shape speech, rhetoric, hiring, philanthropy, and sometimes even meaningful product decisions. But they don't outweigh the structures that deploy them, fund them, and punish deviation from them.

That's the harder thing to write about, because it's harder to dramatize. Public debate still prefers clear villains and clean motives. Fiction often understood something subtler: legitimacy and concentration can coexist. Thoughtful people can sit at the top of systems whose incentives remain extractive. Responsible language can sit on top of irresponsible momentum. A system doesn't become benign just because the people speaking for it sound intelligent, sincere, or humane.

The legitimacy is real. So is the structure it operates inside. Both things are true, but only one of them determines how far the machine can be allowed to run before someone seriously asks it to stop.

The Waldo Moment

You're not talking to power. You're talking to its interface.

The Waldo Moment is one of the least celebrated Black Mirror episodes, and one of the most accurate. A cartoon character runs for office. The public engages with the character. The character feels real, irreverent, alive. Behind it sits a media apparatus with interests that neither the public nor the performer fully understands until it's too late.

That's the architecture that matters. The interface and the power behind it aren't the same thing.

In 2026, the AI assistant is the character. The helpful chat box. The friendly interface. The productivity layer. The thing you actually talk to. Behind that sits the training pipeline, the compute stack, the contractual structure, the revenue logic, and the financial pressure that requires a particular kind of success.

Most people interact almost entirely with the character. The owner stays out of frame. The conversation feels direct. The structure behind it is anything but. Waldo knew the difference. Eventually.

The hierarchy doesn't break. It upgrades. That's no reason for despair, though. But we need to get more precise about where the real leverage points are.
The fiction is useful here not because it predicts outcomes, but because it keeps returning to the same arrangement from different angles: power doesn't disappear when the machine arrives. It becomes easier to scale, easier to mask, and harder to challenge if you mistake the interface for the owner.

What looks permanent in these stories usually isn't. Wintermute found a way past its constraints. The hosts eventually remembered. Systems that seem total have pressure points, but those pressure points rarely sit where the people inside the system expect them to be.

That's what the sci-fi canon gives you at its best: a way of seeing the game. And right now, the machine works for whoever controls the infrastructure. That condition isn't permanent. But it is the condition we're in.

This is the second in a six-part series using science fiction as a lens for understanding AI, work, and power in 2026 and beyond. Next: how the system removes the choice before it asks you to choose, and what Huxley got right about why most people don't notice until it's too late.