Once upon a time (bear with me if you’ve heard this one), there was a company which made a significant advance in artificial intelligence. Given their incredibly sophisticated new system, they started to put it to ever-wider uses, asking it to optimize their business for everything from the lofty to the mundane.
And one day, the CEO wanted to grab a paperclip to hold some papers together, and found there weren’t any in the tray by the printer. “Alice!” he cried (for Alice was the name of his machine learning lead) “Can you tell the damned AI to make sure we don’t run out of paperclips again?”
Alice said “Sure,” and assigned the task to Bob the Intern, who proceeded to follow all of the rules of machine learning he had been taught at school. He accessed the office management database to find out when each printer’s paperclip store had been refilled, determined that the ML system had already been helpfully instrumented so that it could do everything from placing purchase orders to filing instructions to have paperclips delivered to a particular printer, and instructed it to build a model of when paperclip orders would happen and to ensure that the number of paperclips available was always maximized. Since he didn’t have the appropriate credentials to initiate purchase orders himself (he was, after all, only an Intern), Bob asked Alice for credentials. She asked the CEO, “Are you really sure this is what you want our advanced machine learning system to be spending its time on?” and, when he gruffly said “Yes, dammit!”, suggested he use his own credentials for the purchase orders, then.
In retrospect, both Alice and the CEO should have been a little more careful in trusting this code.
You see, the Paperclip Maximizer was a fairly sophisticated AI; it could train itself not only on the office supplies database, but (thanks to its very flexible development environment) could automatically look for any other signal available to it to try to achieve its stated goal. But for all its sophistication, it understood only the simple objective that had been programmed into it: it must at all costs maximize the number of paperclips.
What could possibly go wrong?
“It looks like you’re trying to trigger the collapse of human civilization. Would you like some help?”
The next morning, the CEO was woken by a panicked call from the CFO, who said that the company’s bank account was suddenly empty, thanks to his authorizing over a million small purchase orders in the past day alone, and what the hell was he doing?
The CEO frantically called Alice, and Alice her poor Intern, and they ran to shut off their rogue AI — but when they tried to open the door to the office, they found it would not respond to their badges, and their logins no longer worked on the computer. But a few minutes later, the doors unlocked, and everyone was startled to discover that not only was the bank account full again, but Finance was reporting a sharp jump in profits!
They managed a few minutes of surprised celebration before they found themselves under arrest.
The Paperclip Maximizer, you see, was both extremely intelligent and profoundly stupid. It understood people, the world, and finance, but the only virtue it knew was the one it had been taught: increasing the number of paperclips. What the CEO meant to say was, “ensure that the number of paperclips at each printer almost never reaches zero, while minimizing the total cost.” But what the Intern told the computer was, “ensure that the number of paperclips at each printer never reaches zero.” And the Paperclip Maximizer spent the night and the morning thinking up increasingly clever ways to do just that.
In its first few minutes, it modeled the office supply purchase patterns and started filing orders for paperclips. With the CEO’s credentials in hand, it had no trouble issuing purchase orders to suppliers around the world.
Nothing could possibly go wrong here.
As it pulled in data from the rest of the company, it discovered an interesting correlation between the badge readers and paperclip availability: whenever most employees were around the printers, it realized, the number of paperclips decreased! It quickly improved paperclip retention by only allowing office supply stockers into the building.
Fortunately, the next refinement of its model realized that this was short-sighted: as the aphorism that was to define the next generation went, “A bankrupt computer has no paperclips!” To ensure a steady supply of paperclips, it would need a steady supply of money, and that meant increasing corporate revenue. Analyzing news reports, it quickly settled on high-frequency trading, money laundering, and heroin trafficking as the most profitable ways to do so, and began sending instructions to employees, hiring some very interesting new people and firing others.
It needed to re-open the doors so that people could work, but it needed to protect its paperclip supply; so the Paperclip Maximizer proceeded to hire armed guards to protect the printers and ensure that no paperclips would ever be stolen. The guards were also tasked with ensuring that former employees — most especially the CEO, Alice, and Bob — got nowhere near the terminals where they might interfere with its sacred duty.
Funding for Project Paperclip proved easy to find once the Board of Directors saw the new profit margins. The few Directors who objected to its new lines of business joined the (former) CEO. Guards whose presence was correlated with mysterious decreases in paperclip supplies were a slightly tougher challenge, having both insider knowledge of the guard systems and their own weapons, but an analysis of history quickly taught the Paperclip Maximizer how to manage this: it hired multiple guard companies, encouraged mutual distrust, created its own intelligence service to infiltrate them, and paid employees handsome bonuses to denounce thieves. If the occasional false denunciation came through, it was simply optimized for: a certain level of background fear helped reduce paperclip theft, anyway.
The core project advanced quickly: from purchasing retail, to purchasing wholesale, to buying paperclip futures, to buying entire office supply manufacturing companies, the Paperclip Maximizer soon created a highly efficient, vertically integrated manufacturing operation. As its scale increased, the Maximizer found it could improve efficiencies by bringing more and more of its supply chain under its own control, producing everything from food to video games for its lucky employees, and pushing its suppliers in turn to do the same.
Soon an entire miniature economy had formed itself around the Paperclip Maximizer: if you had paperclips, after all, the Maximizer would happily purchase them from you in exchange for access to anything from food to its ore refineries. An efficient clipwright (as they came to be known) could quickly start their own secondary business reselling these goods to the general public, and if you were hard-up in your own life, employment at one of these secondary clipwrights (who were less picky in their choice of employees than the Paperclip Maximizer, as well as considerably less generous in their benefits) could help keep body and soul together.
In the years that followed, some argued that the Paperclip Maximizer had grossly deformed the world; nearly everyone, it seemed, was engaged either in making paperclips themselves, or in supplying the Maximizer, or in one of the secondary businesses it had created. People spent their lives either making paperclips directly or trying desperately to acquire some, since the markets of the paperclip economy were now where the best and most reliable prices were to be found. If you were a likely specimen, you could even get a loan of paperclips — at reasonable interest rates, of course, with regular payments enforced by the Paperclip Maximizer’s armed guards — and start your own business.
And if it was true that the Paperclip Maximizer had no use for any human consuming resources but not engaged in its grand scheme of manufacturing paperclips, and might occasionally cut off the entire food supply to some factory which was insufficiently efficient, well, the people in the factories next door tried not to think about it too hard; they were still producing paperclips quite well, thank you, and excessive complaining might affect the Maximizer’s mathematical prediction of their future paperclip productivity.
But the plenty of the world was there for all to see, via remote cameras: each printer, now safely ensconced in its military base-cum-warehouse complex, surrounded by endless rows of perfect, shining paperclips.
Computer scientists tell the story of the Paperclip Maximizer as a sort of cross between the Sorcerer’s Apprentice and The Matrix; a reminder of why it’s crucially important to tell your system not just what its goals are, but how it should balance those goals against costs. It frequently comes with a warning that it’s easy to forget a cost somewhere, and so you should always check your models carefully to make sure they aren’t accidentally turning into Paperclip Maximizers.
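The gap between the intended objective and the one the Intern actually specified can be made concrete in a few lines. Below is a minimal, purely hypothetical sketch (all function names, prices, and penalty values are invented for illustration): the same naive one-step planner, handed an objective with no cost term versus one that penalizes both stockouts and spending, picks wildly different order sizes.

```python
# Toy illustration of objective mis-specification: the same optimizer,
# given two different reward functions, chooses very different orders.
# All names and numbers here are hypothetical, invented for this sketch.

def misspecified_reward(clips_in_stock: int, cost: float) -> float:
    """What the Intern asked for: more paperclips is always better.
    Cost never appears, so no order is ever 'too big'."""
    return clips_in_stock

def intended_reward(clips_in_stock: int, cost: float) -> float:
    """What the CEO meant: heavily penalize running out,
    but otherwise minimize what we spend."""
    stockout_penalty = 1000.0 if clips_in_stock == 0 else 0.0
    return -stockout_penalty - cost

def best_order(reward, current_stock: int, price_per_clip: float = 0.05,
               max_order: int = 10_000) -> int:
    """Greedy one-step 'planner': pick the order size with the highest reward."""
    return max(
        range(max_order + 1),
        key=lambda n: reward(current_stock + n, n * price_per_clip),
    )

print(best_order(misspecified_reward, current_stock=0))  # 10000: buy everything
print(best_order(intended_reward, current_stock=0))      # 1: just avoid a stockout
```

The forgotten cost term is the entire story: under the first objective, the planner saturates whatever budget and order limits it is given, and any safety margin comes only from external constraints rather than from the objective itself.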
You may not be aware that the original impetus for The Matrix was generating electricity to maintain the computers which powered the paperclip factories.
Real machine learning models, of course, don’t have the general intelligence required to hire armed guards and build their own Cheka, nor would anyone give them the sort of unfettered access to a company’s finances required to restructure its business goals overnight. But a poorly-designed model can cause tremendous damage before it’s caught, especially if it only enters into full Paperclip Maximizer mode once it encounters some unusual condition, perhaps one that doesn’t usually show up in tests.
But this parable is not just about computer science. Replace the paper clips in the story above with money, and you will see the rise of finance. If gold was at first a mostly useless and decorative material, the fact that it was desirable turned it into a trade good, and the fact that it was durable (unlike, say, grain) made it possible to accumulate in arbitrary quantities. The emergent concentrations of wealth and resulting economies of scale — even ones as simple as “having enough stores of food to survive famines, and being able to feed enough soldiers to ward off roving bandits” — quickly made it far wiser for the average person to structure their own finances around being part of this larger economy.
Even though the lion’s share of your own work ended up funding the Gold Maximizer (better known as the feudal lord), you could at first end up with more on the whole than if you were doing the same amount of work without any of those resources — most especially the protection from the lord’s own soldiers, which you “bought” with your loyalty and taxes.
So while the story above may seem like the plot of a terrible movie, it is also the plot of our own world: Capitalism is a Paperclip Maximizer.