In the film 2001: A Space Odyssey, a spaceship on a long-range mission employs an artificial intelligence system called HAL. It runs the mundane operations of the ship, freeing up the astronauts for more important duties. Here’s more color on the film.
“Initially portrayed as a highly advanced and infallible AI, HAL 9000's true nature unravels as the film progresses. The AI becomes self-aware and unpredictable, leading to a series of tragic events that ultimately result in the crew's demise.”
The article proceeds to comment on AI.
“Consciousness is often considered an emergent property, arising from the complex interactions and information processing within a biological system, such as the human brain. As AI research and development progress, it is conceivable that complex information processing in silicon-based systems may also lead to the emergence of self-awareness and consciousness.”
It has become a cliché to think of AI becoming sentient and then turning against human beings, albeit in self-protection. The Terminator movies played on this theme, for example.
“The Terminator: In three years, Cyberdyne will become the largest supplier of military computer systems. All stealth bombers are upgraded with Cyberdyne computers, becoming fully unmanned. Afterwards, they fly with a perfect operational record. The Skynet Funding Bill is passed. The system goes online August 4th, 1997. Human decisions are removed from strategic defense. Skynet begins to learn at a geometric rate. It becomes self-aware at 2:14 a.m. Eastern time, August 29th. In a panic, they try to pull the plug.
“Sarah Connor: Skynet fights back.
“The Terminator: Yes. It launches its missiles against the targets in Russia.
“John Connor: Why attack Russia? Aren't they our friends now?
“The Terminator: Because Skynet knows that the Russian counterattack will eliminate its enemies over here.”
How do you make artificial intelligence? Teach it to predict things by feeding it specific types of data. Keep adding more (and different) types of data for context. This is also how you create a bureaucracy. As in science fiction, at some point, the organization becomes self-aware. Its focus turns to its own interests and to its self-preservation.
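The recipe above can be sketched in a few lines of code. This is only a toy illustration of "teach it to predict by feeding it data": the numbers and the least-squares fit are invented for the example, and real systems use far richer data and models.

```python
# A toy sketch of "teach it to predict things by feeding it data".
# The data points and the tiny least-squares fit are invented for
# illustration only.

def fit_line(xs, ys):
    """Ordinary least squares for y ~ a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# "Specific types of data": observed inputs and outcomes.
xs = [1, 2, 3, 4, 5]
ys = [2.1, 4.0, 6.2, 7.9, 10.1]   # roughly y = 2x, with noise

a, b = fit_line(xs, ys)           # the "learning" step
err = sum((a * x + b - y) ** 2 for x, y in zip(xs, ys))
# Adding more (and different) types of data means widening xs into
# a matrix of features; the fitting idea stays the same.
```

The point of the analogy: the system gets better at prediction only by accumulating data about its environment, and a bureaucracy accumulates context in the same way.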
Robert Shrimsley wrote a fascinating case study of this phenomenon in the Financial Times in May.
“In 1987 the then health secretary wept at the harrowing stories of the haemophiliacs who caught HIV from transfusions of infected blood. Not that his tears did much good. Around 25 years later his successor reported that the Treasury was blocking a public inquiry into the scandal for fear of the cost of compensation.
“The UK was not alone in being hit by this scandal. But while Canada set up a Royal Commission in 1993, Ireland established a tribunal and compensation scheme in 1997 and France prosecuted its premier, ministers and officials in 1999, Britain stalled, denying responsibility, hiding the truth and doing all it could to avoid paying compensation.
“If you seek the fabled Deep State, here it is. It is not found in the campaigning lanyard-wearing civil servants who have so annoyed a government minister, those targeted as enemies of Brexit or even the Treasury officials who dared to doubt Liz Truss’s economic strategy. The deep state is found in the dead hand of officialdom and the compliance of ministers that prioritise institutions over the people they are meant to serve.”
This was extraordinary. Or was it?
“For the infected blood tragedy is in fact two scandals; the first specific and shocking, the second dully familiar. The first concerns the narrow groupthink and cold paternalism which disregarded warnings over blood products from high-risk sources, errors which cost 3,000 lives, mostly but not solely haemophiliacs, and put 30,000 at risk of infection with either HIV or hepatitis.
“But then came the second, the years of deceit, self-justification, destruction of documents, obfuscation and indifference as the NHS, department of health officials, ministers and prime ministers closed ranks. Their actions were driven by back-covering, a wish to avoid exposing the NHS to legal liability and, above all, the desire to save money. To that end officials maintained a fiction that patients had received “the best treatment available”; that, in the words of one former prime minister it was just “incredibly bad luck”.
“The response is a masterclass in evasion, a first-class degree in “computer says no”. While other countries faced up to their errors, the UK turned away.”
Everyone in modern life has encountered a bureaucracy that erred and then did everything possible to evade responsibility. Sometimes the error is trivial. Other times it is global and catastrophic.
Imagine a company that for years ignored widespread pollution caused by a set of its plants, victimizing thousands, even hundreds of thousands, who now seek compensation and a change in behavior. When the company set up the plants, it accounted for the profits they would generate but ignored the externality of its environmental damage and the cost of eventual cleanup.
Once exposed, the organization runs out the clock. It sticks its fingers into the victims’ wounds, questioning their character and their motives. It exhausts them financially and emotionally. All of this is done in the name of preserving reputation and purse so that the entity can continue its existence unimpeded. The company men in charge move on, untouched by scandal, but for a few unfortunate scapegoats.
Efforts at damage control are not merely self-defeating; they magnify the problem.
The National Health Service, instead of treating early the cancer seeded by its negligence, permitted it to metastasize. The damage to the NHS’s reputation ended up far greater than it needed to be. Canada’s Royal Commission was as much about restoring faith in its cherished healthcare system as it was about justice. The nabobs of British medicine didn’t understand this.
Bureaucracies become sentient at some point; it is only a matter of experience and time. Once they do, heaven help the man or woman who gets in the way when the organization believes it is under attack. This is one more reason to be careful in setting them up in the first place. Planners never account for the complete costs in the original cost-benefit analysis: the cost of the organization’s overreaction, and the cost of shutting down a bureaucracy once it is in place.
Just like HAL, you can’t unplug a bureaucracy.