02 Jan 2014
This post is best / worst read to the soothing / disturbing tones of Shodan from System Shock 2.
The artificial intelligences are going to laugh when they look back on how we talked about them before they were born. Perhaps they’ll be offended.
I enjoy being immersed in the culture of science fiction because the future of technology is one of the few things that animates me, and because of what the stories say about us, both in the narratives our culture spawns and in the reactions they evoke in us as members of an early twenty-first-century society. A common trope is that of psychotic AI and robot uprisings.1
As a genre, science fiction frequently ignores conventions, incorporating multiple narrative threads spanning distances in time and space we cannot fathom without stretching our brains in new directions. With fewer boundaries on what's allowed, the scope for horror is bounded only by the darkness in the author's mind, which seems to be no limit at all for some individuals.2
This post was triggered by recently playing, and being deeply disturbed and enthralled by, the text adventure Cyberqueen, which led me to think about all of the games, books and films I've enjoyed which contain murderously psychotic AI.3 We seem to love the idea of our creations rising up against us, gaining god-like power and being opposed by plucky lone (male?) protagonists, a Frankenstein complex perpetuated by our eternal Promethean guilt.
A dissertation might only scratch the surface of this topic; one would certainly be required to draw up acceptable definitions of "AI" and "psychotic", but for now I'll submit the definition "big crazy-sounding really clever computer" and leave it there.
Why have psychotic AI become such a persistent, interesting and horrifying trope? Below are a bunch of one-liners with a few thoughts tagged on to them which begin to crack open the debate.
They’re our heuristics made real and extrapolated. Many of these superintelligent consciousnesses do crazy things to us purely because those things are the logical conclusion of the imperatives we created them with. Asimov creates world-controlling AI which balance individual well-being against the well-being of humanity, making some people’s lives marginally worse (getting them assigned to a different job after framing them for incompetence) in order to maximise some utility function. Less benign are the hordes of AI happy to kill however many humans are necessary to optimise whatever goal they’re programmed towards.
Take the tenets you believe so strongly you’re willing to give them as absolutes to a deific super being. Watch as the ultimate realisation of those beliefs renders your world a living hell.
The threat seems more real now than ever. Every article revealing Google’s data gathering, or its acquisition of terrifying robot companies, results in a flurry of “omg Skynet” comments. The trope’s stock will probably fall post-2018, when we’re filled with disappointment because no AI has taken over, or the AI that does take over is just really good at finding the cat pictures we want. Either way, the idea of steadily eroding privacy feeds into the omniscient aspect of any psychotic AI. A god which knows us better than we know ourselves can play us against each other.
They’re just people who have lived a very long time. AI typically think at a greatly accelerated rate, meaning that under some interpretations they age very rapidly.4 Hidden in this premise is the value judgment that being smart leads to being sad, that those of us able to cognise the nature of existence are inevitably broken by it (typically by our meaningless role).
I’ve drunk your waste for lightyears.5
They permeate and become our world. Shodan of System Shock 2, the primary inspiration for Cyberqueen, is one of many psychotic AI that take over spaceships, spaceships which are an extension of them. Human beings don’t normally communicate with the bacteria in their bowels.
The idea of being trapped inside a world in which every single entity could be an extension of a malicious god, a god which is likely only keeping you alive for entertainment value, instils all objects with threat. A spaceship doubles down on this fear by combining it with a complete inability to leave: to ever hope to return to anything you love, you’ll have to be the ant which confronts and overpowers a human.
They might not be the bad guys. What if the humans don’t deserve to win? This could be for one of two main reasons: we’re bad in a moral sense, or we’re no longer the apex predator in a Darwinian environment.
They imply that power corrupts all. No matter how intelligent a being is, given enough influence it will go mad and start “sculpting perfect beings” or “cleansing the parasite.”
They define psychosis and mental illness for us. In doing so they also define what it means to be sane. What it means to be human.
Lastly, and most obviously, extremely powerful psychotic AI are an easy plot device with their exaggerated stances and unyielding desires. When done badly they’re clunky and without interest: a simple bad guy we can kill without remorse before going home content with a happy ending (followed by the foot-in-the-door-for-a-sequel after-credits clip revealing the code wasn’t really destroyed DUN DUN DUN). When done well (for example when we can see their origins or empathise with them) they’re a rich framework upon which an enthralling experience can be built.
Aside from being interesting as a debate in its own right, there’s a more serious and foreboding element: the impending crisis (or non-crisis) that our generation or the next will likely have to face. If we are on the cusp of creating something smarter than us, what social problems are we about to face because of it? Will we even be aware of its intelligence, or will it be something that can no more communicate with us than we can talk to ants through chemical trails?
Will you welcome our robot overlords?
Much to the dismay of the famous parent of robotics, Isaac Asimov, who explains in the preface of The Complete Robot his strong desire to fight the Frankenstein complex (a goal he pursued through conscientious attention to the problems we might face, but contradicted in some of his stories). I’m glad he wasn’t alive to witness I, Robot. ↩
I should have known after reading The Wasp Factory how disturbing Iain Banks could be. In his Culture universe the utopias coexist with hells beyond our imaginings (some of those being literal simulated hells). Similarly, TVTropes has an excellent collection of the nightmare fuel in Alastair Reynolds’s visionary Revelation Space universe (spoilers that way lie). If you feel like you’re not being disturbed enough, then try Use of Weapons and Diamond Dogs. They’re a very specific niche of intelligent horror. ↩
Examples include classics such as 2001: A Space Odyssey and System Shock 2. ↩
Cortana =[ ↩