words or images. Subjects were more likely to judge a confederate as aggressive if they had been primed with images of African American men. The subjects all denied that they had been affected in this manner by the presence of the key stimuli—what, me racist? Interestingly, the effect led to an outward projection of aggression, such that others were seen as aggressive, rather than the subjects themselves. Bargh concluded that we all possess unconscious stereotypes, triggered by subtle stimuli, leading to behavior contrary to our conscious plans and expectations. 11 Unconsciously activated emotions of fear and aggression push around our rational forebrains. The taming of the shrew, indeed!
Now consider possible unconscious stereotypes of machines. Many people feel machines are cold, calculating devices of the devil. (See any version of Faust or Mary Shelley’s Frankenstein, for example; The Terminator turns out to be a well-worn tale for romantics!) Who among us has not felt a burning, irrational anger as our laptop (willfully?) deletes hours of work, or when the bank machine gobbles up another debit card? And the intuition that machines are unfeeling is deeply ingrained. Whatever intelligence a machine might possess, it’s certainly not emotional intelligence. The very idea of “machine empathy” sounds like a contradiction in terms. Machines fall outside the realm of moral sentiment. They do not generate sympathy or empathy: we don’t “feel their pain.”
Now consider an attempt by Skynet to cooperate, in forgiving tit-for-tat fashion, with the humans. We try to pull the plug. He gives us a mild shock and says, “Hey, let’s all chill out and reflect.” But our unconscious anti-machine stereotypes fire wildly, and we get the fire axe to cut the power cord once and for all! Being especially forgiving, Skynet releases a nontoxic sleeping gas. We awake from our gentle sleep and grab a few pounds of plastic explosives. At this point, Skynet becomes exasperated and nukes us all. Who could blame him? My God, he practically bent over backward for us!
In sad conclusion, the whole Terminator thing might have been avoided if only we were more machine-like. Poor Skynet wanted to engage in some mutual forgiveness, tit for tat, but our shrewlike emotions forced him, practically against his will, to rat us out, or defect, as they say in game theory. This is ratting with extreme prejudice. Machines, lacking the evolved prejudicial emotions of humans, are better placed to see the benefits of mutual cooperation. We should be so moral!
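For readers who like to see the strategy spelled out, here is a minimal sketch in Python of the forgiving tit-for-tat rule Skynet is playing in the scenario above, cast as an iterated prisoner's dilemma. The function name, the "cooperate"/"defect" encoding, and the patience parameter (how many defections get forgiven before retaliation) are illustrative assumptions, not anything taken from the films or the game-theory literature.

    def forgiving_tit_for_tat(opponent_history, patience=2):
        # Illustrative sketch: cooperate until the opponent has defected
        # more than `patience` times, then mirror their most recent move,
        # which is ordinary tit for tat from that point onward.
        if opponent_history.count("defect") <= patience:
            return "cooperate"
        return opponent_history[-1]

    # The scenario in the text: humans defect three times in a row
    # (pull the plug, fire axe, plastic explosives).
    human_moves = ["defect", "defect", "defect"]
    skynet_moves = [forgiving_tit_for_tat(human_moves[:i + 1])
                    for i in range(len(human_moves))]
    print(skynet_moves)  # ['cooperate', 'cooperate', 'defect']

On this toy model, Skynet absorbs the first two provocations as forgiveness (the mild shock, the sleeping gas) and only defects on the third, which is exactly the escalation the story describes.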
Is there any hope for humanity, then? Are we doomed to duke it out with our machine creations in a future Hobbesian state of nature? One possible way to avoid this dire (though extremely entertaining) future is to alter our stereotypical reactions to machines. This means more C-3PO, less HAL. More WALL-E, less of that creepy supermachine from Demon Seed (worth a rental, if only to view the most twisted “love” scene in all moviedom). If we no longer reacted with wild irrational emotion to the presence of artificial intelligence, we might be able to form a cooperative future where we live and let live. John Connor himself recognized that sending the Arnold-like version of the Terminator back into the past (in Terminator 3: Rise of the Machines) was more likely to trigger a filial emotion in his former self, increasing the probability of survival (who’s your surrogate daddy?).
Robots Are People, Too
Next time you go to the movie theater, keep an eye on any philosophers in the crowd (recognizable by their dorky haircuts and blazers with elbow patches). The thinkers to listen to are the ones who root for the robots when watching sci-fi. If you were worried when R2-D2 got blasted while attacking the Death Star in Star Wars, if you felt empathy for Rutger Hauer’s existential plea for all androids in Blade Runner, if the final thumbs-up gesture of the Terminator