It is a tiresome cliché of discussion about artificial intelligence that people keep referencing an Arnold Schwarzenegger B movie from 1984. (We all know which one.)
The idea, in line with the Frankenstein complex that Isaac Asimov had already had occasion to criticize a half century before (to no effect, apparently), is that we will create a powerful artificial intelligence, and it will kill us all.
Those who invoke the film pay very little attention to how the machine did so in the movie--specifically, by initiating a nuclear exchange. That fact would seem all the more significant given that the movie was made and released during the "Second Cold War" and the associated "Euromissile crisis," when the danger of nuclear war, and the protest movement against war and against nuclear weaponry, were particularly prominent--the years of the miniseries The Day After, movies like WarGames, and novels like David Brin's The Postman and Stan Lee's (the "other" Stan Lee's) Dunn's Conundrum--such that the theme was not implausibly on James Cameron's mind. (Indeed, Terminator 2 can seem a reminder, if any were needed, that the danger did not vanish with the Cold War's end in the manner that those intent on lionizing the victory would have liked the public to believe, while trigger-happiness with nuclear weapons was again a significant element of Cameron's 1989 film The Abyss.)
But people never think of The Terminator as a warning about the dangers of nuclear weaponry and the arms race, without which "Skynet" would never have been created, let alone given the means with which to devastate humanity--means humans in the long run might have used no less destructively even without the involvement of artificial intelligence. My guess is that this is partly because memories of the nuclear danger have been buried, with opinion-makers by no means eager to revive them (they blatantly call people alert to that danger "cowards," the world-view of a Barry Goldwater having become the mainstream here as in so many other areas of political life), while there is instead a preference for fixating on less plausible and less troublesome dangers--an AI apocalypse, like a zombie apocalypse, being a lot less politically contentious than fears of nuclear or climate catastrophe, with catharsis through contemplating the former perhaps a way of taking the edge off of worries about the latter. It is another evasion, of the kind that so characterizes political life in our time.