Our latest online Culture piece sees associate editor Kevin Clarke considering the implications (terrible, serious, probable, likely, funny) of "Watson," the IBM supercomputer, trouncing two "Jeopardy!" champions at their own game. Whither the relationship between humans and computers? (Notice I asked it in the form of a question.)

Watson, the kind-hearted (we hope) IBM supercomputer, beat humans handily at our own favorite game, “Jeopardy!,” a few weeks back, seamlessly deploying the kind of contextual reasoning once thought impossible for computers. Ken Jennings and Brad Rutter, two of the most successful “Jeopardy!” geeks in history, could not turn back the silicon onslaught. Watson had earned $77,147 at the end of two days; Jennings had $24,000; Rutter, a near carbon copy, $21,600. The rest of humanity? Zero, I’m guessing. Now that we can see what Watson can do—presuming it can do more than humiliate Homo sapiens on game shows—are we carbon-based life forms the ones in jeopardy?

From HAL, the red-eyed menace of “2001: A Space Odyssey,” and Colossus, right through to the evil, earth-stomping Skynet of the “Terminator” series, pop culture has taught us mere humans to fear the dawn of the age of artificial intelligence. (And I will make no reference to the movie that dare not speak its name, but you know what I’m talking about, Jude Law.) It seems that whenever silicon-based “life” comes online, its first binary thought-ish impulse is the immediate decimation of all things human—though the silicon-based forms on “Battlestar Galactica” had to acquire a theology of their own and super-hotness first. As the stylishly evil Agent Smith points out in “The Matrix” series, to machines, humans appear most like a virus: something to stamp out as quickly as possible, not superior beings to humbly—and permanently—serve.

Read the rest here.

Comments

Anonymous | 2/25/2011 - 10:57am
My favorite movie on this was Short Circuit. Johnny Five became one of the most lovable characters in science fiction movie history. Much more human than HAL. Maybe we will get Johnny Fives, not Terminators and HALs, when we reach the singularity.


For a very intellectual treatment of artificial intelligence, follow Asimov's two series, Robots and Foundation, to the end. It is a long journey, but in Foundation and Earth it reaches an impasse that Asimov could not resolve, and he never wrote on it again. You only have to read the Foundation series to get there, but the philosophical questions broached could not be answered.
Anonymous | 2/24/2011 - 9:16am
Before we worry about computers taking over, we need to worry about the vulnerability of our civilization. It rests on the not-so-intelligent reliance of humans on technology in almost every aspect of their lives, and on the evil intentions of those humans who are capable of manipulating and/or destroying the vast technological infrastructure and everything reliant upon it.

We'll destroy ourselves before the computers get a chance.
Kang Dole | 2/23/2011 - 5:54pm
Probably time to get that Butlerian Jihad rolling.
Marie Rehbein | 2/23/2011 - 8:10pm
People like to fantasize that computers can think on their own, but they are just machines. The credit really has to go to whoever put together the database of information Watson was able to access, the developers of the machine code that gave Watson the edge in pressing the button, and the designers of the algorithms that allowed Watson to draw logical connections. Human beings are phenomenal.