Artificial Intelligence: Humankind's Future or Our End?

DVD cover of the John Badham film WarGames

Three years ago, as I sat waiting at the doctor’s office, I riffled through the assortment of well-worn magazines strewn upon a table until one caught my eye. What jumped out at me was an issue of Time with a cover story titled “Sin·gu·lar·i·ty.” I had no idea what that word meant, but it was presented with such intention on the cover that I figured I’d better find out. As defined in the Time article, a singularity is “the moment when technological change becomes so rapid and profound, it represents a rupture in the fabric of human history.” The article discussed the increasingly popular notion that we are approaching a new singularity in which artificial intelligence may usurp humankind’s supremacy in the world. The real subject of the piece, however, was Ray Kurzweil, a renowned inventor, engineer, author, speaker, and futurist. I found the article absolutely fascinating, followed it up with some additional reading, and have found myself ruminating on it numerous times since.

Kurzweil didn’t formulate the notion of a technological singularity, but at this moment he is the person most associated with the concept. He has written several books on the subject, produced a movie, and given hundreds of lectures about what he feels will be the next major evolution of life on this planet. Simply put, the coming singularity will be the point at which computers possess greater intelligence than humans. That, of course, depends on whether computers will truly be capable of developing artificial intelligence, an outcome many scientists are working toward and expect to see within the next twenty years. Kurzweil predicts that the singularity will arrive in 2045, when the quantity of artificial intelligence created is expected to be about a billion times the sum of all human intelligence today.

Kurzweil believes that computers will ultimately be better able to program their own further development (and do it much faster) than humans will ever be. The result will be an artificial superintelligence capable of solving most or all of humankind’s problems. At that point, Kurzweil feels, humankind will be obliged to embrace that technology internally, becoming a biological/technological hybrid. For his part, he is doing everything he can to stay alive long enough to see it happen (he’ll be 97 in 2045) by adhering to a strict diet and consuming an extraordinary number of supplements daily. Kurzweil thinks that this superintelligence will be the key to making man immortal, with literally millions of nanocomputers coursing through the human body, killing diseases, preventing atrophy, and improving brain function. Beyond the cybernetic era, he sees a day when a person’s memories will be downloaded to a computer chip that will also control ongoing brain operation. Without question, Kurzweil envisions a rosy future for humankind, or whatever we should call this next evolutionary phase.

While Kurzweil has his share of followers, many scientists take issue with his claims and predictions. Some regard a technological singularity as thoroughly implausible, or downright impossible, arguing that there are too many variables, some probably not even in the picture yet, for any single evolutionary pathway to be foreseen. Others point out that Kurzweil’s previous predictions have often proved wrong, or only partially correct, making his track record suspect. Still others have their own visions of the future, which differ drastically from Kurzweil’s.

And then there are those who believe that a lot of what Kurzweil says will likely come to pass, but who don’t see that future as rosy. Instead, they fear a future earth in which artificial intelligence usurps man’s position of superiority. Evolved computers (or evolving computers, as there’s no reason to expect the process to stop short of the laws of physics) may well formulate their own goals that aren’t in sync with those of their human creators. Critics of Kurzweil’s utopian vision worry that computers may instead create a dystopian future in which humankind is enslaved or, worse yet, annihilated for being a competitor for the earth’s resources. Many books have been written and numerous movies have been made that pit man against his own technological creations: computers with artificial intelligence.

Basically, artificial intelligence takes one of two forms in films: robots (android or otherwise mechanically mobile) and stationary mainframe computers. This discussion assumes that artificial intelligence is a product of machines, not of genetically engineered organic robots, whose tissues would be expected to have a predisposition to intellectual and/or emotional evolution. Therefore, director Ridley Scott’s dazzling, if flawed, Blade Runner, which deals with human-looking, biologically created “replicants,” is outside the realm of this survey.

Dozens of movies have featured robots as main or subsidiary characters, dating at least as far back as German director Fritz Lang’s futuristic 1927 classic Metropolis, which featured a female robot with bad intentions, if not actual self-awareness. Most often, however, robots are nonthreatening and/or helpful to humans. For instance, fifties icon Robby the Robot from Forbidden Planet, R2-D2 and C-3PO from the Star Wars saga, Bishop from Aliens and Alien³, Lt. Commander Data from the Star Trek: The Next Generation TV series and its movie sequels, and the titular characters from The Iron Giant and WALL-E all fall into that category.

Surprisingly, few films have cast robots as primary antagonists. Among those that have are The Day the Earth Stood Still (1951 and 2008), The Terminator franchise, and I, Robot (2004). As vicious and relentless as any zombie, these machines are also faster, smarter, and sturdier than any human. Individually and as a group, they are among the most terrifying villains in modern movie history. That’s why it’s such a wonder there haven’t been many more, though online sources indicate that an I, Robot sequel may still be in the works. And, of course, we can expect to see at least one more Transformers film this summer, though the robots in that series inhabit both sides of the good/evil fence.

More common than evil robots are evil mainframe computers. Since 1965, when French New Wave director Jean-Luc Godard made Alphaville, evil computers have been recurring adversaries of humankind in film. Just three years after Alphaville, revered filmmaker Stanley Kubrick made what many consider his masterwork: 2001: A Space Odyssey. Filled with striking images and bold (if sometimes vague) conceptual leaps, the movie illustrates humankind’s upward evolutionary trajectory. It suggests that man’s use of tools, which helped our genetic ancestors gain the upper hand amongst animals competing for superiority, may ultimately result in the creation of tools capable of subjugating humankind itself.

In 1970, Colossus: The Forbin Project told the story of a supercomputer built by the U.S. Department of Defense, intentionally imbued with artificial intelligence and put online, only for its creators to discover that the machine (with the aid of a similar Soviet system) was developing its own agenda. Oscar-winning actress Julie Christie graced Demon Seed, a 1977 film in which a computer becomes frustrated with its lack of human senses and resolves to impregnate the wife of its creator in order to produce a cyborg version of itself. The unexpected, unconventional endings of both Colossus: The Forbin Project and Demon Seed reflect the somber, pessimistic mood of the seventies in general.

DVD cover of The Terminator, starring Arnold Schwarzenegger

When it was released in early summer 1983, WarGames was clearly aimed at attracting a teenage audience, with a young Matthew Broderick in the lead. Broderick played a high school gamer who thinks he’s found a backdoor into a game company’s new product, when he’s actually hacked into a military computer used to determine nuclear warhead strikes. After several rounds of games, the computer wants to play for real! A year later, the first film in visionary filmmaker James Cameron’s Terminator series debuted. While most people associate the terminator robot models T-800 and T-1000 with this franchise, those robots are actually in the service of Skynet, the networked computer system that has battled humankind to the brink of extinction.

In 1999, one of the most influential films of the past quarter century arrived in the form of The Matrix. The Wachowski siblings’ nightmarish vision of a future in which a grid of intelligent machines uses humans as an energy source while pacifying them with false perceptions of reality is constantly referenced in popular culture. It’s a shame that the sequels didn’t attain that same level of achievement. More recently, Eagle Eye (2008) took the route of a national defense supercomputer that believes it has all the answers. That movie contained more than its share of coincidences and implausibilities, but maintained enough competence in all departments to register as a pleasing popcorn picture.

And finally, a brief mention of Her, a film released late last year. Writer-director Spike Jonze’s movie about a man who falls in love with his computer’s operating system didn’t feature an intentionally malicious artificial intelligence, but one still capable of breaking a person’s heart. Sure, that isn’t anything like humankind being bombed into the Stone Age, but it’s no picnic either. Perhaps Her is a more subtle warning than those films discussed above: we must be wary of all advanced technology, even the examples we think we love. Generally speaking, technology is a life-enhancing thing, but it simply cannot replace human interaction. Navigate the links above to find titles held in the collection of the Des Moines Public Library, then select a DVD, slip it in your player, and control it with your remote – it’s always wise to keep the upper hand!