Thursday, June 11, 2009

Racing Down the Information Superhighway: Computers in 1990s Film

By Nader Elhefnawy
Originally published in THE INTERNET REVIEW OF SCIENCE FICTION, February 2009


When President Barack Obama recently pledged to "renew our Information Superhighway," it struck me that I hadn't heard that phrase in a long time. Indeed, this wording, which was supposed to evoke the futuristic, instead has acquired the exact opposite quality. It's retro, recalling 1990s-era thinking about computing and computers—and among other things, their depictions in pop culture. In particular, I think of the decade's outpouring of computer-themed films.

Of course, Hollywood had sporadically made computer-themed films before, like 1957's Desk Set or 1977's Demon Seed. Quite a few are clustered in the early 1980s, shortly after the introduction of the personal computer, like Best Screenplay Oscar nominee WarGames (1983), the early cyber-romance Electric Dreams (1984), and the John Hughes comedy Weird Science (1985). The related interest in household robotics in that decade (which proved to be just a flash in the pan) also turned up in the background of a number of films and shows, like the TV series Silver Spoons (1982-1987) and Rocky IV (1985)—though these generally took a backseat to more dramatic mergers of robotics and AI, as in The Terminator (1984) and Short Circuit (1986).

Nonetheless, the theme would be far more prominent in the films of the 1990s, and this was no accident. It was the falling price of computing power, and the widened availability of Internet access, which turned the personal computer from an expensive toy, status symbol, or at best, occasionally useful luxury, into a consumer essential at that time (at least, for the industrialized world's "middle classes," though access also increased outside this group). And of course, there was the "tech boom" surrounding it, the idea that this was the one sector of the economy that really mattered, and that it was truly exploding. (1)

Put another way, this was a time when computers were both ubiquitous enough for unprecedented numbers of people to be in conscious, direct contact with them—and at the same time, novel enough for the fact to be topical. While only a comparative few of the resulting films were significant successes, and these were outnumbered by mediocre and commercially disappointing efforts, even the flops make for interesting time capsules, and several would go on to be surprisingly influential, both via cult followings and via the inspiration they provided other, more successful creations.

Hackers as Heroes
Just as computers became more commonplace in the popular consciousness (its cinematic component included), so did the sorts of people associated with them, and in particular, hackers. By the mid-1990s there had already been a handful of films featuring hacker heroes, like the aforementioned WarGames, but they became increasingly common as the 1990s progressed. The caper film Sneakers was slightly ahead of the curve in 1992, the deluge really hitting around the middle of the decade. (2) Sandra Bullock followed up her success in 1994's Speed with The Net the next summer, in which she played a software analyst who finds herself caught up in an elaborate computer infiltration scheme. The film was a middling performer at the box office, but spin-offs followed nonetheless, including a television series that lasted for one season (1998-1999), and more recently, a straight-to-video sequel, 2006's The Net 2.0.

That was more than could be said for Hackers, which centered on a group of young hackers who stumble onto (and are forced to combat) an oil company cyber-security chief's scheme to bilk his employers out of millions (in part by extorting a ransom in exchange for not staging multiple oil spills with a computer virus). Released the following autumn, it is remembered (by those outside its cult following) more for its cast than anything else: a very young Angelina Jolie in her first starring role in a major feature film (she'd already headlined the 1993 cyberpunk B-movie Cyborg 2), Jonny Lee Miller (who would be made famous by Trainspotting the next year), and the ever-obnoxious Matthew Lillard.

This turn naturally presented filmmakers with the challenge of conveying the experience of hacking by some means other than pointing the camera at a sweaty face staring into a screen or a pair of hands pounding away at a keyboard (which manages to be both uninteresting to look at and to explain almost nothing about what hackers are really doing). (3) Appropriate visual metaphors were hard to come by, however, as the considerable and much-argued-over efforts in Hackers demonstrated. And of course, for sheer spectacle and visceral excitement, computer battles come nowhere close to more traditional hand-to-hand combat, shootouts, car chases and explosions, so that computers were most interesting when creating big effects outside cyberspace. (4)

Not surprisingly, a more successful approach than centering a whole film on hackers and hacking was the inclusion of those elements in a broader plot. In its updating of the Bond series, 1995's GoldenEye, besides offering a host of unmistakably early post-Cold War tropes (Russian mobsters, loose Soviet nukes, leftover Cold War vendettas), prominently featured a pair of dueling computer programmers, Boris Grishenko (Alan Cumming) and Natalya Simonova (Izabella Scorupco). (5) And in Independence Day the next summer, Jeff Goldblum's David Levinson took out the alien mothership with a computer virus, in a twist on the denouement of H.G. Wells's The War of the Worlds. Of course, this did not prove to be the last time a computer programmer had the fate of the world in their hands.

The Neon-Lit Alley Up Ahead: Cyberpunk on Screen
The future envisioned in cyberpunk was commonly viewed as a reaction to many of the economic and political developments of the 1980s: American industrial decline relative to Europe and Asia, the intensification of global economic integration, the "privatization" of economic life and the unshackling of corporate power that went with it, a greater polarization between rich and poor. Putting it that way, of course, makes the genre seem rather dated, but sure enough, those developments far outlasted the decade. If anything, in the '90s they gained force as a result of the collapse of the Soviet bloc (which opened a large part of the world more completely to those same forces), and the economic stagnation of Japan (which deflated the fad for neo-mercantilism that its rise had encouraged). "Globalization," no longer hindered by the division of the world into rival ideological blocs, or by petty, predatory economic nationalisms, was now the watchword.

And of course, there was a certain inertia that carried a stereotyped '80s-style "bleak future" through the science fiction films of the early part of the decade, turning up even in some unlikely places, like 1992's Freejack (in which it proved a far less colorful backdrop than the future Robert Sheckley portrayed in the source novel, 1959's Immortality, Inc.) and 1993's adaptation of Nintendo's classic video game, Super Mario Bros. (the directors of which were, notably, the co-creators of the Max Headroom television series).

Still, it became less fashionable to depict urban decay and the darker side of corporate power, and more so to anticipate cheerful consequences of neoliberalism-run-amok, as Nora Ephron did in You've Got Mail. (6) One result is that the 1990s saw no English-language cinematic expression of that vision on a par with 1980s films like Blade Runner (1982) or Robocop (1987, sequels in 1990 and 1993). However, even as the long-anticipated Neuromancer film continued to languish in "development hell" (as it still does today), there were film versions of two of William Gibson's classic Sprawl stories, Johnny Mnemonic (1995) and New Rose Hotel (1998) (the latter, admittedly, not a computer-themed story, biotech being the source of its MacGuffin). (7)

Neither film was a notable commercial or critical success. However, Johnny Mnemonic is noteworthy for being based on a script by Gibson himself. Contrary to what one might guess, he considerably fleshed out the plot of his short story, producing a conventional chase film which we have little difficulty following, and in which we do, ultimately, get to know what is in Johnny's head, and why it matters. The movie, which suffered from, among other things, laughably bad acting from just about all the principals (Dina Meyer, perhaps, excepted), is also worth noting for giving us a visual depiction of Gibson's highly influential vision of the experience of cyberspace. (Some also view the casting of Keanu Reeves as the protagonist as a precursor to his later starring in The Matrix.)

New Rose Hotel stuck surprisingly close to the original story, which was a strength as well as a weakness. The original short story is dazzlingly crafted and quite poignant, but its particular tale of love and betrayal, depicted as a succession of remembered moments in the memory of the nameless narrator, hardly lends itself to cinematic adaptation. The result is stylish and atmospheric, but heavily dependent for its effect on the cast, which included Willem Dafoe as the narrator (credited here as "X"), Christopher Walken as his buddy and partner in crime, Fox, and Asia Argento, the clear stand-out, as Sandii.

Realities, Real and Otherwise
The cyberpunk social vision aside, there was also the matter of the particular technologies on which those stories so often hinged, in particular the mind-machine interfaces that redefined the nature of experience—which, by this point, had far escaped cyberpunk's bounds. Indeed, by the early 1990s stories of people sucked into video games (in the manner of 1982's Tron), and of computerized figures emerging into the real world (as with Kelly LeBrock's Lisa in Weird Science), were already such clichés that I remember science fiction magazine submission guidelines listing them among the story topics they did not care to see in their slush piles. (8) Nonetheless, the hype surrounding virtual reality injected new life into the basic concept by making it appear that it would not be long before we could lose ourselves in virtual environments, or encounter virtual entities so "real" as to be indistinguishable from ourselves.

There was, of course, the Stephen King story "The Lawnmower Man," which hit the big screen in 1992 and spawned a sequel in 1997. Like the Michael Crichton book on which it was based, the 1994 film Disclosure shoehorned the use of the technology into its bigger story about gender politics. So did 1995's Virtuosity, which, twelve years before American Gangster, had Denzel Washington and Russell Crowe battling it out (except with the good guy and bad guy roles reversed). VR also figured prominently in Strange Days that October, and 1999 gave us David Cronenberg's Existenz, The Thirteenth Floor, and of course, The Matrix.

As the list indicates, most takes on the subject were epistemological (and other) horror stories, the blurring of the line between reality and fantasy a source of nightmare rather than dream (even in 2000's The Cell, which explored a therapeutic use of the technology). In Existenz, the allure of the virtual actually fostered an ideological reaction against it, and the film begins with an attempt on the life of celebrated game designer Allegra Geller by a fanatical anti-VR activist. Most other films, however, were less ambiguous in their sympathies, their heroes clearly the ones battling the masters of illusion, most dramatically in The Matrix.

The Computer Apocalypse
With so much attention paid to networked computers, and so much anticipation regarding the speed with which the technologies involved would develop, it is hardly surprising that some were anxious; or that threats emerging from these technologies, like malignant artificial intelligences, would again become popular cinematic villains.

They were on the whole less popular during these years than catastrophes of a more natural sort (like the tornadoes, volcanoes and menacing celestial objects of so many big-budget releases), or science fiction tropes with a more "retro" quality (like the aliens of Independence Day and 1998's Godzilla). Still, they turned up in films like Virtuosity and The Lawnmower Man 2, and of course, The Matrix, the last of which, as the comments in the previous sections indicate, could be seen as a cap, summary and synthesis of this whole thrust in filmmaking, drawing together all these previous streams (as well as a great many other elements old and new, from the revival of interest in martial arts films to Cornel West, who had bit roles in the sequels) in by far the most successful cyber-film of the decade.

In the future posited by the film, the world has been conquered by artificial intelligences. In need of an energy source after humanity struck back by blotting out the sun (to cut the machines off from the solar energy on which they relied), the AIs enslaved humanity as a substitute, sustaining it in a perpetual simulacrum of 1999 in order to harvest electrical current from human bodies (the titular "Matrix"). (9) However, a group of humans escaped and founded the underground fortress-city of Zion, from which they carry on their struggle against the Machines, liberating such humans as they can in the hope of one day freeing the whole species. And in the middle of it all is Keanu Reeves's "Neo," a prophesied newcomer who just may be "The One" to end the war in their favor.

The result was not just a much-quoted, oft-imitated and frequently parodied blockbuster, but a cultural moment that impacted everything from fashion and interior design to philosophy, evident in the vast academic literature to which it gave rise. (10) Its two sequels, while still commercial successes which had their fans, did not have quite the same impact as the original, but it is hard to see how they could have. The unrealistic expectations so often surrounding the later installments of a series, and the practical difficulty of delivering on the first film's promises in a cinematically satisfying way aside, by November 2003, when The Matrix Revolutions hit theaters, the '90s were clearly over.

And Afterward
At the other end of that stream of filmmaking, we have also found ourselves on the other side of a crucial transitional period in regard to information technology, just as we previously had with many of the technologies that came before it, from the airplane, to atomic energy, to the space launch rocket. As in all those other cases, ideas that once seemed radical have become simply a part of the background—while assumptions common among the True Believers were dashed. It has become a banality to say that today's crop of young adults (at least, above a certain minimum of affluence) grew up "online," taking cell phones, e-mail and Internet search engines for granted as ever-available utilities, scarcely aware of the alternatives. Yet other, associated ideas have fallen by the wayside, like the prospect of the virtual quickly and conveniently substituting for the physical. (Consider, for instance, how little of the daily commute telecommuting has actually spared us, as the oil price spike of 2003-2008 painfully reminded us all. (11))

Naturally, film has reflected this. Hacking became a standard action movie trope, sometimes more central to the plot, as in 2001's Swordfish or 2007's Live Free or Die Hard (a fun film, though to me at least, the idea seemed more dated than its makers must have intended), at other times less so. (12) And tellingly, the requisite skills increasingly became part of the repertoire of more well-rounded protagonists. In 2000's Charlie's Angels, the team hacks an air-launched missile in mid-flight, but we were never really meant to take it seriously, only to see it as one more preposterous testament to the titular trio's talents. By 2006's Casino Royale, though, the quiet mention of a rebooted James Bond doing his own hacking was simply a detail.

The same goes for other cinematic uses of the computer, where the novelty faded even faster. The publicity for You've Got Mail highlighted the movie's "introduction" of IT into romantic comedy (essentially, by remaking 1940's The Shop Around the Corner with AOL substituted for old-fashioned letter-writing), but this did not stay "innovative" long enough for the list of distinctly computer-themed romances to get much longer. When American Pie hit theaters the following summer, the technological novelty of the sequence in which Jim's clumsy attempt to seduce Nadia ends up web-cast to their whole school attracted little notice. In short, just as hackers became a standard part of action movies and thrillers, an online component became routine in movie romances and sexual liaisons.

Unlike personal computing and e-mail, virtual reality never became part of our everyday lives with the speed expected (in 1999 Ray Kurzweil predicted that "VR" would be standard stuff in our households by 2009), let alone the means by which we interact with cyberspace, as suggested in the Sprawl stories of William Gibson. Accordingly, it retained something of its exoticism even after interest in it waned. (13) Interest in the cyberpunk conception of the future waned too, with movies set in the near future less prone to reflect that stream of thinking about how society will develop. (14) Indeed, it is the earlier version of the machine apocalypse presented in The Terminator that continues as a franchise on the big screen, the next film in the series coming out this summer. As that fact demonstrates, computers will go on appearing in movies, prominent in them to varying degrees, but the combination of ubiquity, newness and expectation that existed in the 1990s is now behind us.

1. This of course proved to be nonsense, as many investors realized to their pain back in 2000, but it did affect popular attitudes profoundly.
2. "In Hollywood, where there's around one idea a year, movie concept '95 seems to be 'anything computerized,'" ran a Newsweek article covering a slew of films appearing that year which included Johnny Mnemonic, Virtuosity, The Net, Strange Days and Hackers. "Hollywood's Big Idea: A Hard Disc's Night," Newsweek Jun. 19, 1995.
3. Through it all, however, the viewer got little sense of what hacking actually involved, and came to assume that it was a sort of black nerd magic that could support any plot convenience. Indeed, the viewer got little sense of what computer use in general was like. (Just compare the speed and dazzle of the Internet in The Net with what computer use actually was like at the time, or what it's like now for that matter.)
4. Or at least, in a version of cyberspace that looked much like the world outside it. Part of the success of The Matrix, certainly, was its presentation of cyberspace in exactly that way, as a simulacrum of 1999, except more exciting from an action movie perspective, because the laws of physics didn't apply there, and the principals could act as if they had superpowers.
5. Rather more low-key, 1994's Clear and Present Danger included a hacker among Jack Ryan's allies, Greg Germann's "Petey," and an online confrontation between Ryan and his antagonist, Robert Ritter.
6. Perhaps more noteworthy than its rather weak serving of romantic comedy was the film's paean to New Economy capitalism (and expression of "postmodern" New Economy conservatism), not only in its celebration of corporate superstores, Starbucks coffee and plain old consumerism-as-self-realization, but in its presentation of business mogul Joe Fox as the Nice Guy Everyone Loves (Tom Hanks, just being Tom Hanks), while left-wing intellectual Frank Navasky is a pretentious jerk, played by the Nasty Guy Nobody Loves (Greg Kinnear, in the kind of role he seems to have specialized in ever since).
7. Both stories appeared in the Burning Chrome collection, which I reviewed for Tangent Online back in 2007.
8. The list of such films is lengthened considerably if one counts in stories revolving around the prospects of artificially implanting memories, as in 1983's Brainstorm or 1990's Total Recall (a theme which notably was part of Strange Days).
9. The idea of the human race confined to virtual reality simulations was not quite as novel as it seemed to many then. Olaf Stapledon posited such a scenario in his 1937 classic, Star Maker, over six decades earlier. However, there the technology was used on the "Other Earth" in an attempt by the planet's elites to control a working class that technology had made superfluous. This Marxist speculation, of course, was recast in the films as a question of the balance between human and machine, rather than an extension of social conflicts among humans themselves, though some have read the films' human vs. machine conflict as a metaphor for the latter.
10. In particular, the Matrix was widely embraced as a metaphor by all who believe the reality we are conventionally presented with is not representative of how things really are, especially if the veil mistaken for the actuality is a system of control, as with many Marxists.
11. The ubiquity of work-at-home scams (they seem to outnumber real opportunities by a factor of 48 to 1) testifies powerfully not only to the desire of people to work at home, but also to how rarely those who want such jobs actually find them.
12. Live Free or Die Hard, notably, was based on John Carlin's "A Farewell to Arms," an article he published in the May 1997 issue of Wired magazine.
13. Instead of convincing sensory immersion, what we got was fuller detailing and more intricate interactivity in the worlds we see on our computer screens, as in the Sims series of games and World of Warcraft (which have yet to be the subject of their own English-language film, despite a few memorable television appearances, including a celebrated South Park episode, "Make Love, Not Warcraft"). Whether that will link up with the promise of photo-realistic video games in the next couple of generations of consoles remains to be seen.
14. There have been exceptions, of course, including 2002's Minority Report (based on the Philip K. Dick short story of that name), 2003's Paycheck (again based on a Dick story), 2004's I, Robot (which actually hybridized Eando Binder and Isaac Asimov), 2006's Children of Men, and 2008's Babylon A.D. (an adaptation of Maurice G. Dantec's novel Babylon Babies). (Stretching the meaning of "near-future," one might also include 2001's A.I.) Nonetheless, it is noteworthy that most of these films were based on older science fiction (even when incorporating more recent concerns and anticipations, as with the rising sea levels in A.I.).

Monday, June 1, 2009

On Heckler

The documentary Heckler (2007), co-produced by and starring Jamie Kennedy, is about the "heckling" of artists (in the main comedians, but creators of all sorts).

It consists mainly of interviews with celebrities running the gamut from a "Who's Who" of stand-up comedy, to filmmakers both celebrated and reviled, to journalist Christopher Hitchens.

The subject appears a valid one.

There is no denying that critics are often ignorant, narrow-minded, unfair, petty and just plain vicious in their assessments. (Indeed, it is both striking and sad how inspired people can get when tearing someone else apart, and especially, in the case of "Internet" critics, how much time and effort they're willing to put into their attacks.)

There is no denying that certain prejudices tend to prevail among critics, and that playing the critic too much for too long can produce an undue harshness, and even an obsession with finding fault, in their reviewing.

There are also some grounds for suspicion on the part of working artists toward critics who are not coming from the same place (more commonplace, I suspect, in film than in literature, where there are more practitioners, a large portion of whom are active reviewers). No one should review a work in a medium or genre they do not like or understand, or without some sense of how artists actually work.

Nonetheless (and this is, admittedly, coming from a critic), to say that there is a lot of bad criticism out there, and that it can be problematic for performers and other creators, is not to say that no one has the right to express an uncomplimentary opinion (or that all such opinions are necessarily "heckling"); that would be an absurdity. To suggest that any opinion about a movie or a comedy routine proffered by anyone not in the business is baseless and illegitimate is equally absurd.

And those on camera in the documentary seem to say exactly these things, frequently, and perhaps unsurprisingly: most of it seems to consist of the interviewees venting about the raw deals they feel they got in the past, in an exercise in catharsis and revenge rather than reasoned argument. The result is not only unbalanced, but backfires in making an astonishing number of the interviewees look like raving, hate-filled egomaniacs, at least as bad as the hecklers against whom they are lashing out. (Kennedy appears especially clueless in his apparent inability to understand that someone out there might have a valid and honest dislike of films he has made.)

Making matters worse, the line-up presented here is likely to make even a broad-minded viewer feel that some of them deserved at least some of what they got (even while feeling sympathetic to those interviewees they do admire)--all the more so after seeing them at their nastiest here. (Interestingly, some of those who appear in the film have publicly "heckled" each other, a point not mentioned in the course of the documentary, though their response to it would have been interesting.)

And of course, the fact remains that for all their troubles, the people doing the complaining here are the ones who've won the Dream Jobs. There are far, far worse places to be than theirs--something most of those depicted so unflatteringly in the documentary certainly realize, not that you'd guess it from this doc.
