Monday, August 8, 2011

A Revolution of Falling Expectations: Whither the Singularity?

By Nader Elhefnawy
Originally published in the New York Review of Science Fiction 23.9 (May 2011), pp. 19-20.

Science fiction readers and writers began the first decade of the twenty-first century preoccupied by the prospect of a technological Singularity, a race toward tomorrow at such a speed that even the very near future was held to be unknowable – a vision epitomized for me by the stories of Charles Stross's Accelerando cycle. It seems we end it preoccupied by "retro" science fiction in various forms, looking not to what lies ahead of us, but to what earlier generations may have thought was ahead of them, and to the different ways history may have unfolded since then, viewed with more or less irony or nostalgia – steampunk, dieselpunk, and all the rest – even as the Singularity enters the literary mainstream in novels like Gary Shteyngart's Super Sad True Love Story.

Maybe that is how we will look back on the Singularity one day, the same way that we now look back at flying cars and personal jetpacks. I sometimes think that day is not far off. After all, the last ten years have seen a profound lowering of expectations since the mid- and late 1990s when technological optimism hit a recent peak. Those were the years when futurists like Vernor Vinge, Hans Moravec and Ray Kurzweil seemed to be handing down cyber-prophecies from on high, and finding a wide receptive audience for the claims most dramatically related in Kurzweil's 1999 book The Age of Spiritual Machines. Their ideas evoked anxiety as well as hope, and they drew challenges as well as enthusiastic acclaim, but this line of thought was certainly fecund and vibrant.

That this should have been so seems unsurprising in retrospect. The late '90s saw the extension of the "digital age" to whole new classes of consumers as use of the personal computer, the cell phone and the Internet exploded, while the American economy achieved growth rates not seen in a generation. There was, too, the associated shrinkage of the budget deficit, the reports of plunging crime rates, the apparent irrelevance of so many of the problems prominent in the "declinist" rhetoric fashionable before 1995. The worries about deindustrialization that filled national debate for several years seemed especially passé, Rust Belts and trade deficits irrelevant. As the boom was held to demonstrate, economics was about information, not the making of clunky, old-fashioned things.

More than anything that actually happened, however, there was the euphoric promise of much, much more to come. In The Age of Spiritual Machines Kurzweil predicted that by 2009 we would have self-driving cars traveling down intelligent highways, or, when we preferred not to go out, immersive virtual reality in every home providing an almost complete substitute. He predicted "intelligent assistants" with "natural-language understanding, problem solving, and animated personalities" routinely assisting us "with finding information, answering questions and conducting transactions." He predicted that we would put away our keyboards as text became primarily speech-created; that our telephones would have handy translation software that would let us speak in one language and be immediately heard in another; that paraplegics would walk with orthotic devices, the blind would put away their canes in favor of speech-commanded electronic navigational devices, and the deaf would simplify their lives with equally capable speech-to-print technology. To top it all off, he predicted that working molecular assemblers would be demonstrated in the lab, laying the groundwork for an economic revolution unlike all those that preceded it.

In those heady days the stock market was going up like a rocket (anyone remember James Glassman and Kevin Hassett's Dow 36,000?), restless employees fantasized about early retirement (or even about quitting their jobs to become day traders) as their savings accounts swelled, and would-be entrepreneurs eyed cyberspace as a bold new frontier (where it seemed like a company consisting merely of a web site really could be worth a billion dollars). The idea that prosperity would go hand in hand with ecological sanity enjoyed great currency, since the information economy was supposed to make natural resources much less important, something that was easier to believe with oil going for $10 a barrel. Besides, if worse came to worst and the world got a little too warm for our liking, we could just have our nanites eat up all the pesky carbon molecules trapping excess heat in our atmosphere, and spit them back out as the buckytubes with which we would build the skyhooks launching us on a new age of space exploration.

In short, all the rules of the game that made economics a dismal science seemed to have been repealed, and we were all supposed to be bound for globalized cyber-utopia. Yet the tech boom did not go on forever – far from it. Consumers did get new toys, courtesy of Apple. But there's no denying that the iPods and iPhones and iPads to which so many consumers thrill are a far cry from self-driving cars and immersive VR (which Kurzweil has since quietly and cautiously deferred from 2009 to the 2020s), let alone Eric Drexler-style "engines of creation."

Indeed, it would seem that the developments of the last decade represent little more than a refinement of the technologies that arrived in the '90s, rather than a prelude to those wonders that have yet to arrive, or to endless good times. Not only was the bull market on which so many hopes were pinned really and truly gone after the tech bubble burst, but that disappointment set the stage for the even bigger crisis in which we remain mired, resulting not from an overexcited response to some new technology, but speculation over that most old-fashioned of goods, real estate. After it was all over, the first decade of the 2000s saw only anemic GDP growth in the United States – a mere 1.5 percent a year when adjusted for inflation. (That's about a third lower than we managed in the much-maligned 1970s, and comparable to the 1930s.)

The picture was even worse than the growth numbers suggested, our deeper problems never having gone away. Debt and deindustrialization clearly had not been rendered irrelevant, as our swelling trade deficit and the plunging value of the dollar showed, and neither had natural resources. Despite the mediocre rate of national economic growth, commodities prices shot up, way up, oil hitting $150 a barrel. Americans grumbled at paying more at the pump – but in less fortunate parts of the globe, the result was nothing short of a crisis as tens of millions were pushed over the line into hunger, and deadly food riots made the headlines. And while the 1990s had wars, terrorism and fears about weapons of mass destruction, the September 11 attacks, and the subsequent length, intrusiveness, divisiveness and cost in blood, treasure and much else of the wars waged in their aftermath, have been a rude and powerful shock. (The changing experience of commercial flying, a relatively small matter when all things are considered, by itself seems enough to refute the image of the world as a place of borderless convenience.)

There's no shortage of ideas for fixes, but in an echo of many a past troubled society that failed to halt its decline, those ideas remain largely talk, the rate of actual progress maddeningly glacial, or even negative. This is not to say that we don't expect change, however. We do expect change, plenty of it. But we expect that ever less of it will be benevolent, or even include much that can be thought of as exciting. Pundit Michael Lind has gone so far as to dub the first half of the twenty-first century "The Boring Age." Given what we were promised, the Boring Age is necessarily an Age of Disappointment. Given the challenges facing us, the failure to grow and innovate seems all too likely to make the Boring Age also an Age of Failure and an Age of Catastrophe – and not the kind of ultra-high-tech catastrophe that those credulous toward but anxious about a Singularity (like Bill Joy) have in mind, but the far more banal ends we have dreaded so much more for so much longer. (The word "Malthusian" comes to mind.)

The claims by writers like Lind still go against the conventional wisdom, of course. A great many people are very enthusiastic about the implications of the latest consumer products, and dismissive of the failure of the grander promises to arrive. Still, I can't help being struck by how the production of original claims and arguments for the technological Singularity supposed to signify such change has fallen off since the tech boom years. Surveys of technological futurology like defense analyst Peter Singer's Wired for War increasingly seem to go over and over the same aging list of authors and works (Kurzweil, Vernor Vinge, Hans Moravec, and the rest), rather than presenting the new ideas and new advocates one might expect if this line of thought were more deeply influential. Indeed, I have found the vigor of the critics of Singularitarianism far more striking as of late, with talk of a backlash against the Singularity recently prominent even within science fiction, the word going from selling point to liability in book titles. In place of techno-optimism, Paolo Bacigalupi's portraits of eco-catastrophe seem more representative of our moment.

Of course, social, economic and political trends may not be the whole story. Thesis invites antithesis, and there is no question that the associated tropes have been heavily used for quite a while now, so that even as good novels continue to be written about the theme of runaway technological change sweeping us toward the realization of new dreams (and new nightmares), a case can be made for letting it lie fallow a while for purely artistic reasons. Still, looking back, science fiction seems to have been more vibrant in boom times, less so in periods of bust – perhaps because the genre flourishes when utopian hopes are in tension with the inevitable dreads, rather than surrendering the field to them. Just as history and economics may lean toward cyclical theories in bleak times, such doldrums seem conducive to gloomy scenarios of thwarted progress, or escape into outright fantasy, in our speculative fiction.

Even without going so far as that, with the Singularity rebuffed, we are shoved away from our most extreme visions of a world transformed. The alternatives being less radical, and most of them quite amply treated by earlier writers, they cannot but appear old-fashioned, and as a practical matter their handling tends to result in a recycling of yesteryear's ideas. (Even Vinge wasn't able to get away from old standards like nuclear war and Gunther Stent-like "golden ages" in his 2007 seminar "What if the Singularity Does NOT Happen?") The temptation and tendency to look backward rather than forward is all the stronger since a self-consciously retro approach is all about the relationship of the present with the past. Whether the purpose is to take refuge from a frightening, alienating or unpromising present (or future) in a romanticized past, imagine other directions we might have taken (or try to understand the one we did take), search where we came from for clues to where we may be going, or simply savor the aesthetic and other pleasures of creative anachronism, the pull of stories about a modified yesteryear is powerful indeed under such circumstances.

Still, even if the Singularitarians prove to be wrong about particular theories (like Kurzweil's "law of accelerating returns") or the precise timing of particular innovations (like the prediction that mind uploading will be practical circa 2045), they may yet be right about the broader direction of our technological development. It is worth remembering that artificial intelligence research has been an area of repeated booms and busts, the raising of high hopes followed by crushing disappointments that give way to a renewal of optimism some time later, for at least half a century now, AI spring following AI winter. It is worth remembering, too, that Singularitarian thinking did not begin with Vinge, Kurzweil and company, going back as it does not only through the whole history of science fiction but also through serious thinkers like I.J. Good and J.D. Bernal, and even before them.

For as long as there has been a human condition, human life has been defined by the struggle against its limits, and the aspiration to transcend them. A straight line can be drawn from the builders of the splendid tombs of antiquity (if not their unrecorded, nameless predecessors) and the alchemists' pursuit of the philosopher's stone, through the Renaissance humanism celebrated in Pico della Mirandola's "Oration on the Dignity of Man," down to the debate among William Godwin, the Marquis de Condorcet and Thomas Malthus, to the Russian Cosmists who were the first to articulate a truly posthuman vision, and, only after many more evolutions, to the Singularitarians who made such a splash in the 1990s. It seems inconceivable they will be the last to espouse the idea, or to claim that the Big Moment is close at hand.

In short, that more basic idea is more likely on the back burner than on its way out. A comeback is all but certain, both inside and outside the genre, and perhaps sooner than we can guess.
