Friday, January 27, 2012

Thoughts on "Brigadoom"

Watching the fourth season Sanctuary episode "Fugue," I was again struck by the tradition of "musical episodes" in recent television series, especially those in the science fiction and fantasy genres.

This is usually discussed as having begun with the Buffy the Vampire Slayer episode "Once More, With Feeling" – but as has been the case with many a genre trope, Lexx did it first, with "Brigadoom" – and pulled off a very tough act.

It is worth mentioning here that when the show started airing on the Sci-Fi Channel back in 2000 (with a handful of episodes plucked out of the second series, without regard to the show's development, or the season's story arc), I wasn't impressed with what I saw. Lexx was certainly . . . different. I was intrigued by some of the characters and concepts, but the episodes themselves didn't amount to much. When I saw the commercial for a musical episode, I was skeptical.

But that didn't last, and neither did my earlier skepticism about the show. "Brigadoom" (cowritten by show creator Paul Donovan and the late Lex Gigeroff) didn't simply have a few lines sung instead of spoken (the way Fringe's "Brown Betty" did, for instance), but instead offered a full-blown musical which dramatized the back story of a central character, and presented a turning point in the season's story arc (which was itself a whopper, the title of its final episode, "End of the Universe," not being an overstatement).1 It helped, too, that the songs were memorable.

As the show's own writers have said in various interviews (mine included), the quality of the show varied wildly from one episode to another. "Brigadoom" was the show at its very best, which was great, and it has since been a fan favorite.

Alas, the achievement of cast and crew here has been overlooked in the general tendency to ignore or put down the show, which has been extreme, even compared with other TV space operas, toward which critics and audiences (hardcore fans aside) are famously ungenerous. Still, those writing about such episodes should remember that early moment in the history of television science fiction when characters suddenly broke out into song.

1. It is worth noting that this wasn't Lexx's first musical moment. The second part of the four-part miniseries that comprises series one, "Supernova" (which aired two years before "Brigadoom") also contains a brief but striking musical interlude.

Thursday, January 26, 2012

On the Eureka Paradigm

Popular expectations regarding technological progress in areas from artificial intelligence to fusion, from cloning to space, have commonly been overblown. The response on the part of many is simply to regard futurists' predictions with irony, but this has never seemed a useful position to me. The reality is that we can hardly avoid making guesses about what's to come, given that we are so often obliged to think in the long term, and to plan, especially where science and technology are concerned. Guessing poorly can have serious implications for public policy (some of which I recently touched on in my Space Review article, "Space War and Futurehype Revisited"). Consequently, instead of irresponsibly dismissing the whole enterprise of prediction, the appropriate response is to try to get better at making predictions, and at responding to predictions appropriately. (Even if the best we can hope for is to get things somewhat closer to right somewhat more often, or simply to better position ourselves to cope with the inevitable surprises, this seems to me well worth our while.)

That said, the number of pitfalls in the way of those studying such issues is staggering. Today it is almost impossible to point to any discussion of public issues which is not carried on in a fog of misinformation, and disinformation spread by vested interests and ideologues. The situation is even worse when expert knowledge is crucial (as in virtually any discussion into which science enters), and the uncertainties are large (as when one makes guesses about things that haven't happened yet, exactly the issue here), because of how much more difficult it becomes for even informed observers to make their own judgments. (Think, for instance, of how Creationists succeeded in manufacturing a "debate" over "intelligent design.")

However, this falls far short of exhausting the list, and I would argue that one big stumbling block has been a deeply flawed "folk perception" of science and technology shaped, in large part, by media produced for (and often by) people without expertise in this area. Consider, for instance, what we get in so much pop science journalism. News items about developments in science and technology, just like news items about anything else, are typically intended to be brief and punchy and accessible (read: attention-grabbing and entertaining). Breathless stories about exciting new possibilities fit these criteria far better than critical, nuanced pieces attentive to the obstacles in the way of realizing those possibilities. (Indeed, many of them make it sound as if the possibility is already a reality, as does the questionable title of this otherwise useful piece: "Quantum Keys Let Submarines Talk Securely.")

Such stories skew perceptions, making it seem as if a great many ideas are much more developed than they really are, and the press's lack of a memory worsens matters. After telling readers and viewers about the thrilling innovation that just might change all our lives (with the "might" sometimes in the small print), the press generally offers no proper follow-up; the story just fades away even as the impression it created remains. The failures lapse into obscurity, while successes are likely to get so loudly trumpeted that it can seem as if the latter are all that exist.

These attitudes may be reinforced by the endless repetition of the claim that we live in an era of unprecedentedly rapid technological progress, making historical precedents irrelevant, and implying the imminent bursting of all the old boundaries. They are reinforced, too, by a tendency to identify "technology" with "information technology" (which goes as far as Google's news aggregator compiling, under the heading of technology, numerous items that are not about technology as such, but rather the financial fortunes of IT firms). This has the effect of making the state-of-the-art in consumer electronics the yardstick of technological progress, a perception which can be very misleading given that other areas (like food and energy production, medicine and transport) have seen much slower change. Raymond Kurzweil's "Law of Accelerating Returns," and the technological Singularity with which it is associated, are a particularly extreme version of this kind of thinking, but this futurehyped, IT-centric outlook is indisputably mainstream, so much so that the correctives offered by observers like Robert J. Gordon, Jonathan Huebner, Bob Seidensticker and Michael Lind have had little traction with Apple brand-worshipping consumers who believe that their new cell phone is the telos of human history.

Still, for all its flaws, it has to be admitted that pop science journalism is far less influential than fiction, and especially television and movies, which appear to play far and away the biggest role in shaping the popular image of science. Last year a poll taken in Australia found that TV and movies were the primary source of information about science for three-quarters of those surveyed, a pattern likely typical for other developed countries. One likely consequence of this is our habituation to the thought of technologies that are only imaginary, with "hard," technologically-oriented, extrapolative science fiction set in the near future likely having the strongest impact on our thought as "rumors of the future" (as suggested by scholar Charles E. Gannon's study Rumors of War and Infernal Machines).

Additionally, science fiction routinely dramatizes the processes of science and engineering, and in the process propagates a particular view of them. Short shrift is given to the realities of science and engineering as specialized, collaborative, often slow-moving activities whose participants cope with multiple knotty problems at once (some of them theoretical in nature); as dependent on massive amounts of organization, equipment and money controlled by people who are not scientists (indeed, are often scientific illiterates), and so susceptible to the vagaries of economics, business, politics and even personal whim; and as subject, under even the best circumstances, to fits and starts, dead ends and flashes-in-the-pan, with projects to develop new technologies often concluding in nearly useless prototypes, on the drawing board, or even at the concept stage.

As an example of what we are much more likely to see, consider the Syfy Channel show Eureka, which centers on an imaginary town in the Pacific Northwest inhabited by a large portion of the United States' scientific elite. Eureka makes some concessions to reality in its depiction of the Global Dynamics corporation running the town, and the involvement of Pentagon functionaries in the affairs of its researchers. However, this is superficial stuff, the episodes mostly playing like twenty-first century "Edisonades," with geniuses equally adept at fields as diverse as astrophysics and molecular biology flitting from one project to the next, many of them undertaken singlehandedly in their garages; casually tinkering with toys as exotic as androids and faster-than-light drives; and during the inevitable crisis, inventing on-the-spot solutions to staggeringly complex problems that somehow always work.

Taken together, all this leaves most people imagining science as nerd-magic, and picturing R & D as a matter of omnicompetent nerd-magicians pulling solutions out of thin air – so that if a problem does need solving all we have to do is get the nerds on it and, voila, it's solved. (And if the nerd-magicians don't work quickly enough, we just have to hector them until they do, the way Sheriff Jack Carter, like many another rugged sci-fi hero, hectors the science types for a fix when there's trouble.) It also leaves audiences believing that, to use William Gibson's phrasing, "The future is already here – it's just not very evenly distributed," in the literal sense that every gadget one has ever heard of must be out there, somewhere.

In short, it has them believing in the "Eureka Paradigm" of scientific and technological R & D.

Of course, there are lots of reasons why fiction still depicts science and technology in this way, long after such depictions have lost any credibility they may once have had. A fairly good one is that this simplistic approach is a better fit with dramatic requirements, easier to turn into a compact, intelligible, accessible story in which we get to focus on a few characters as interesting things happen – and to be fair, the point is to entertain, not inform. Yet it is an appalling basis for thinking about how research actually proceeds, and we take such images seriously at our cost.

Sunday, January 22, 2012

Review: Deep Six, by Clive Cussler

New York: Simon & Schuster, 1984, pp. 432.

Clive Cussler's Deep Six (1984) can be thought of as something of a transitional work for the author, occupying a space in between his early, tightly focused, short novels like Raise the Titanic (1976), Vixen 03 (1978) and Night Probe (1981), and the later, sprawling, epic action-adventures which began with Cyclops (1986).

Even though I usually read right through Cussler's books after picking them up (back when I did read them), I started this one a few times before making much headway in it. This had much to do with Cussler's handling of the book's two main plot threads--the first, an investigation of a marine disaster in the northern Pacific, and the second, the mystery following the disappearance of the presidential yacht (with the President, Vice-President, Speaker of the House and the President Pro Tempore of the Senate all aboard). The first mystery engages Pitt and his friends from NUMA exclusively for the first third of the book or so. They only become involved with the second mystery in the book's middle, and play only a minor role in that investigation until the last third of the story, when the connection between the two series of events finally becomes clear.

Filling the gap in between is a great deal of inside-the-Beltway intrigue among officials of the National Security Council and the Secret Service--which at times left me with the impression that I'd put down Cussler's book and picked up one by Tom Clancy instead. As is usually the case with these novels, Cussler's included, the portrait of D.C. struck me as simplistic and inauthentic, devoid as it is of the sausage factory-like quality of real-life politics. This is a Washington without lobbyists and political action committees and revolving doors between industry and government, where politicians who take campaign contributions from shady special interests are "bad apples" and the "power elite" is described as "elected"--in short, a sanitized civics class textbook's version of governance (much as seen in other Cussler novels, admittedly, but more problematic here because of the foregrounding of this part of the story). The foreign politics are equally lacking in nuance, down to the foreign villains, who are, not unexpectedly, one-dimensional clichés that occasionally cross the line into racism, with the unsurprising result that the Soviet strategy comes off as astonishingly clumsy next to the tactical and technological genius the KGB and its partners display in executing the scheme.

In short, others have done this stuff before and after and in many cases better, and it is poor compensation for what is missing in the earlier parts of the story, where in their limited appearances Pitt and company merely contribute to a couple of underwater searches (the second of them treated rather briefly), and engage in some mostly stationary detective work. We are more than halfway through the book before Pitt has his first brushes with the bad guys, and it is some time after that before he gets up to his usual antics. The result is a story that gets better as it goes along, with the last third providing exactly the kind of thing for which Cussler's readers come to his books, especially in the action-packed finale full of over-the-top heroics, flashy high-tech and creative anachronism as the clock ticks down to disaster. But getting to that point is occasionally a slog.

Mad Men: A Second Look

I recently finished watching the first three seasons of Mad Men.

The second viewing confirmed many of my first impressions, but I have come away with some new thoughts as well.

Where the acclaim is concerned (the show won its fourth Emmy for Best Drama in a row last year), it would seem critics are responding not only to the aesthetic-nostalgic appeal of the show's recreation of a more glamorous-seeming era; or its iconoclastic portrayal of the early 1960s (the echoes of Thomas Frank's The Conquest of Cool and Richard Yates' Revolutionary Road not groundless, but greatly exaggerated); or even its giving the audience the guilty pleasure of vicarious indulgence in un-p.c. behavior while still feeling superior to it.

The truth is that almost everything about the show is a perfect fit with highbrow critics' views on what constitutes Good Drama – the slow pace and the heavy use of indirectness and implication (i.e., subtext) to drop a massive freight of irony on the heads of its mostly unlikeable characters, which might seem like liabilities to many a viewer, are a big plus in their book. The upper-middle class social setting, and the premise's allowing for a great deal of writing about writing, media about media, the positioning of identity, domestic life and suburban dissatisfaction as central themes, are likewise much in line with their tastes. And the association of show creator Matthew Weiner with the last cable drama to win such heaping (over)praise, The Sopranos (which worked in a not dissimilar manner), only helps.

All this makes the show not just a triumph of style over substance, but a reminder that pandering to the snobbery of the upmarket review pages (and the viewers who mindlessly follow their lead) can pay real dividends.

Wednesday, January 18, 2012

SOPA Blackout Protest

I found out about the protest against SOPA (H.R. 3261, innocuously named the "Stop Online Piracy Act") too late to participate properly. So I'm posting this message to indicate my solidarity with the protestors - and offering this link to blackout participant Wikipedia's FAQ on the subject.

Until tomorrow, please regard this site as also blacked out.

This blog will resume its normal operations then.

Friday, January 13, 2012

New Review: Scarecrow Returns, by Matthew Reilly

New York: Simon & Schuster, 2012, pp. 368.

Matthew Reilly's Scarecrow Returns (published last year in his native Australia as Scarecrow and the Army of Thieves) opens with a burst of action as a mysterious "Army of Thieves" captures a Russian island in the Arctic Sea – one which happens to be home to a secret Cold War-era research installation, and the site of a Soviet superweapon now in play. This event marks the beginning of a global crisis as seen by the American and Russian crisis response teams. As luck would have it, Marine Recon Captain Shane Scofield, and his longtime friend and comrade-in-arms "Mother," are with an equipment-testing team nearby – and virtually all that the American government can call on to save the day.

This book is Reilly's first Shane Scofield novel since Scarecrow (2003), almost eight years earlier, and naturally I was looking forward to it. Nonetheless, some aspects of the premise initially worried me.1 For one thing, it suggested that the globe-trotting and mystery-solving I had enjoyed in Scarecrow and the Jack West trilogy had been abandoned in a return to the more static adventures with which Reilly began his career, like Ice Station (1998) and Area 7 (2001). I'd enjoyed those books, but felt he'd since superseded them (Reilly himself has referred to them as the work of Reilly 1.0), and wasn't sure how much more juice he could extract from the older concept he'd already executed several times. Additionally, two decades after the Cold War's end the idea of a Soviet superweapon would seem to have passed its "sell-by" date – much as has long become the case with villains left over from the Third Reich. I also wasn't sure what to make of the "Army of Thieves" who comprised the villains, these seeming to be an especially senseless bunch in comparison with Reilly's previous bad guys, whose agendas, however horrific, at least had a recognizable rationale.

Fortunately, the book exceeded my expectations in all these areas. Like the books of "Reilly 1.0," Scarecrow Returns is a three-way collision between teams of special-forces soldiers at a high-tech facility in a remote, hostile landscape, but Reilly manages to keep the material fresh, and the plot and action unfold with a smoothness that reflects his now lengthy experience in telling this kind of tale. The battles are as readable as any Reilly has written (at least when read with the aid of the numerous illustrations), while being as grand in scale and over-the-top as readers have come to expect – which is to say, unequaled by any writer working similar territory today. Reilly's particular variant on the trope of the "left-over Soviet superweapon now on the loose" is a good one, and his villain is in line with his predecessors, at least when we get behind the mask. The novel also benefits from a number of new touches, ranging from a scene-stealing combat robot named Bertie, to a French vendetta against our hero – and a few memorable plot twists (which I won't spoil here). Additionally, cartoonish as Reilly's characters are, they are nonetheless a bit fuller and more nuanced here, and their personalities do have a bearing on the tale.

That is not to say that everything is perfect. Readers demanding meticulous treatment of technical detail will be irritated by such things as Reilly's depiction of a KH-12 satellite as a signals intelligence platform (its function is in fact optical imaging), and his repeated reference to an SS-23 as an intermediate-range ballistic missile (when its 500-kilometer range actually makes it a short-range ballistic missile) – details that could easily have been corrected without requiring the slightest changes to the story. There is an incident in one of the battles (in the "Stadium") where the editing appeared to falter. (It seemed to me that Reilly wrote "trench" when he should have written "walkway" – though I'm less than a hundred percent certain of this, as those of you familiar with his action sequences can understand.) Such nit-picks aside, Reilly's use of his over-the-top plot to explore very real geopolitical issues struck me as less clever this time around, the rationale behind the action comparatively muddled, especially when set against the almost psychic perceptiveness of the villain. (The fact that the weapon's activation frankly seems unlikely to leave any "winners" on the planet is only one of the reasons for this.)

Still, on the whole it's a satisfying read if you're up for this kind of adventure, and fans of previous books are likely to find it well worth their time. However, given the extent to which events in the previous novels bear on the story in Scarecrow Returns, readers new to the series might want to check out the previous installments (Ice Station, Area 7, and Scarecrow) first.

1. I use the original publication dates here, rather than the dates of their release in North America, my edition excepted.

Thursday, January 12, 2012

Lex Gigeroff, 1962-2011

As many of you already know, Lex Gigeroff, best known as one of the three principal writers on Lexx (as well as a sometime actor on that show, in such memorable roles as Barnabas K. Huffington), died last month, on December 24, at the age of forty-nine.

I had only one exchange with Mr. Gigeroff, when I interviewed him for my retrospective on that series back in 2007, but I found him friendly, witty and helpful, as he seems generally to have been to those who knew him, and there is no question that he will be missed.

As might be expected, social media is one avenue through which these sentiments are being expressed: there is now a Lex Gigeroff Memorial page on Facebook (the page, and the memorial event it mentions, came to my attention only after the event had passed), as well as a tribute video on YouTube, which fans and well-wishers can check out.

Wednesday, January 11, 2012

Thoughts on W. Somerset Maugham's Ashenden

W. Somerset Maugham is perhaps best known for his books Of Human Bondage (1915) and The Razor's Edge (1944), but students of the spy novel know him by another book, Ashenden: Or The British Agent (1928), a collection of loosely connected stories about the adventures of the titular figure in the service of British intelligence during World War I.

In the course of the narration Maugham touched with surprising frankness on many of the difficulties of turning the spy story into entertainment, as when he observed that
Being no more than a tiny rivet in a vast and complicated machine, [Ashenden] never had the advantage of seeing a completed action. He was concerned with the beginning or the end of it, perhaps, or with some incident in the middle, but what his own doings led to he had seldom a chance of discovering. It was as unsatisfactory as those modern novels that give you a number of unrelated episodes and expect you by piecing them together to construct in your mind a connected narrative (Maugham, 7).1
Rather, making "a picture out of the various pieces of the jigsaw puzzle" was the prerogative of "the great chiefs of the secret service in their London offices" (Maugham, 101).

Maugham offered another thought of the kind in regard to Ashenden's routine as a case officer when he observed that:
Ashenden's official existence was as orderly and monotonous as a City clerk's. He saw his spies at stated intervals and paid them their wages; when he could get hold of a new one he engaged him, gave him his instructions . . . he waited for the information that came through and dispatched it; he kept his eyes and ears open; and he wrote long reports which he was convinced no one read till having inadvertently slipped a jest into one of them he received a sharp reproof for his levity. The work he was doing . . . could not be called anything but monotonous (Maugham, 101).
As Maugham's remark demonstrates, the "tiny rivet" problem has been a big one for writers across the whole history of the spy genre (though it has loomed increasingly large as time has gone on).

Some authors have responded to this reality by contriving ways to put their protagonists at the center of events, so that not only do they get to see a "completed action," but the action can be thought of as in large part their own – as writers as diverse as John Buchan and John le Carré, Eric Ambler and Ian Fleming, are known to do. (The approach tends to have the author writing in unlikely coincidences or exceptional organizational circumstances; forcing their heroes to become independent operators, whether as hapless outsiders or insiders forced to go rogue; or simply ignoring bureaucratic realities.) More recently, other authors have deemphasized the rivets and instead concentrated on offering a broad picture of that "vast and complicated machine" – as Frederick Forsyth and Tom Clancy have done. (These describe the machine's operations at length, and rather than focusing their narrative on one character, or a few characters, use a large number of viewpoint characters to show a great many aspects of the machine's functioning, so that the vast plot is really the heart of the story, and the national security state the real protagonist.) And on the whole, writers of spy fiction have been far more prone to present their spies acting like detectives investigating a crime or carrying on a manhunt, heist men planning a black bag job, special operations soldiers, or fugitives on the run, than actual case officers in the business of handling agents.

Maugham, however, works within exactly the framework he describes. Unsurprisingly, he eschews the thriller conventions writers like E. Phillips Oppenheim, John Buchan and H.C. "Sapper" McNeile had already popularized. Instead, the intelligence work tends to be a backdrop to other dramas, as with Ashenden's adventure in pre-Revolutionary Russia (in which we see almost nothing of what he is actually doing as a British agent). Reading these stories I was often reminded of Maugham's irony and humor (at its best in the episodes involving the "Hairless Mexican" and Russia, with "Love and Russian Literature" especially funny if you have enough familiarity with the context to get the joke). The result is, unsurprisingly, a book that holds up rather better than many contemporaneous classics of the spy genre.

1. The edition I have cited is the following: Maugham, W. Somerset, Ashenden: Or, The British Agent (Mattituck, NY: The American Reprint Company, 199?), pp. 304.

Wednesday, November 30, 2011

Review: The Centauri Device, by M. John Harrison

New York: Doubleday, 1974, pp. 185.

In M. John Harrison's The Centauri Device, a chaotic twenty-first century saw the Arab-Israeli conflict draw in virtually the whole planet, with a "United Arab Socialist Republic" (UASR) absorbing the Soviet and Chinese blocs, and virtually all of Africa and Asia with them, while an "Israeli World Government" (IWG) assumed control of the Western hemisphere, Western Europe and the Mediterranean littoral, each of these complete with its associated Cold War ideology. These blocs (which by this point are so cosmopolitan that they have ceased to be meaningfully "Arab" or "Israeli" in any sense) promptly extend their conflict into space, all but dividing the galaxy between them.

During that expansion, humanity exterminated the inhabitants of the planet Centauri VII. Following this event a human archaeologist came upon a device the Centauris had been developing before they were wiped out. Its essential function is unknown – some are convinced it is a weapon, others the voice of God, still others a machine capable of forcing people to see reality as it is – but no one doubts its power and significance, which makes it an object of struggle between several parties, including Earth's two governments.

However, it is not enough for them to possess the device itself, as the one thing that is really known about it is that it can only be operated by an actual Centauri. As it happens, space captain John Truck may be the last Centauri (his mother, unbeknownst to him, having been a Centauri survivor of the genocide). This makes Truck, too, an object of pursuit for all the interested parties, setting the central chase in motion.

As might be expected from this description, Device packs all the essential tropes of the conventional space opera – a planet-hopping journey, starship battles, humanoid alien races, mysterious extraterrestrial artifacts, and galaxy-in-the-balance struggles – into what is quite a fast-paced, action-packed story. (Indeed, reading Device I was reminded just how much zip novels often had back when it was possible to publish a "mere" two-hundred-pager.)

Yet, the book is also remembered as having "ended" the space opera, upsetting all traditional expectations in its use of just about all the genre's conventions. Here humanity is not united inside a scientific world-state, venturing out to the stars in a grand enterprise of exploration (or, for that matter, in the pageantry of a far-future galactic feudalism). Rather, it is divided against itself in armed-to-the-teeth political factions devoted to ethnic labels and political ideologies that have long since lost all meaning, and far from transcending them in the flight out to space, drags the rest of the galaxy into that stupid, pointless conflict. The species is not the hero, or the victim, in its encounter with extraterrestrials, but the aggressor, the alien invader preying on the rest of the galaxy; far from being the solution to humanity's problems, ideology, cold war and space colonization have done nothing to improve the lot of ordinary people; instead of a shiny, high-tech future, a sense of decrepitude, gritty in the best sense of that horribly overused and abused term, is prevalent throughout the novel; and our cast consists not of scientific geniuses and brave explorers and soldiers, but scheming functionaries and hapless losers knocked around by life. (Indeed, the word "spacer" is virtually a synonym for "loser" in this book.)

John Truck is no exception, neither a square-jawed hero, nor a colorful rogue who can play the hero in a pinch, but just another lowlife who implausibly came into possession of a starship. In the course of the novel he does little more than run away from people he dislikes, "his morals those of a cretin or a small animal" and such courage as he displays "only the courage of desperation" (184), all the way down to the ambiguous, but calamitous finale. Indeed, by the end Truck emerges as an Everyman, the face of a degraded, apathetic, inarticulate, suffering – yet still, very human – humanity.

The Legion of Space, this definitely isn't.

Of course, this cannot appear as radical now as it did at the time of its first publication over thirty-five years ago, and there are all sorts of ways large and small in which the book shows its age as a nearly four-decade-old New Wave work (its rootedness in the politics of the time, its embrace of the ridiculous in making its not-at-all-ridiculous point, its touches of '70s decadence). Still, Harrison's zany conception of twenty-fourth century Earth and its domain – replete with memorable touches like starship-flying Pre-Raphaelite anarchists; searingly bleak landscapes; a vivid sense of what the galaxy (and indeed, history) feels like from the bottom up, which I can compare only with Howard Fast's portrayal of the Roman Empire in his classic Spartacus; and the creation of the anti-heroic Truck – is not just a major Moment in science fiction history, but a memorable read in itself, fully deserving of its status as a genre classic.

Saturday, November 19, 2011

On the Word "Lifestyle"

The great George Carlin, following his troubles with the Federal Communications Commission, offered his own official policy on inappropriate language, in which he satirized many contemporary usages (a bit which can be found in his Parental Advisory: Explicit Lyrics comedy album).

One of those which stand out in my mind is the word "lifestyle," about which he remarked
you will not hear me refer to anyone's lifestyle. If you want to know what a moronic word lifestyle is all you have to do is realize that in a technical sense, Attila the Hun had an active, outdoor lifestyle.
Would that we had been more attentive. Far from being put in its place, the word has spread like the fungus it most assuredly is, becoming a meaningless catch-all for any characterization of how anyone--or even any thing--lives. Watching Nature on PBS, for instance, I have seen scientists--apparently intelligent, well-educated individuals, whose first language seemed to be English--refer to birds as having a "lifestyle."

Unsurprisingly, "lifestyle" has made a list of "Words Faculty Say They Don't Want to Read Again, Ever" at the web site of Amherst College's Writing Center, with the qualification "unless you're talking about somebody from Hollywood." (By contrast, the writers of the page continue, it "won't do at all to talk about Plato's exciting lifestyle.")

To the end of clarifying what "lifestyle" is, let me be clear that I am speaking not of the term in the sense in which Adlerian psychology uses it, which is almost never referenced, but in the "marketing" sense (also listed in the OED), which has not only become synonymous with the term (as Carlin's example demonstrates), but apparently reshaped our understanding of that broader question of how people live.

That understanding would seem in deep need of correction, given that this usage is not to be regarded as identical with "way of life," which is the characteristic manner of living of a whole culture, something with which this term is often confused. Nor is it the same thing as a "standard of living," or a "quality of life," which are measures of how well or poorly people--groups or individuals--live, with the former a materialistic measure, the latter a more holistic one.

Unlike all these other descriptors, lifestyle can only be a matter of personal, individual choice--something not really operative when one unthinkingly abides by the way of life in which one has been brought up, in a context where other options generally do not exist. Even where such options do exist, one must have a certain minimum standard of living and quality of life before one can even begin to have a lifestyle--a certain minimum of affluence, leisure and personal freedom. How much? Enough, at least, to permit one's accumulated choices to constitute a "style," and implicitly, to allow one to live "in style," should one so choose. Enough to permit self-examination, self-discovery, self-expression and self-realization to rank high on one's list of concerns (rather than getting one's next meal, for instance).

Consequently, animals do not have lifestyles. Nor do members of traditional cultures--Medieval peasants, for instance. (Instead these may be said to have a "way of life.") Children do not have lifestyles. (If any individual is making decisions about how they live, it's their parents.) Nor do the poor. (When one does not even have sufficient food or adequate shelter, "style" of life or anything else is not high on one's list of concerns.)

I could go on (think of the above list of people who do not have lifestyles as a minimal guideline), but the thing to remember is that the idea of lifestyle belongs to the modern world, and moreover, is a relevant descriptor only for a comparatively privileged minority on the planet today (such as a "somebody from Hollywood" for whom the Amherst Writing Center's list allows an exception). Our thinking otherwise is a matter of glossing over unpleasant facts about inequality and the shallowness of consumer culture, and of treating mindless "optimism" as a default intellectual setting; a matter of our readiness to look at the homeless and see Jacobim Mugatu's Derelicte fashion line.

An illustration might help at this point. Consider the sitcom Two and a Half Men. Prior to the current season, the show's protagonist was Charlie Harper, a songwriter and musician who punches no clocks, lives in a Malibu beach house, drives upmarket foreign cars (a Mercedes at one point, a Jaguar at another), and conducts himself in the manner of a "swinging bachelor."

Charlie may be said to have a lifestyle.

His brother Alan, however, is an unsuccessful chiropractor who was ruined financially in first one divorce, and then another. He lives with Charlie because he cannot afford a place of his own, and is single not because he chooses to be, but because women find him unappealing.

Consequently, Alan cannot be said to have a lifestyle. The joke, in fact, is that he doesn't have much of a life.

I do not think the point can be made any clearer than this, and so while I would prefer never to hear the word used ever again, I'd settle for simply seeing and hearing it used with some discretion.

Tuesday, November 8, 2011

Watching Jeremiah

J. Michael Straczynski's series Jeremiah (2002-2004) is set fifteen years after a plague wiped out virtually all the adults on the planet--time in which the surviving children have grown up. One of these is the titular character, who originally wandered the United States looking for "Valhalla Sector," the place where his father may have found a refuge. Along with a recent acquaintance named Kurdy, he finds his way to Markus Alexander, another young man, now residing in the remains of the Air Force's Cheyenne Mountain headquarters, who is working toward the rebuilding of the old world.

The show is comparatively obscure now, its profile on the Internet low (as the brevity of the Wikipedia article discussing the series demonstrates). Loosely based on a Belgian comic book of the same name, it is a spin-off from a property all but unknown in the American market, and ran for just two seasons and thirty-five episodes on the cable channel Showtime before its cancellation. It has not been rerun much (the only instance I can think of is the Sci-Fi Channel's airing of the series in a four-hour Thursday night block in 2008, after which it broadcast a few daytime marathons), and all these years on Season Two has yet to have a proper release on DVD (though since 2010, Amazon has been manufacturing discs "on demand using DVD-R recordable media"). As a creation of J. Michael Straczynski, it has also been overshadowed by the huge success of Babylon 5, and perhaps also received less attention than it otherwise might have due to his departure from TV land for comic books and writing for the big screen (which has recently included work on the past summer's hit Thor, and the upcoming World War Z).

Still, underwhelming as the commercial response was, I found it a worthwhile show, enjoying it rather more, in fact, than the other postapocalyptic drama on television in recent years (CBS' Jericho). Granted, there were ways in which it felt like a repetition of Straczynski's earlier work. The show might be unkindly described as a scaled-down (and given its starker setting, stripped-down) Babylon 5, especially in its first season, with Thunder Mountain analogous to the titular space station, and the heroes' conflict with Valhalla Sector to the Shadow War. There were strong echoes of particular episodes, too, Jeremiah's "Man of Iron, Woman Under Glass" reminding me of B5's "A Late Delivery From Avalon."

The essential material, though, is solid stuff, and on the whole well-executed. Unlike J.J. Abrams' Lost, Jeremiah (like Babylon 5) did not simply string the viewer along, testing their patience past the breaking point, but rather told a story. Additionally, the series did boast a genuinely different cast of characters, and make use of its setting as more than an alternative dressing for the same concept. The show also moved into fresher territory in the second season, which tied up several of its plot threads while leaving enough possibilities for a third season, which I for one would have welcomed. However, Straczynski's well-known difficulties with MGM over the show's creative direction, and the turn away from the genre on television (Showtime abandoning its previously heavy investment in science fiction programming at about that time), resulted in the show's ending when it did--a casualty of the end of the "Golden Age of Science Fiction Television."

Sunday, November 6, 2011

Trillion Year Spree, Twenty-Five Years Later

Twenty-five years ago this month Brian Aldiss and David Wingrove published a massive update of Aldiss' Billion Year Spree (1973), appropriately named Trillion Year Spree: The History of Science Fiction (1986). On first seeing this book a decade ago I was struck with admiration for its scope and comprehensiveness in covering the history of science fiction - unmatched by any other study of the genre I was able to find. That remains the case today, so that I have probably turned to it more frequently than to any other single critical volume.

I was less impressed with some of the book's rather iconoclastic judgments. In fact, on the first reading it seemed to me that they gave the Golden Age authors less than their due (the chapter labeling several giants of the Golden Age "dinosaurs" seemed to me rather snide), while praising the New Wave authors excessively. (At the time, my limited and haphazard reading had left me knowing little of print science fiction's history but the Golden Age writers, while my experience of the New Wave was limited to a couple of Michael Moorcock's Jerry Cornelius stories, a bit of J.G. Ballard and Ursula Le Guin, and some Harlan Ellison, and out of that I'd only liked the Ellison.)

However, as I read more deeply and widely in the genre I also became more appreciative of their appraisals, both of the broad evolution of the genre, and of the achievements, failings and significance of particular editors and writers. Reading Asimov's The Robots of Dawn (1983), a work he published in the same years that Rudy Rucker riffed entertainingly on his Three Laws of Robotics in novels like Software (1982), it was hard for me not to feel that Asimov had become a dinosaur by the '80s. Meanwhile, reading much more Moorcock (especially the Pyat quartet, the Oswald Bastable novels, and the John Daker and Elric of Melniboné sword and sorcery series), and the works of Norman Spinrad, and Aldiss' own books, like the brilliant Non-Stop (1958) and Greybeard (1964), made me feel that, if anything, it is the New Wave authors who are underappreciated.

Still, while the book remains essential reading for anyone taking a serious interest in the genre's history, it has one significant limitation: its dating after twenty-five years, almost twice the length of time that elapsed between Aldiss' original publication of his study and his massive update of it in 1986. Certainly much of what it had to say about the state of science fiction in the 1980s holds true today. Yet these years have hardly been uneventful, particular works Aldiss and Wingrove cover in their book having gone on to reshape the genre. We have seen the booming of alternate history, steampunk and its cousins; the flourishing of cyberpunk and post-cyberpunk and the new space opera; an explosion of "science fiction about science fiction"; and the evolution of the genre in film and television.

Then again, there has been little effort to deal with science fiction's last three decades in a comprehensive way, one reason why I have made the attempt in After the New Wave: Science Fiction Since 1980 – what I hope will not be the last attempt to fill in this gap in the critical literature.

Tuesday, November 1, 2011

Review: Carte Blanche: The New James Bond Novel, by Jeffrey Deaver

New York: Simon & Schuster, 2011, pp. 414.

Jeffrey Deaver's new James Bond novel Carte Blanche makes an interesting contrast with the last Bond novel, Sebastian Faulks' Devil May Care (2008). Where Faulks wrote his book "as Ian Fleming," picking up the series right where Fleming left off in The Man With the Golden Gun, Deaver offers a complete reboot.

Deaver's James Bond is not a veteran of World War II fighting the Cold War, but a veteran of Afghanistan, who saw combat with front-line ground units despite his rank in the Naval Reserve. (How exactly this happened is not actually explained.) He is recruited by the Overseas Development Group, which we are given to understand is a revival of the World War II-era Special Operations Executive (though the ODG's missions in no way parallel those of the older organization), and we meet him early in that career. Where Faulks attempted to write in Fleming's mode, Deaver does not try, his generally clear, competent prose all his own.

Additionally, where Faulks kept his book relatively slender (in line with Fleming's books), Carte Blanche runs to four hundred rather packed pages. Its complicated structure also reflects the demands of such a length, the story involving not one, but two, separate (if intersecting) villainous plots, as well as a third storyline revolving around a related vendetta, and a mystery regarding the death of Bond's parents.

The story is also heavy on the kind of bureaucratic game-playing that comprises an increasing share of the spy thriller's content these days, even when the villains are the old-fashioned external ones. Bond was always clearly part of an organization, but the sense of this is rather more prominent now, not least because he spends much of the first third of this book in and around headquarters. The analytical side of intelligence work is also far more prominent than before. (Indeed, we learn that Bond has himself been an analyst, before recruitment into the ODG.) And it might be added that MI5 has never before given Bond such grief, or for such poor reasons.

To his credit, Deaver does as good a job of juggling all these storylines as one might hope, but I felt that the book suffered from a slow start, for the aforementioned reasons. Fortunately, the pace picks up when Bond finally arrives on foreign soil, pursuing his investigation in Dubai and South Africa.

The resulting adventure is packed with action and skullduggery. However, these are rather grounded in comparison not just with the films, but with the Fleming novels themselves, which had their share of over-the-top bits of action and plot: Bond's final confrontation with Mr. Big in Live and Let Die, or his racing against the clock to stop the Moonraker missile's launch, for instance. And when the intrigue gets cleared up, the villains, well-drawn and innovative as they may be, not only seem rather small-time compared with their earlier counterparts, but almost a parody of them – a demented garbage man in love with refuse the way Goldfinger loved gold (picture Shirley Bassey singing that theme song: "He loves only garbage/Only garbage"), a scheming NGO functionary, a lovelorn engineer insecure about his feet (yes, you read that correctly).

I suppose this reflects Deaver's earlier writing about serial killer-types in novels like The Bone Collector, rather than the extravagant madness of Fleming's villains. It also reflects the scaling-down of the military-spy game since World War II and the Cold War (as villains go, terrorists are no match for Large Peer Competitors, especially LPCs with a competitive ideology to sell), and of Britain's role in world affairs (2011 a long way away from the pretensions current circa 1953).

It reflects, too, the tilt away from escapist fantasy and toward "gritty realism" going even beyond what we have seen in the most recent Bond films. This time around we see Bond operating as part of a world of extraordinary renditions and state-sanctioned torture, of government plants of misinformation in the media and open, even gloating disdain for civil liberties, and I have to admit that I didn't care for it. Of course, I sometimes found Fleming's politics jarring too, preferring the films' tendency to play down the Cold War, but I also don't remember Fleming touching such hot-button issues in his books, or Fleming's Bond being so uncritical of the uglier parts of his business - his misgivings about his assignment in the short story "Quantum of Solace" a particularly striking example of this aspect of the character. (Frankly, where Bond's attitude toward his work is concerned, Deaver's version of the character appears underdeveloped and shallow next to the original.)

We also see this tendency in the predictable concessions to the New Puritanism regarding smoking, drinking and sex, and to feminism as well. Of course, this did not begin with Deaver by any means. In John Gardner's first effort with the series, Licence Renewed (1981), we learn that Bond has cut back on his alcohol intake, and switched to low-tar cigarettes. Apparently he is no longer carrying on liaisons with three married women at once, instead pursuing more socially acceptable (monogamous, long-term, conventionally romantic, etc.) relationships, and his early scene with Q section engineer Ann Reilly reads like a taunt of those holding traditional expectations about where such bits go.

Nonetheless, the concessions are rather larger now. Deaver's Bond still enjoys good food, and good drink, but is an ex-smoker now, who actually admonishes colleagues for their smoking and drinking while on the job. Back at headquarters Moneypenny (now identified to us as a Royal Navy lieutenant) "keeps [Bond] in his place," their relationship friendly rather than flirtatious, and the same might be said too of Mary Goodnight (not his secretary, but his personal assistant now, with much made of her competence). Bond does not act on his attraction to his engaged colleague Ophelia Maidenstone, even as he pines for her so much that he is actually thinking of her when he is with the only character who reads like a classic Bond girl (suggestive name and all), though even she turns out to be something quite different from what fans of the classics might imagine. Bheka Jordaan marks a new high in the ever-increasing prickliness of Bond heroines. And then, when the story comes to a close, Bond's association with the woman with whom he is dining is strictly platonic. Bond even goes out on lousy dates: at the start of the book, when the office calls, he feels some relief at being given an out from listening to a beautiful artist go on about how underappreciated she is – a bit which struck me as more appropriate to an independent film about (much) lesser mortals.

The term "postmodern conservatism" nicely sums up this blend of right-wing thinking with political correctness, but that is not the entirety of the change. While Deaver handles the travelogue and the luxury well, he simply can't bring back the old romance of travel. This 007, flying aboard Air Emirates, guesses that he is enjoying the "quality service that typified the golden age of air travel fifty or sixty years ago" (164). The old Bond wouldn't have had to guess, because we watched him actually living it, and that's part of what's lost in making Bond a child of Generation Y: he can no longer be romanticized as a representative of an earlier, more sophisticated generation which truly understood "lifestyle," a thing now attributed to monkeys, birds and dinosaurs, apparently. (And speaking of misused terms and expressions, I was dismayed to see Deaver employ the phrase "Back in the day" in the course of the narrative. Run into the ground by adolescents with limited vocabularies and absolutely no sense of the past, it has no place even in the thoughts of this book's hero.)

The result is that while Carte Blanche is in many respects a competent espionage thriller, it left me unsatisfied as a Bond novel. Indeed, Carte Blanche might be said to have done for the novels what Casino Royale did for the films – reimagined Bond for today's market while eliminating much of his personality in the process.

An updated James Bond, it seems to me, is not really James Bond at all.

For the full listing of the James Bond continuation novels (and the reviews of them available on this blog), click here.

Review: Devil May Care, by Sebastian Faulks, writing as Ian Fleming

New York: Doubleday, 2008, pp. 278.

In 2008 Sebastian Faulks, "writing as Ian Fleming," published Devil May Care, a James Bond novel which is novel in its picking up exactly where Ian Fleming left off. Taking place a year and a half after the events of The Man With the Golden Gun (1965), the book informs us that since his battle with Francisco Scaramanga, Bond has been manning a desk at "Universal Exports," part of a "phased return" to the service. Naturally, M judges him ready to go back to his old assignments early in the novel, charging him with investigating a Dr. Julius Gorner, a villainous industrialist who is believed to be involved in a Soviet scheme to flood Britain with drugs – and perhaps, something more than that.

The results of Faulks' effort are mixed. To his credit, Faulks gets the tone and "feel" of Fleming's writing right for the most part. He also derives a good deal of interest out of the retro context. There is some amusement in seeing the Bond of the novels plunged into Swinging London, which is making itself felt in the unlikeliest of places – a conversation with his housekeeper May in which she tells him about the drug charges against the Rolling Stones, the news from Moneypenny that M has taken up yoga. (Remember, Fleming's Bond was a creation of the '50s, not the '60s.)

Faulks' choice of contemporaneous Iran for the principal scene of the action added to its interest. Not only does it take Bond to a part of the world we haven't seen him in before (with few exceptions, Fleming stuck with the United States, the Caribbean and West European countries as his settings), but it helps reinforce the feel of another era, given how much the depiction of the country differs from what is routine in today's thrillers. It is hardly nuanced or deep, and true to the book's "retro" approach more than a few of the remarks the characters make will strike attentive readers as ignorant and bigoted – but the prejudices are different, and the whole may come as something of a shock to those who imagine Iranian history to have begun in 1979, and the Middle East to have never been anything but a cauldron of fundamentalist insanity and homicidal prudery. Call it the more complex "Orientalism" of an earlier period, when these parts of the world were imagined as colorful, extravagant and sensual, as well as decadent and backward – an object of fantasy as well as nightmare, and a much more suitable scene for James Bondian adventure than it might seem today (the episode in the Paradise Club striking in this respect).

The selection of this setting has yet another advantage, namely the Persian Gulf's having been one of the last scenes where Britain played an independent military role (these were the last years before the end of Britain's commitments "east of Suez"), enabling Faulks to center the story on British action without stretching plausibility too far. To his credit, he proves reasonably adroit in developing his plot to this end.

Still, the weaknesses are not minor ones. While Faulks captures the feel of Fleming's writing, he does not capture the spark it had at its best, and as a thriller the book is a letdown. The inclusion of an assassination attempt on Bond before he even leaves London struck me as a clumsy attempt to correct for the slow start typical of Fleming's novels. Faulks also fails to get full use out of the promising bits of atompunk he introduces, which include an ekranoplan. (Indeed, his prose tends to falter when dealing with the action and techno-thriller bits, a telling instance being his description of a Soviet Mi-8 helicopter as "classic." That helicopter can be regarded as classic now, but no one would have described it that way in the 1960s, when it was new gear.) The same goes for his dispatch of Bond behind the Iron Curtain for the first time in the history of the series (the unseen events between You Only Live Twice and The Man with the Golden Gun apart), which should have been a highlight, but ends up feeling perfunctory, merely something to get Bond from point A to point B.

At the same time, if the idea of sending Bond back to the 1960s was to free him from the constraints of political correctness in the manner of Mad Men, the novel doesn't quite work on that score either, Bond's hedonistic flair being clearly lacking. (Early in the tale, Bond actually turns down an attractive woman's offer to go up to her room.) The appearance of a female double-O in the story – something I have a hard time picturing Fleming's M (or Bond, for that matter) accepting – looks like a concession to twenty-first century attitudes, much more in line with the later screen Bond girls than anything Fleming wrote.

Where Faulks hews closer to Fleming's precedent, he tends to seem plainly derivative, certainly in his creation of Gorner, who clearly owes much – too much – to Moonraker's Hugo Drax: another physically deformed continental who came away from his schooling in England feeling humiliated and hateful, who fought for the Nazis in World War II and, having become a self-made tycoon, now works with the Soviets to pursue a personal vendetta against Britain, culminating in a high-tech blow against the country using means we have certainly seen before in Fleming's books. The repetition yields diminishing returns, and it does not help that this villain's particular scheme is overly complex, diffusing the action and tension – a poor contrast with the admirable compactness of the Fleming novels.

The result is a book I found consistently readable, with quite a few bits that were better than that, but one that in the end left a lot to be desired as a continuation of Fleming's series.

For the full listing of the James Bond continuation novels (and the reviews of them available on this blog), click here.

Monday, October 31, 2011

A Primer on the Technological Singularity

In March 1993 mathematician Vernor Vinge famously presented a conference paper, "The Coming Technological Singularity: How to Survive in the Post-Human Era," in which he wrote dramatically of a "change comparable to the rise of human life on Earth," caused by "the imminent creation by technology of entities with greater than human intelligence," whether through the expansion of human intelligence via biotechnology or mind-machine interfaces, or the emergence of "strong" artificial intelligence.

This greater-than-human intelligence would not only exceed the human capacity for mental activity but, particularly to the extent that it is an AI, be capable of designing still-smarter AI, which in its turn could create AI smarter than that, and so on, in a chain of events far outstripping the original, human creators of the technology – while in the process exploding the quantity of intelligence on the planet.

This "intelligence explosion," theoreticians of the Singularity predict, will result in the acceleration of technological change past a point of singularity, analogous to the term's mathematical meaning – a point beyond which the curve of our accelerating technological progress explodes – resulting in
a throwing away of all the previous rules, perhaps in the blink of an eye, an exponential runaway beyond any hope of control. Developments that before were thought might only happen in "a million years" (if ever) will likely happen in the next century.
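To make the mathematical analogy concrete (the following illustration is mine, not Vinge's): the borrowed sense of "singularity" is that of a quantity which blows up in finite time, as happens whenever something grows faster than exponentially – for instance, when its growth rate scales with the square of its size:

\[
\frac{dx}{dt} = x^2, \qquad x(0) = x_0 > 0 \quad \Longrightarrow \quad x(t) = \frac{x_0}{1 - x_0 t},
\]

which diverges as t approaches the finite time t_s = 1/x_0. Ordinary exponential growth, by contrast, never hits such a wall; the Singularity thesis is, in effect, that recursively self-improving intelligence would push technological growth from the second regime into the first.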
The idea of an intelligence explosion was not new in 1993, having quite old roots in science fiction, as Vinge, a science fiction writer himself, acknowledged. (To give but one example, something of the kind happens in the course of the stories collected in Isaac Asimov's I, Robot, with "The Evitable Conflict" describing such a sequence of events quite presciently.) Additionally, as Vinge also acknowledged, he was preceded by Bletchley Park veteran Irving John Good, who is often credited with first giving the idea a rigorous nonfictional treatment in his 1965 paper, "Speculations Concerning the First Ultraintelligent Machine."

However, Vinge's particular presentation has been highly influential, in part, perhaps, because of its timing, converging as it did with similarly spectacular predictions regarding progress in computing, robotics, genetic engineering and nanotechnology (Eric Drexler's Engines of Creation came out in 1986, Hans Moravec's Mind Children in 1988, Ray Kurzweil's Age of Intelligent Machines in 1990) in what some term the "molecular" or GNR (Genetics, Nanotechnology and Robotics) revolution (which many now take to be synonymous with the Singularity concept). That intellectual ferment would shortly be turbo-charged by the expectations the "tech boom" of the late '90s aroused, and perhaps the pre-millennial buzz in evidence during the approach to the year 2000 as well.

In the years since then, the idea has not only become the predominant theme in "hard" (technologically and futuristically-oriented) science fiction, but made increasing inroads into mainstream discussion of a wide range of topics – for instance, by way of Peter Singer's highly publicized book on robotics and warfare, 2009's Wired for War (reviewed here).

There is, of course, a range of outlooks regarding the implications of a Singularity. Some are negative, like Bill Joy's well-known essay "Why the Future Doesn't Need Us" – or, for that matter, the darker possibilities Vinge has touched on, like technological accidents or a turn to hostility toward human beings on the part of those intelligences. However, there are also "Singularitarians" who believe not only that the Singularity is possible, but commit themselves to working to bring about a benign intelligence explosion. This they expect (especially in combination with advances in biotechnology, nanotechnology and robotics) to bestow on humanity a whole array of epoch-changing, species-redefining goods: unprecedented prosperity, as vast artificial brainpower explodes our economic productivity while vaulting over the inevitable resource scarcities and cleaning up our ecological messes; and even a transition to the posthuman, complete with the conquest of death (a theme which also occupies much of Kurzweil's writing).

The Case For The Singularity's Plausibility
As stated before, the argument for the possibility or likelihood of the Singularity is largely founded on expectations of continuing geometric growth in computing power. Singularitarians like Kurzweil commonly extrapolate from the well-known "Moore's Law," which holds that the density of components on chips – and with it, computing performance – doubles every two years (or less). At the same time, they reject a mystical (and certainly a dualistic, body-mind) view of cognition, viewing the basic stuff of the brain as "hardware," the performance of which computers could plausibly overtake.

This raises the question of what the hardware can do. Kurzweil estimates the human brain's performance as equivalent to 10 petaflops (10 quadrillion calculations a second). As it happens, the IBM-built Roadrunner supercomputer passed the 1 petaflop milestone back in 2008, while the fastest computer in the world today, the Fujitsu-built "K" computer, is capable of 8.2 petaflops at present, and is expected to attain the 10 petaflop mark when it becomes fully operational in November 2012. Nor does this mark the outer limit of present plans, with an exaflop-capable supercomputer (a hundred times as fast) popularly projected to appear by 2019.
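To illustrate the style of extrapolation at work here – a back-of-the-envelope sketch of my own, using the figures just cited and assuming a clean doubling every two years, which real hardware is under no obligation to honor:

# A back-of-the-envelope Moore's Law extrapolation (illustrative only).
# The 2008 one-petaflop baseline and the 10-petaflop brain estimate come
# from the discussion above; the clean two-year doubling is an assumption.
import math

BASE_YEAR = 2008
BASE_PETAFLOPS = 1.0      # first petaflop supercomputer, per the text
DOUBLING_PERIOD = 2.0     # years per doubling (assumed)

def projected_petaflops(year):
    """Projected top supercomputer speed (in petaflops) for a given year."""
    return BASE_PETAFLOPS * 2 ** ((year - BASE_YEAR) / DOUBLING_PERIOD)

def year_reaching(target_petaflops):
    """Year in which the projection first reaches a target speed."""
    return BASE_YEAR + DOUBLING_PERIOD * math.log2(target_petaflops / BASE_PETAFLOPS)

print(round(year_reaching(10)))     # Kurzweil's brain estimate: ~2015
print(round(year_reaching(1000)))   # an exaflop machine: ~2028

On this naive schedule, brain-equivalent hardware arrives around 2015 and an exaflop machine around 2028 – which makes the 2019 exaflop projection a bet that progress will outpace even a steady Moore's Law doubling.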

Of course, it can be argued that merely increasing the capacity of computers will not necessarily deliver the strong AI on which the Singularity is premised. As Dr. Todd Hylton, director of the Defense Advanced Research Projects Agency's SYNAPSE (Systems of Neuromorphic Adaptive Plastic Scalable Electronics) program, puts it,
Today's programmable machines are limited not only by their computational capacity, but also by an architecture requiring human-derived algorithms to both describe and process information from their environment.
Accordingly, some propose the "reverse-engineering" of the human brain, the object of several high-profile research programs around the world, of which SYNAPSE is only one – some of them expected by their proponents to achieve results in the fairly near term. Henry Markram, director of the IBM-supported "Blue Brain" project, claims that an artificial brain may be a mere ten years away.

Those arguing for the possibility of human-level or greater artificial intelligence can also point to the improving performance of computers and robots at everything from playing chess to maneuvering cars along obstacle courses, as in the Defense Advanced Research Projects Agency's 2007 Urban Challenge, as well as to the proliferation of technologies like the irritating and clunky voice-recognition software that customer service lines now routinely inflict on the purchasers of their companies' products.

The Case Against The Singularity's Plausibility
Writers like Kurzweil and Moravec can make the argument for the Singularity's likelihood in the first half of the twenty-first century seem overwhelming, but the claim has plenty of knowledgeable critics, including Gordon Moore himself, of Moore's Law fame (ironically, given how often his prediction regarding the density of components on semiconductor chips is cited on the idea's behalf).

Some might respond by citing Arthur C. Clarke's "First Law" of prediction, which holds that "When a distinguished but elderly scientist states that something is possible, he is almost certainly right," whereas "when he states that something is impossible, he is very probably wrong." Moore's objection does appear more intuitive than anything else. But besides the epistemological, ontological and other philosophical problems involved in defining consciousness and verifying its existence – raised by John Searle's "Chinese Room" argument, among many others (we don't know what consciousness is; and even if we did create it, how could we be sure we had?) – skeptics have offered several cogent criticisms of Singularitarian assumptions, which generally fall into one of two categories.

The first category of argument posits that it may simply be impossible to develop really human-like artificial intelligence. Physicist Roger Penrose's 1989 The Emperor's New Mind – in which he argues that algorithmic computation is inadequate to account for human consciousness, and that the prospects for human-equivalent machine-based intelligence are accordingly dim – is perhaps the best-known argument of this kind.

The second is the position that, while artificial intelligence of this kind may be theoretically feasible, we are unlikely to realize the possibility in the foreseeable future. This could be because the growth in computing power slows down sharply before the point of the Singularity (given our likely reliance on as-yet-undeveloped computer architectures to continue Moore's Law past – or even through – this decade), or because the "software" of intelligence proves elusive, a subtler problem than building a faster computer. (Certainly those whose expertise is in biological systems rather than computer hardware and software tend to be more skeptical about the possibility, pointing to the complexity of human neurology, as well as how little we actually understand of the workings of the brain.)

Less directly relevant, but telling nonetheless, is the fact that, far from accelerating, our technological development may actually have slowed as the twentieth century progressed, as Theodore Modis and Jonathan Huebner have each argued on theoretical and empirical grounds.

Indeed, there has been a tendency on the part of "Singularitarian" prognosticators to push the dates of their predictions farther and farther into the future as the years go by without the Big Moment happening. I.J. Good declared it "more probable than not that, within the twentieth century, an ultraintelligent machine will be built" (a guess which influenced Stanley Kubrick's 2001). Of course, the century ended and the titular date came and went without anything like the expected result. Later expectations reflect this, Vinge guessing in the early 1990s that the Singularity would occur in the generation-length period between 2005 and 2030, while Kurzweil suggested 2045 as the big date in The Singularity is Near.

Notably, Kurzweil made more modest but also more specific predictions about what would happen prior to the event, including a list of developments, offered in his 1999 book The Age of Spiritual Machines, for 2009. A good many observers revisited his claims when the time came (myself included). Kurzweil himself discussed the results in a 2008 interview with the newspaper Ha'aretz, and while he was sanguine about how his predictions had fared, others disagreed, and the same newspaper mentioned him in its New Year's Eve list of recent prediction bloopers. (Indeed, Kurzweil has since conceded some points, bumping what he predicted for 2009 regarding virtual reality and automobiles over to 2020 if not later, in an article he published in the New York Daily News in December of that year.)

It may be noteworthy, too, that the production of original ideas about the Singularity seems to have fallen off. Surveys of the field, like the aforementioned book by Peter Singer, increasingly go over the same list of authors and works (Vinge, Moravec, Kurzweil, Joy, et al.), most of them a decade old at this point, rather than the rush of new ideas one might expect were we racing toward a post-Singularity reality. (Indeed, I have found the vigor of the critics more striking of late.) Even science fiction has taken less interest, the genre actually seeing something of a backlash against the concept.

In short, there is abundant reason to think not just that the claims made for the Singularity are open to question, but that the kinds of technological change on which they are founded are actually coming along rather more slowly than many of the idea's advocates allow. Yet, even if the last decade has tended to validate the skeptics, that is a far different thing from saying it has discredited the idea – especially given how much can still change in the time frame Kurzweil has suggested for the Big Event (to which he has stuck, even after moderating his smaller predictions). Moreover, we can at the very least continue to expect faster computers, more sophisticated software and more versatile robots – and, for as long as human beings continue to dream of transcendence, we can expect that some will continue to envision technology as the means by which we attain it.
