Wednesday, March 7, 2012

The Cancellation of Terra Nova

Terra Nova has now traced most of the trajectory of big-budget science fiction shows on the major American television networks (especially FOX), from a launch with great fanfare and unrealistic expectations – to the ignominious end when the Suits lose their nerve and cancel it. Now we are in the post-cancellation phase where there is still talk of the show being continued on another channel, something that very rarely happens.

I don't get the sense that many will miss Terra Nova the way they did Dark Angel, Firefly or Terminator: The Sarah Connor Chronicles, though. Yes, the show had a promising concept and appealing visuals, for which it won a great deal of praise, but many also complained of lackluster characters and the overall quality of the writing. The complaints seemed a bit exaggerated at times (as other shows, not really much better, attracted far less complaint of the kind), but they were certainly not baseless, and are easier to understand when one considers those who complained most loudly – generally those who follow science fiction television most closely.

Despite the show's futuristic setting, its reliance on classic, "hard" science fiction tropes (like travel to parallel universes) and its flashy visual effects, the makers of Terra Nova gave short shrift to the tastes of "hardcore" science fiction fans. The show was actually rather light on world-building, extrapolation and general conceptual development compared with the television space operas with which its premise invites comparison. It made no effort to appeal to the considerable portion of this fan base which is intertextually inclined, the way that, for instance, Heroes and Sanctuary, two of the more significant hits of recent years, did. Even worse from their perspective was the fact that it was a family show, quite far removed from the dark, edgy treatment of its themes many of those who'd looked forward to the show expected, given its dystopic image of the twenty-second century, and the ways in which that milieu had presumably marked its characters.

Unsurprisingly, "bland" was a word I frequently encountered in discussions of the show in the blogosphere, and I suspect that the complaints were all the more pointed because, Terra Nova having been intended as lighter, family-friendly fare, the weaknesses of the writing were not obfuscated in the ways to which we have become accustomed – the cheap button-pushing that looks like intellectual, political or dramatic daring to superficial viewers; the obnoxiousness for obnoxiousness's sake on the part of the dramatis personae so often mistaken for a "courageous" willingness to present unlikable characters (while their verbal abuse of each other is praised as sharp and witty dialogue); the head games that can make a show's lapses in logic or coherence instead appear to trusting viewers like part of some intriguing mystery that will be satisfactorily solved later; the soap opera-like subplots which distract the audience from a story going nowhere by fixing their attention on such questions as who is sleeping with whom (or trying to); and the fan service that makes watchers more forgiving of the flaws that do come to their attention. (The remake of Battlestar Galactica offered plenty of all these – from its exploitation of current issues like the War on Terror and the culture war, to its muddled religiosity, to its sexuality – with the result that many a fan was shocked, shocked, when the finale proved that the show's story had never been anything but a pile of nonsense. Lost, too, had plenty of head games and soap opera, and likewise let many a viewer down when it was all over.)

In all this, Terra Nova was comparable in its feel and its fate to the genre shows the Big Four made in the mid-'90s before virtually abandoning this territory to syndication and cable - like Earth 2 and SeaQuest (which also had an especially large budget, and the Steven Spielberg name attached) - and a reminder of the dilemma facing producers of genre shows on the major networks. The hardcore fans looking for innovative science fiction are not numerous enough to warrant producers' special attention, forcing them to look beyond that base to a more general audience (like the families with dinosaur-loving kids the network was clearly shooting for with Nova).1 All too typically they end up pleasing neither group, alienating the genre fans without really winning over the more casual viewers (small-screen spectacle can do only so much from week to week, as Entertainment Weekly's Darren Franich points out), and so – launch with fanfare, disappointment, cancellation, a cycle that was a bit quicker and more intense in the case of Nova, I suppose, because the execs keep their creatives on shorter leashes, and because after the booms of the 1990s and 2000s, genre fans have become more demanding, further distancing them from those larger audiences the programmers hope to capture.2

Still, for all its faults I did think Terra Nova was improving as it went along, the way many science fiction series do on their way to being good or even great - including Star Trek: The Next Generation, which made so much else possible. I never expected anything quite so consequential for genre TV from this show, even if it did get the chance that I was fairly sure FOX wouldn't give it – but I would certainly give a second season of the show a look if it were to materialize.

1. I calculated this at 10-15 million for the United States in my article "The Golden Age of Science Fiction Television: Looking Back at the 1990s."

Monday, March 5, 2012

The Irrelevance of Oscar Night?

I remember neoconservative Robert Kaplan lamenting in his essay "The Dangers of Peace" that the "Academy Awards ceremony has achieved a status akin to a national holiday." Now the story many an entertainment journalist tells is that the affair seems decreasingly relevant to our pop cultural life – and, some seem to fear, already irrelevant. There is, of course, some of the Hollywood press's usual exaggeration of such matters, as David Poland demonstrates over at Movie City News, but there is at least some basis for these concerns. The show's viewership has been declining for years, with the young appearing especially uninterested – the attitude paralleling what seems to be their increasing (for Hollywood, worrying) disinclination to go to the movies. (Indeed, the producers were so concerned with their loss of interest that last year they took some pains to win them back, in part by having Anne Hathaway and James Franco host the ceremony – a move which proved to be a complete failure at elevating this demographic's viewership.)

However, it seems to me less surprising that viewers are losing interest in the ceremony than that they ever took much interest in the first place. I myself have only watched the ceremony beginning to end once – way back in 1995. I haven't even tried to do that since, even through the years in which I vaguely hoped to (someday) pen a screenplay that would be deemed worthy of an Oscar nomination, and read Variety every week as part of my study of the craft – and now that I think about it, didn't see anything at all odd about this.

I suppose I didn't find it all that interesting a show. After all, the thing drags at four hours, and for all the energy and humor even a Billy Crystal brings to the stage, tends to be a rather staid affair. It's also awfully predictable. The press buzzes for weeks in advance about the front-runner, though anyone who's followed the business for any length of time hardly needs its speculations to hazard a good guess about how it will play out; there's a tedious predictability about the whole process of distributing the awards. Certain themes, certain characters, are simply much more likely to receive the Academy's accolades. (I think of the scene from Bowfinger in which Kit rants about how he needs to play a mentally handicapped slave, or the "Tearjerker" episode of American Dad.) Best Supporting Actor/Actress Oscars, while normally given to worthy actors, also function as bones thrown to performances and movies of sorts not typically recognized, because they are too funny, or too popular, or worst of all, too "fan boy" – the last type of movie particularly likely to win big in the technical categories, but be slighted when the rewards for writing, acting and directing are distributed. And so on and so forth.

So why not just read about the winners in the paper (or on the web) the next morning? I'd think. But people watched anyway. I suppose it was the romance of the cinema, the "cult" of the movie star, and the sense that Hollywood was the very center of the entertainment world, with the Academy Awards ceremony the biggest night of the year for all three. Alas, the mystique of all these has suffered badly in recent years.

The Romance of the Cinema
Some of the cinema's loss of its magic would seem to be due to the change in the nature of the moviegoing experience. Of course, a considerable element of nostalgia is an undeniable part of such arguments. Yet, there seems something ritual, dream-like, even quasi-religious, about people gathering in a dark chamber to be entranced by a flickering image the size of a wall. That was diminished as movie palaces gave way to multiplexes, and seeing a movie became an occasional detail in the course of a day at the mall rather than a destination in itself, with the viewing marred by a pre-movie battering with ads, and then during the movie itself, the cell phone conversations of gadget-addicted oafs. It is diminished further still when instead of a theater screen you see a movie on a TV in your well-lit living room with all the distractions of a household about you and a finger on the pause button, or worse still, the even smaller screen of a laptop or handheld device you watch on a crowded bus or train during your daily commute to work.

These newer ways to watch a film are today far more characteristic of how we go about that activity. In the 1930s and 1940s a clear majority of the population of the U.S. (65 percent) was at the movies each week, which worked out to the average American going to the movies more than thirty times a year.1 By contrast, the average American went four to five times a year in the 1980s, 1990s and 2000s, to go by per-capita ticket sales. Additionally, the time they spend in the theater on those rarer occasions has been reduced by the changed practices of exhibitors (like the abandonment of such parts of the experience as pre-movie cartoons, shorts and newsreels, and double-features). The result is that they spend only a small fraction of the time they used to looking at movie screens, perhaps ten hours a year, which is also much less time than they now spend in front of a television or computer screen in a typical week, or even day (among other things, watching the vast majority of the movies they take in).

It might be added that the places where we watch our movies, and the technologies we watch them with, are not the only ways in which the experience has changed. We are far less likely to experience a film as a shared cultural moment, the movie that "everyone" sees increasingly a rare thing now – the distance between highbrow and lowbrow, between old and young, and persons split by innumerable other cultural and demographic divides ever greater. And of course, those diminished moments seem ever more fleeting, as movies come and go ever more quickly. Even an "event" film is likely to be "#1" at the box office for one week, or two, before attention has shifted to the next not-so-big thing.

And film as a whole has given ground to other media, the differences with which have been blurred by more than the smallness of the screens on which we watch them. Those who said in the 1950s that television spelled the end of cinema spoke prematurely, but they were right in identifying it as a very formidable, perhaps in the long run unbeatable, competitor, one that has long been closing the gap with movie production in technical accomplishment and thematic ambition. In the process it has stolen an increasing portion of film's thunder artistically and commercially, a point evident even in video sales (DVD making television shows a force here such as they never were in the day of VHS). At the same time, one might wonder if moments which saw the medium of film revolutionized, with new ground broken, or something presented to us that we really didn't see before, have not grown scarcer – and owed much more to the genius of the FX artists who have topped themselves year after year in films like Jurassic Park, the Lord of the Rings trilogy, and Avatar than to anything else. Where other aspects of cinematic storytelling are concerned, the medium seems increasingly backward-looking and self-referential, with its remakes and reboots and homages and other kinds of recycling, its movies about movies and the people who make them, when film doesn't seem to be turning into something entirely different (as Jonathan McCalmont suggested in his review of Transformers 3, subtitled "The Ambivalence of the Metallic Sublime").

Indeed, in an age in which iPods are treated like life support devices, iPads and laptops are always in hand, and the cell phones are never off, I would suggest that film is for many (whom one suspects increasingly find the devotion of their full attention to a two-hour narrative a strain) just one more source of sound and image – like YouTube videos of low IQ amateur stuntmen – on which to draw for the highly personalized multimedia blitz in which they swaddle themselves during their waking hours.

The Cult of the Movie Star
One might add to the above the familiar lament that the stars of today fall short of their predecessors in the day of the Dream Factory. Once again, there tends to be a heavy element of nostalgia in such talk, but there nonetheless seems to be something to it.

Certainly stars in the old moulds, stars like the big box office draws of the studio era, seem at the least an improbability today. In their day stars were held up as exemplars of some contemporary, yet archetypal, ideal, in a way that seems to have been possible only in the hothouse of a more genteel screen-world with very particular attitudes toward such matters as social class and gender. The machismo of Clark Gable and Errol Flynn and Humphrey Bogart, the sophistication of Cary Grant and David Niven, the innocent sweetness of Audrey Hepburn and Marilyn Monroe, all existed in that context – and became impossible when it disappeared. The social signifiers out of which a distinctive, appealing screen image might be built up seem ever-shakier things, while the ideals that they would convey have become a far less certain thing. (Take, for instance, the idea of gentlemanly sophistication. As Jeremy Black observed in The Politics of James Bond, there has long since ceased to be "a secure basis" for its representation, while it has been increasingly unclear that this is a desirable thing for, for example, an action hero to display.) At the same time, the gritty, the crude, the scatological have become pervasive, and entail indignities incompatible with that larger-than-life, better-than-life quality stars are supposed to have (like being on the butt end of toilet humor).

Whatever the prospects for reconciliation between old-fashioned stardom and these changed cultural expectations, the classical idea of the star seems irreconcilable with the reality that today's biggest names tend to attain that status by appearing in the very productions (indeed, series of productions) most prone to subordinate, even overwhelm, plot, character – and star – with concept and spectacle. Consider, for instance, Shia LaBeouf as he shares the screen with Autobots and Decepticons, or the number of actors whose biggest movies have them playing an already established, iconic pop cultural figure likely bigger than they can ever hope to be – like Christian Bale in his Batman films. The result is that while it is often lamented that the age of "high concept" has meant fewer good roles for women, it may have meant fewer good roles for anyone in the sense of a role giving an actor a chance to really make their mark in, let alone dominate, a major film – as a star is supposed to do. Stardom seems irreconcilable, too, with the fact that, just as our taste in subject and style is more fragmented, it is tougher than ever to find actors with anything approaching a universal appeal, the Big Names we have instead reminding us of just how divided we are. (Think, for instance, of how differently men and women tend to respond to Sarah Jessica Parker and Megan Fox.)

The changing character of publicity doesn't help. The hypertrophied portion of the media living off of public interest in the doings of the rich and famous – E!, TMZ and the rest of the multimedia paparazzi machine – has overloaded us with celebrity gossip-themed infotainment. And much of this constant coverage presents stars and would-be stars as slobs and oafs and worse as we see them going into or coming out of rehab or cop cars or courthouses, or watch "experts" minutely scrutinize the signs of their plastic surgery and their weight fluctuations and their emotional outbursts. There is, too, the Great Leap Into the Sewer of the reality TV revolution, which has turned a whole category of skuzzy, scummy snots into "stars." Meanwhile "real" (e.g. movie) stars routinely get reality shows of their own – on their way down, the indignities of their fall no longer taking place out of sight, but instead served up to their fans in a desperate grasp at another fifteen minutes of fame. That abundance of unflattering material not only makes it much harder to support some carefully cultivated persona, but is a constant reminder that stardom is itself a show – and while this was just as much the case circa 1930, it seems we have grown too conscious of the fact for any amount of postmodern reveling in superficiality for superficiality's sake to fully repair the damage to this delicate illusion.

What remains of old Hollywood-style glamour is, far more than before, shared with the worlds of music and fashion, the allure of which has often eclipsed Tinseltown's luster. The '90s was the heyday of the supermodel, but something of the fascination with their industry endures in ways large and small. Much like the Oscars, the Victoria's Secret Fashion Show is an annual television event, broadcast on a major network in prime time. (At a mere one hour in length, offering lingerie models and that aura of luxurious hedonism the brand's promotion excels at conveying, it is rather a different approach to entertainment – which despite drawing just a fraction of the Oscars' viewership, was judged to have had very good ratings last year.) Meanwhile, Christy and Linda and Cindy and the rest of the faces of haute couture from the supermodel's glory days are still on magazine covers and television screens (in commercials selling anti-aging products they credit with keeping them forever young), while Carla offers another reminder of that era every time news cameras are pointed at the First Lady of the French Republic.

This, too, adds to the difficulties in the way of attaining the larger-than-life, better-than-life quality expected of the star, even with the biggest and most sophisticated of publicity machines squarely behind them. Unsurprisingly, they make it less and less often, with the result that the "A-list" is in decline, and bankability fading as a meaningful concept (as Matthew Belinkie suggests in his post "The End of Movie Stars?"). Indeed, it often seems to me that the function the casting of stars serves in the moviemaking process is less about capitalizing on their draw than about convincing the audience, through the inclusion of well-known (and therefore costly) performers, that a particular film is indeed a major production deserving of their time and money.

Thinking of all this I remember hearing the common refrain among British monarchists that Britain "needs" the royal family because it "doesn't have a Hollywood of its own." I've always found this an unbelievably stupid thing to say, for a multitude of reasons, above all the claim that celebrity worship is some deep-seated human need which can all by itself justify feudal political anachronisms. But it's also the case that even Hollywood isn't what it used to be.

The Centrality of Hollywood
These developments reinforce each other, the diminished status of film making movie stars seem smaller, and the weakening of stars' celebrity changing the way in which audiences respond to film. At the same time, even where film is concerned, Hollywood seems less and less the center of the entertainment universe.

Certainly where the U.S. box office is concerned, during the whole decade of the 2000s only one foreign-language film (excepting Mel Gibson's The Passion of the Christ and Apocalypto) grossed more than $100 million, only five more than $20 million, and only seventeen more than $10 million, according to Peter Knegt of Indiewire. This makes for an average of less than two a year hitting the $10 million mark, with the total from all these films typically accounting for well under 1 percent of the revenue from ticket sales. Moreover, even these movies tend to have been made by directors with some Hollywood experience (like Ang Lee with Crouching Tiger, Hidden Dragon, and Guillermo del Toro with Pan's Labyrinth), with actors known to American audiences from their appearances in Hollywood movies (like Chow Yun-Fat and Michelle Yeoh in Crouching Tiger) – the foreign films that make it even this far generally those that seem least foreign to Americans. And of course, where global moneymakers are concerned, the U.S. still remains far and away Number One, year in, year out, at home and abroad.

And yet, there are ways in which Hollywood is weaker than it appears. There are sizable markets where domestic productions have cut into Hollywood's share of the annual lists of top-earning movies in the past decade, like Japan and South Korea – a fact that has drawn surprisingly little commentary. It also tends to be forgotten that while Hollywood continues to have a near-monopoly on the megabudget blockbusters, a great deal of American production does not travel very well – like Will Ferrell's comedies. (The movie widely regarded as his biggest hit, Talladega Nights: The Ballad of Ricky Bobby, earned over 90 percent of its ticket sales domestically, and much the same has been the case with his other hits Anchorman: The Legend of Ron Burgundy, Blades of Glory and Step Brothers.)

Additionally, it seems that the traditionally insular American market, so much more accustomed to exporting rather than importing pop culture, has become more permeable. It is not only the case that British imports have long been a routine part of what Americans see on their screens, or that today American and Canadian production are increasingly difficult to distinguish from one another. Spanish-language networks like Univision and Telemundo beam product from across Latin America into a great many American homes, and MHz Networks brings far more diversity than that to cable viewers. The profile of the Hong Kong film industry that produced Bruce Lee, John Woo, Jackie Chan and Jet Li has fallen, but Japanese anime and the Korean Wave and Bollywood are strong presences – more cult than mainstream admittedly, but still a part of the landscape. Equally, the originals of the foreign movies Hollywood remakes so frequently are easier to access – typically on cable and video, so that a focus on ticket sales understates their impact, which is not wholly confined to snobbish intellectuals, hardcore fan boys or immigrant communities. This all makes Hollywood seem a smaller place, one that often comes off as self-important and even provincial in its attitudes – not least in the way the whole of production from outside the English-speaking world is typically segregated into a single Best Foreign Language Film category at the Oscars (a far cry from the cosmopolitanism of the Cannes Film Festival). The more sophisticated viewers, those most likely to be appreciative of film as art form and tradition, are also those most likely to see Hollywood's limitations.

A Generation Gap
Every section of the filmgoing audience has been affected by these changes, but these attitudes are more evident among younger viewers than the rest. It is they whose experience of film is least connected with the theater, they who make the fullest use of the new media technologies.2 (Senior citizens love their cell phones as much as everyone else, but I generally don't see them watching videos on such devices.) It is they whose take on stardom and celebrity most fully reflects the age of high-concept, tasteless gag and reality TV. (Indeed, a crucial reason why newer actors are failing to make it as stars may be that this demographic simply can't be sold on the concept.) And it is they who are most prone to look beyond America's movieland not just for their images of glamour, but their entertainment.

It is worth noting, too, that younger filmgoers have virtually no cinematic memory, despite their unprecedentedly easy access to old movies via video, cable and the web. Walk into a college classroom and mention just about any film from before the last decade, for instance – let alone the stuff that used to run on AMC back when those letters stood for "American Movie Classics" – and with very few exceptions you will get blank stares. (I have even had this experience in university classrooms that I knew had more than their fair share of film majors.) Not only do they not share the nostalgia of (mostly) older film fans; they have little consciousness of exactly what it is for which older filmgoers feel such nostalgia.

Consequently, the Oscars seem just one more awards show in a whole season of them, all minutely covered in the press, and a comparatively remote one at that in its standards and mechanisms of selection, next to less prestigious, but more audience-oriented – and youth-oriented – presentations like the People's Choice Awards, or the MTV Movie Awards. In many a case, they haven't even heard of the movies up for little statues simply because they don't see those sorts of films, and those among them who have artier sensibilities are likely to feel that the Academy overlooks their particular favorites (as Julie Gray confesses has been her experience). And all of these awards shows put together appear just one part of the ever-more visible life of the worldwide media-industrial complex (every other quarter of which has its own awards shows, from TV commercials to video games).

None of this is to say that the end of film, or the end of Hollywood, or even the end of the Oscars as an annual event, is at hand. Rather it means they are part of a scene in which cultural influence is more widely distributed; in which film, Hollywood and the Oscars remain, but share their prestige, their power to excite, their claim on audience affections with other media. Additionally, given that the trends that have already wrought the change – like the fragmentation and globalization of media, and the increasing ubiquity of media itself – seem likely only to continue progressing (or even accelerate) in the years to come, it seems that this diffusion and complexification will only continue as time goes by.

1. The data on movie attendance has been taken from Michelle Pautz, "The Decline in Average Weekly Cinema Attendance," Issues in Political Economy 11 (2002). Accessed at http://org.elon.edu/ipe/pautz2.pdf. The figures on per-capita ticket sales were calculated from data from Box Office Mojo.
2. John Anderson observed that on many occasions during this year's ceremony, "participants reflected on movie going as something they remembered fondly from their childhoods. They might have been talking about the Civil War."

Saturday, March 3, 2012

A History of the Spy Story


Collected in THE MANY LIVES AND DEATHS OF JAMES BOND

Offering a history of the spy genre is famously difficult, in part because of the genre's porous boundaries. As Donald McCormick and Katy Fletcher note in Spy Fiction: A Connoisseur's Guide, the term "'spy story' is in itself a misnomer" because it is used as a blanket label not only for the activities of spies in all their forms, but also counter-spies, government functionaries employing spies,
agents . . . hired killers, planters of misinformation, or sometimes even . . . that unassuming little man at the corner shop who operates a kind of letter-box for agents.
Additionally, it is possible to consider any adventure story or war story involving a bit of intelligence gathering or intriguing to be a spy story of sorts, so that those looking for a beginning often point to Odysseus' scouting of the Trojan lines in Homer's Iliad (making the spy story as old as literature).

Nonetheless, the spy story as we know it has two characteristics which set it apart. One is that it centers on the spy and his activities in that capacity. The other is that it engages with contemporary, real-life politics, rather than those of a historically distant setting (like James Fenimore Cooper's 1821 Revolutionary War novel The Spy: A Tale of the Neutral Ground, often described as the first English-language spy novel), or a wholly fictional one (like Ruritania, in Anthony Hope's 1894 The Prisoner of Zenda). Such fiction is largely a product of the twentieth century, during which it emerged from the intersection of two genres that developed in the decades just prior.

The first is the tale of crime and detection, a product of Romanticism's fascination with the marginalized and the extreme, and the advent of modern police forces and urban life as we know it. This genre, of course, was flourishing by the late nineteenth century, when Arthur Conan Doyle presented Sherlock Holmes to the world in A Study in Scarlet (1887).

The second is the story of contemporary politics, the new popularity of which is likely traceable to the fact that, as Jan Bloch put it in his 1899 classic The Future of War in its Technical, Economic and Political Relations,
both military and political affairs have ceased to be high mysteries accessible only to the few. General military service, the spread of education, and wide publicity have made the elements of the polities of states accessible to all.
In the century after the French Revolution, Europe's once politically passive subjects had increasingly become conscripts and reservists in their nations' armed forces. They were increasingly readers as well as a result of national education systems and the wider availability and lower cost of books and newspapers, while telegraphs made news more immediate, and photography provided unprecedented illustration of that coverage. Already by the time of the Crimean War (1853-1856), public opinion was playing something like its contemporary role in foreign policy, and the trend continued through the U.S. Civil War (1861-1865), "the first great war in which really large numbers of literate men fought as common soldiers," as Theodore Ropp observed in War and the Modern World. And of course, they were increasingly voters as democratization spread and deepened.

Accordingly, there was not just an audience for writing on these subjects, but a premium on appealing to public opinion, at home and abroad (public opinion in foreign countries also an increasing factor in policy calculations). Fiction was one component of such writing, with the invasion story genre launched by tales like George Chesney's "The Battle of Dorking" a particularly important aspect of it. There was an obvious place for spies in these scenarios, and from fairly early on they depicted foreign agents entering a targeted country to steal secrets, commit acts of sabotage or lie low until the shooting started before joining in the fight. Nonetheless, the espionage tended to be only a small part of the story, and the spies rarely even constituted proper characters. In 1882's How John Bull Lost London, for instance, it is French soldiers infiltrated into the country as tourists who capture the British end of the tunnel linking Dover to the continent, facilitating the arrival of their comrades. The French waiter working in England, who is really part of an invading force, became a cliché.

The convergence between the two genres was already evident in the Sherlock Holmes stories, notably in 1894's "The Adventure of the Naval Treaty," in which Holmes is enlisted to track down a missing copy of a secret Anglo-Italian naval treaty, which the protagonists were anxious might find its way into the hands of the Russian and French ambassadors. This proved only the first of Holmes' forays into such affairs, and Arthur Conan Doyle was far from the only writer to take such an interest. William Le Queux's The Great War in England in 1897 (1894) prominently featured a foreign spy in the plot, the villainous "Count Von Beilstein," a cosmopolitan adventurer who was arrested in Russia for his criminal behavior (forging Russian notes and using these to acquire twenty thousand pounds' worth of gems), and became a Russian agent to regain his freedom. Not long after, Edward Phillips Oppenheim attained a notable success with The Mysterious Mr. Sabin (1898), whose titular figure was a French operative – a would-be "Richelieu of his days" - working against England.

Nonetheless, reflecting the then-prevailing tendency to view the spy's trade as "ungentlemanly," spies were predominantly foreign villains (or if they were countrymen, traitors), with the role of the usually amateur protagonist most often the frustration of their plans (as in the stories discussed above). Cooper-like stories in which a spy was the hero only began to appear after the turn of the century with books like Max Pemberton's Pro Patria (1901), Rudyard Kipling's India-set adventure Kim (1901) and Erskine Childers' The Riddle of the Sands (1903).

The novels of Pemberton and Childers depict Britons who stumble upon mysterious foreign doings - in Pemberton's case, a secret French plan to build a Channel tunnel; in Childers', German preparations, uncovered by a pair of Britons sailing the Frisian coast, to use the area as a staging ground for an invasion of Britain. In Kipling's novel the titular protagonist, an Irish orphan, gets caught up in the Great Game between Britain and Russia. Today historians of the genre commonly identify either Kim or Riddle as the first modern spy novel.

Of course, it might be argued that Kim is essentially a picaresque which traces the early part of a spy's career, and Riddle a sailing story which involves espionage. However, it was not long before writers started to produce works more fully focused on this theme, and in the process established the rough boundaries of the field – its core themes, concerns and plot formulas - as well as the range of viewpoints within which subsequent authors generally worked.

William Le Queux went beyond his early forays into this area as a writer of invasion stories with Spies of the Kaiser: Plotting the Downfall of England (1909), a collection of loosely connected stories centering on German schemes against England (notable for their use of the theft of technical secrets as a basis for a spy story). Edward Phillips Oppenheim did the same, with the book for which he is best remembered today, The Great Impersonation (1920), notable as an early treatment of the triple agent and the deep-cover mole. John Buchan's The Thirty-Nine Steps (1915) gave us an innocent man forced to go on the run by villains whom he must take on nearly single-handed to clear his name and save the day (and gave the spy genre its first major series character in Richard Hannay), while H.C. "Sapper" McNeile's Bulldog Drummond (1920) was a hugely influential proto-James Bond adventure.

Meanwhile, Joseph Conrad was already treating espionage as a subject of serious drama, and offering a more critical take on the game itself, in The Secret Agent (1907) and Under Western Eyes (1911). Stories of seedy little men playing seedy little games which destroy human lives, they dealt with terrorism and counterterrorism, agents provocateurs and false flag attacks - as well as how the game looks from the standpoint of a double agent - and offered a foretaste of later stories of depravity on the part of the forces of order. W. Somerset Maugham brought irony and humor to the genre in Ashenden (1928), as well as a strong sense of espionage as a matter of tedious routine, a consciousness of the scale and organization of modern intelligence operations, and a memorable spymaster in "R" (a generation before Ian Fleming gave us "M").

In the next decade Eric Ambler stood the conventions of Oppenheim, Buchan and company on their head - and offered a leftish view of them - in novels like The Dark Frontier (1936), as well as Background to Danger (1937), Cause for Alarm (1938) and A Coffin for Dimitrios (1939). The Dark Frontier was an outright parody of the genre's conventions (which offered a protagonist who doesn't remember his true identity and instead thinks he's a legendary super-operative long before The Bourne Identity, and the theme of keeping nuclear weapons out of the hands of "rogue" states), while the outsiders pulled into the game in novels like Background to Danger do not give a heroic account of themselves in the manner of Buchan's Richard Hannay, but are simply ordinary people fighting for their lives. Graham Greene, getting his start in the genre at roughly the same time, took a similar course in books like This Gun for Hire (1936), which depicted an operative who turns on villainous employers after they betray him. The stories of Ambler and Greene are also noteworthy for their depiction of the threat as coming not from foreigners or domestic radicals (e.g. Communists and anarchists), but from within "our" Establishment (like British business interests happy to do business with Fascists in Background to Danger and Cause for Alarm, or industrialists who welcome, or even provoke, war for the profits it will bring them in This Gun for Hire), and heroism located not in "our" people, but those normally regarded as villains (like Ambler's Soviet superspy Andreas P. Zaleshoff).

Looking at this list of works it may seem there was little for later writers to add after 1940, beyond the genre's obvious adaptation to changes in international politics (the outbreak of World War II, or the Cold War), technology (like jet travel, communications satellites, and computers) and attitudes toward race, gender and sex (one way in which Drummond was not like Bond), the adoption of "difficult," Modernist storytelling techniques (which touched every genre over the course of the twentieth century) and the tendency of books to lengthen (a matter of trends in the publishing industry as a whole). Nonetheless, the genre evolved over subsequent decades in three notable ways.

The first is the changing nature of the protagonists. In the early novels mentioned above (the idiosyncratic works of Conrad aside) the heroes were typically men with public-school educations, independent incomes and servants; gentlemen-sportsmen at home in London clubs and on rural estates. They often led lives of leisure, having inherited wealth (like Everard Dominey in The Great Impersonation, Sapper's Drummond, and the unnamed protagonist of Geoffrey Household's 1939 Rogue Male), or already accumulated it (like Buchan's Hannay, who at the start of The Thirty-Nine Steps has already made his fortune in southern Africa before coming to Britain). Such jobs as they did hold were typically of the kind to which the upper class commonly gravitated, and which were likely to allow a lengthy leave (like Childers' Carruthers in The Riddle of the Sands, a Foreign Office official able to take a month off just to go sail and shoot in the Baltic – or for that matter, Ambler heroes like Coffin for Dimitrios's Charles Latimer). They tended to have conservative outlooks, and adhered to the political and social orthodoxies of their day, including a simplistic nationalism. And they typically entered the adventure on their own initiative, often after a chance meeting, with a restless nature and a taste for adventure crucial factors in their decision (these last treated most blatantly in Drummond's case).

Later protagonists were less likely to be such examples of upper-class gentility, as with the unnamed hero of Len Deighton's The IPCRESS File (1961) and its sequels, and Adam Hall's Quiller, who pointedly tells the reader that "We are not gentlemen" as he watches a member of the opposition burn to death in a car after deciding not to save him in The Berlin Memorandum (1966).1 Not only were they more likely to be ambivalent about the game, but they were often cynical about nationalism and political ideology. This was not only the case when they were outsiders unfortunate enough to get mixed up in the business, like journalist Thomas Fowler in Graham Greene's The Quiet American (1955), but also when they were professional intelligence operatives, like John le Carré's Alec Leamas in The Spy Who Came In From the Cold (1963). At times, this went as far as outright hostility or disdain toward the Establishment, with rightist as well as leftist writers expressing such sentiments (as in William Haggard's idiosyncratic Colonel Russell novels). Additionally, the professionals increasingly squeezed out plucky amateurs like Bulldog Drummond, certainly where series characters are concerned.

The second is a growing recognition of, and response to, what might usefully be termed the "tiny rivet" problem. As Maugham put it in Ashenden,
Being no more than a tiny rivet in a vast and complicated machine . . . [his protagonist] never had the advantage of seeing a completed action. He was concerned with the beginning or the end of it, perhaps, or with some incident in the middle, but what his own doings led to he had seldom a chance of discovering. It was as unsatisfactory as those modern novels that give you a number of unrelated episodes and expect you by piecing them together to construct in your mind a connected narrative.
In that novel Maugham worked within the framework he described to give us a protagonist who does not see completed actions (the drama in his heroes' adventures typically supplied by other events and factors), but this was a rarity, and other writers dealt with it in two different ways.

One group simply ignored or worked around the fact, with heroes fortuitously having fuller participation – for instance, because some unlikely circumstances have forced them to operate on their own (as in the stories by Buchan and Ambler mentioned above). The other group devoted increasing attention to the "vast and complicated machine," describing its operations at length – both bureaucratic and technological. Ian Fleming's novels, for instance, presented James Bond as part of a vast organization, and made the reader quite conscious of the fact in novels like Moonraker (1955) and Thunderball (1961) (even as his membership in the special double-o section placed him in the kinds of exceptional positions noted above). Other, later authors went further, not concentrating their narrative on one character, or a few characters, but rather using a large number of viewpoint characters to show as well as tell about more aspects of the machine's functioning – so that the plot is really the heart of the story, and the national security state the real protagonist, with the ostensible characters really just "rivets" inside of it. (At most, one of those characters might be recognizable as a protagonist because he occupies a place within the machine that lets him have a fuller view of the picture than the others.)

Frederick Forsyth was a crucial developer of the latter approach, with novels like his classic The Day of the Jackal (1971), in which the titular assassin begins and ends the story as a cipher, and the opposition is not so much Claude Lebel (who is not introduced until halfway into the story) as the French security state over which Lebel exercises exceptional powers for a brief spell. Forsyth scaled up the approach substantially in The Devil's Alternative (1979), as did Larry Collins and Dominique LaPierre in The Fifth Horseman (1980). However, Tom Clancy may be said to epitomize this "epic" approach to the tale of international security crisis, his hero Jack Ryan (first introduced in 1984's The Hunt for Red October) tellingly not a field operative but an intelligence analyst, who in the sequels occupied positions of successively greater responsibility - all the way up to the presidency itself by the end of Debt of Honor (1994).

The third is a late but significant Americanization of the genre, from the 1970s on. Certainly some Americans met with a measure of success writing in the genre before then, like Edward Aarons, author of the Sam Durell novels, Donald Hamilton, who penned the Matt Helm series, and Richard Condon with the classic The Manchurian Candidate (1959), but nearly all of the important pre-1970 innovators in this area - all of the writers still remembered and read today - are British. In his 1972 history of the crime story, Mortal Consequences, Julian Symons speculated that this was due to
the prevailing air of sophisticated coolness about ends and means. Certainly the Americans . . . have never been able to treat the existence of spies threatening or betraying their security with anything but the most narrowly nationalistic seriousness.2
There seems something to this analysis, especially as the American writers who made a splash at this time, like Robert Ludlum in The Scarlatti Inheritance (1971) and The Matarese Circle (1979), Trevanian in The Eiger Sanction (1972), James Grady in Six Days of the Condor (1973) and Charles McCarry in The Miernik Dossier (1973) and The Tears of Autumn (1975), offered more varied and nuanced views of such matters. There is no question that many American writers came to enjoy vast commercial success (as Ludlum and Clancy did), and while it would be difficult to point to an American with the status of a Greene or a le Carré, for instance, there was something more like parity in the status of later American and British writers in the field.

All three of these changes were well established by the end of the 1970s, by which time the spy genre was starting to look a bit worn-out again. Considering this, I am once more reminded of John Barnes' argument in the essay "Reading for the Undead" that genres tend to follow a three-generation life cycle, with the first generation discovering something new, a second generation finding an established field and going on to develop its still unexploited potentials (a process likely to be guided by a critical reassessment of previous work) – and the third less concerned with innovation than "doing it well" as it turns into
something like an inside joke (as with much of live theatre), a treasured family story (as with opera or jazz), or a set of exercises in which to display virtuosity (as with ballet and with much of orchestral music).
It is easy to fall into the trap of fitting facts to theories. Still, the spy story (much like the mystery and science fiction) does seem to me to have traveled something like this course, with writers like Childers, Le Queux and Oppenheim being first-generation early innovators, and Ambler and Greene early second-generation authors bringing new ideas (political as well as aesthetic) and greater skill to a genre that was already threatening to grow stale prior to their appearance.

In the third generation, clearly underway by the 1970s, there seemed a greater tendency to look back, evident in such things as the genre having become so over-the-top that parody went unrecognized – as happened with Trevanian's Jonathan Hemlock novels and Shibumi (1979), which were almost universally read as straight thrillers by critics and general audiences alike (much to that author's frustration). There is also the increased prominence of stories set in World War II (and other earlier periods) in the output of new authors like Robert Ludlum and Ken Follett, and the resurrection of James Bond by John Gardner and Glidrose Publications in 1981 with Licence Renewed.3

It seems that generic boundaries get fuzzy at this stage of the life cycle, and this tendency was evident in the life of the spy story as well, increasingly hybridized with elements from other genres providing the principal interest – as with Craig Thomas' "espionage adventures" like Firefox (1977), military techno-thrillers like Tom Clancy's Jack Ryan books, Dale Brown's Day of the Cheetah (1989) or Payne Harrison's Storming Intrepid (1989), and the Dirk Pitt novels of Clive Cussler, like Raise the Titanic! (1976) and Deep Six (1984), which combine espionage and military action with historical mystery and maritime adventure (in a way, coming back full circle to Childers).

Developments within the genre aside, it seems that world events encouraged such a turn. The tendency to look back can be seen as at least partly a reflection of the cultural mood of the 1970s – a sense of national decline (as the post-war boom ended, the energy crisis hit, and the decline of colonial powers like Britain and France ran its course), and of ambivalence about present-day politics (in reaction to Vietnam, Watergate and the like), making earlier periods – when claims to national greatness were more credible, and clear-cut "good guys" and "bad guys" easier to identify – more attractive, World War II above all.4

It is worth remembering, too, that the spy story arose in an era of profound international tension, over which the danger of systemic, great power war constantly hovered – and great ideological tension, as nineteenth century liberal society faced challenges from left and right. The advent of détente, and the partial waning of Cold War tensions that went with it, may have made the contemporary scene appear somewhat less compelling as subject matter for some, and earlier conflicts commensurately more attractive. A decade later, the end of the Cold War took a great deal of the remaining steam out of the genre. (To put it bluntly, industrial espionage, terrorism, international crime, rogue states and the faint possibility of Western conflict with Russia or China were no substitute for the Soviets.) Spy novels continued to be written afterward, by new authors – like Charles Cumming, Henry Porter, Barry Eisler and Daniel Silva - as well as the older writers so established as to be nearly immune to such fluctuations in the market - like Forsyth, Clancy and le Carré (all still publishing). However, their book sales and overall cultural impact tended to be less impressive than formerly (though Clancy still managed to be one of the top-selling authors of the '90s), noteworthy innovation became scarcer, and the tendency to look backward grew only more pronounced.5 In the 2000s the most successful stories of international intrigue were more likely to be concerned with historical-religious-Masonic mysteries in the manner of Dan Brown's Robert Langdon novels (or Matthew Reilly's Jack West novels) than conventional political intrigue. I see little sign that the genre is going to stage some comeback, but, to use John Barnes' term, its "afterlife" is at the least a presence in the cultural landscape.

NOTES
1. Fleming's Bond can be thought of as halfway between these and a later generation of action heroes. Like the older style of protagonist, he went to Eton, enjoys an independent income and has a housekeeper looking after his apartment. However, he is also a long-serving professional intelligence operative in the British Secret Service (and one with a "license to kill" at that), flouts Victorian mores in his attitudes toward gambling and sex, and is not unknown to express ambivalence about his profession and the ends it serves.
2. Symons attributes this difference to the United States' "direct involvement in various wars." This is unconvincing, however, as Britain was more lengthily and completely involved in both of the World Wars than the United States (and suffered far more in them by any measure); far closer to the "front-line" in the Cold War; and in the decades after 1945, involved in dozens of conflicts as it disengaged from its empire, not all of them small in scale (as with the Malayan Emergency). Rather, the context in which they fought those wars would seem relevant. The spy genre appeared in Britain during a period of concern about the country's decline relative to other, rising powers (like Germany), fears which became realities as the century progressed. The 1970s, when the change arrived for American spy fiction, saw the arrival of a comparable mood in the United States (amid the Vietnam War, the end of the Bretton Woods economic order, the oil crisis and other such challenges). One might conclude from this that the genre flourishes in a period when the pious simplicities of jingoism and national exceptionalism are shown up, and public opinion reckons with life's more complex realities.
3. Ludlum's first book, The Scarlatti Inheritance (1971), used an incident in World War II as a frame for a story of the rise of the Nazis in the '20s, and The Rhinemann Exchange (1974) was wholly set during World War II, as was a significant part of The Gemini Contenders (1976), and the opening of The Holcroft Covenant (1978), which had for its theme the post-war legacy of the Third Reich. Ken Follett made his name as a thriller writer with a World War II story, Eye of the Needle (1978), as did Jack Higgins with the spies-and-commandos story The Eagle Has Landed (1975). Frederick Forsyth's first two thrillers, The Day of the Jackal and The Odessa File (1972), were both set in the early 1960s, during past periods of political crisis. By and large, the major works of the 1950s and 1960s did not make such use of earlier periods.
4. Certainly some British writers compensated for Britain's diminution by emphasizing the country's "special relationship" with the United States, as Fleming did in novels like 1953's Casino Royale (where the combination of American cash and British skill defeated Le Chiffre), and as others have continued to do down to the present. However, by the 1960s and 1970s many writers were taking a more ironic view, like le Carré in The Looking Glass War (1965), A Small Town in Germany (1968), and Tinker, Tailor, Soldier, Spy (1974), and Joseph Hone in The Private Sector (1971) – in all of which books, the inability or unwillingness of British officials to adapt to their country's decline was a prominent theme.
5. One of le Carré's best-received post-Cold War novels, 1996's The Tailor of Panama, was a homage to Greene's Our Man in Havana (1958). Another example of this is the decision of the publishers of the post-Fleming James Bond novels to return 007 to the 1960s, as happened in Sebastian Faulks' Devil May Care (2008). A number of authors have also combined such homages with elements of science fiction and fantasy, like Charles Stross in his "Bob Howard" novels and stories, and Tim Powers in Declare (2001).

(This essay was previously published as two separate posts, "A History of the Spy Story, Part I: The Birth of a Genre" on February 1, 2012, and "A History of the Spy Story, Part II: The Life of a Genre" on February 6, 2012.)

"Fourth Reich Rising" . . . Again

In the early 1990s, a slew of technothrillers depicted Germany on the march again, like Larry Bond's Cauldron (1993), in which a Franco-German assault on Eastern Europe starts a third world war, and Harold Coyle's The Ten Thousand (1994), in which renewed German military ambitions set American troops in the country on a repeat of Xenophon's Anabasis. This was in part an attempt to find a substitute for the Soviet Union in the aftermath of the Cold War, but also a response (however exaggerated) to the geopolitical circumstances of the time: the combination of Soviet collapse, American declinism and German reunification; the judgment widely passed on Germany (along with Japan) as a model economy for the 1990s; and the expectations that free trade would be replaced by neo-mercantilist competition among economic blocs, with Germany at the heart of what seemed potentially the most formidable of those blocs, the emergent European Union.

Of course, the idea became less popular with time as the pendulum swung back from American pessimism to American exceptionalism, while the pundits shifted from viewing Germany as one country doing just about everything right to seeing it as a poster child for "Eurosclerosis" – and globalization seemed to have put paid to neomercantilism as a source of war among the industrialized powers. However, the dialogue has taken yet another hundred and eighty degree turn in recent years, with some observers not only more appreciative of Germany's manufacturing strength, but even uttering dark warnings about a new German empire – as Frederick Forsyth recently did.

Over at my other blog, I discuss the rhetoric, and the actual – quite different - facts of the situation. (Naturally, they are not what the overheated speculations claim.)

Sunday, January 29, 2012

Writers Write About Writerly Advice

Aspiring writers face many hardships as they learn their craft and try to break into the business. One of them is the bad advice inflicted on them by all and sundry, from casual acquaintances to authors of "how-to" books and articles swathed in the mantle of Authority. Perhaps worst of all are the lists of meaningless "dos" and "don'ts" that inhibit and confuse rather than help.

On his blog Earth and Other Unlikely Worlds Paul McAuley (a real writer, unlike most of those so free with their advice) points to two pieces in which writers respond to some of those classic commandments.

Charlie Jane Anders - whom you might remember as not only a contributor to the ever-useful web site io9, but the author of the short story "The Fermi Paradox is Our Business Model" (collected last year in Rich Horton's The Year's Best Science Fiction and Fantasy 2011, a review of which you can find here, and available at Tor.com) - presents "10 Writing 'Rules' We Wish More Science Fiction Writers Would Break." (I particularly like her answers to numbers one and three – "No third-person omniscient," and "Avoid infodumps" – as I've long felt that infodumps get a bad rap, and that the preference for narrower viewpoints is often a matter of sheer snobbery.)

Meanwhile, at Nihilistic Kid's Journal, Nick Mamatas offers "Ten Bits of Advice Writers Should Stop Giving Aspiring Writers." (I especially like his succinct two-word response to the old truism, "Show Don't Tell.")

In the end, it all boils down to the fact that there's usually more than one way to succeed – just as there's always more than one way to fail.

Saturday, January 28, 2012

"Chuck Versus The Goodbye"

I first started watching Chuck mainly because it was on before Heroes, and wasn't sure what to make of it at first. For one thing, the blend of atompunk spy games with post-cyberpunk technology didn't quite gel for me. (The Intersect seemed an especially hokey gimmick.) For another, the villains were bland and derivative. (Fulcrum is no S.P.E.C.T.R.E.)

But the Buy More was great, an interesting contrast with the secret agent doings in which Chuck got caught up – and Morgan and Lester and Jeff and Big Mike and the rest were the source of much of the fun from the first. There was also the writers' handling of Chuck's geekiness, and geek culture more generally, which unusually for American TV did not come off as a caricature in the mind of a schoolyard bully (unlike one Chuck Lorre sitcom I can think of), and which they also managed to cleverly work into many a plot. These two elements turned out to be just the things to breathe new life into the half-century-old game of parodying James Bondian espionage, helping to produce some memorable gags and set pieces – and at the show's best, episodes like "Chuck Versus Tom Sawyer."

Alas, Chuck peaked early. As the show's main character became more self-assured, more at home in his new element – as he went from Nerd Herder-in-over-his-head to genuine superspy - we lost the fish-out-of-water aspect of the story that was initially a source of much of the comedy. Bringing the other characters in on the craziness didn't compensate for it, the repetition producing diminishing returns, while also dispensing with much of the comic tension created by Chuck's keeping so much of his life secret from friends and family. And as the story of Chuck and Sarah shifted from "angsty tale of a guy hopelessly in love with unattainable fantasy girl" to mundane boyfriend-girlfriend stuff one might see discussed with a therapist on a daytime talk show, this too lost its interest. (To paraphrase George Costanza, "Relationship Chuck" was less entertaining than "Independent Chuck.")

Of course, such changes were inevitable (it's inconceivable that Chuck could have gone on reacting to things in exactly the same way, and TV show romances which just spin their wheels get tiresome fast), but along with them went much of what made the show distinctive and appealing. Meanwhile, really clever mixes of its diverse elements became rarer – the espionage and the Buy More comedy and the geekiness happening alongside one another rather than coming together in a whole greater than the sum of its parts. Still, the show managed to be entertaining enough to keep me coming back, all the way through its five-season run, which ends with the pair of episodes airing tonight.

Let's hope they make for a fitting finale.

Friday, January 27, 2012

Thoughts on "Brigadoom"

Watching the fourth season Sanctuary episode "Fugue," I was again struck by the tradition of "musical episodes" in recent television series, especially those in the science fiction and fantasy genres.

This is usually discussed as having begun with the Buffy the Vampire Slayer episode "Once More, With Feeling" – but as has been the case with many a genre trope, Lexx did it first, with "Brigadoom." And pulled off a very tough act.

It is worth mentioning here that when the show started airing on the Sci-Fi Channel back in 2000 (with a handful of episodes plucked out of the second series, without regard to the show's development, or the season's story arc), I wasn't impressed with what I saw. Lexx was certainly . . . different. I was intrigued by some of the characters and concepts, but the episodes themselves didn't amount to much. When I saw the commercial for a musical episode, I was skeptical.

But that didn't last, and neither did my earlier skepticism about the show. "Brigadoom" (cowritten by show creator Paul Donovan, and the late Lex Gigeroff) didn't simply have a few of the lines sung instead of spoken (the way Fringe's "Brown Betty" did, for instance), but instead offered a full-blown musical which dramatized the back story of a central character, and presented a turning point in the season's story arc (which was itself a whopper, the title of its final episode, "End of the Universe," not being overstatement).1 It helped, too, that the songs were memorable.

As the show's own writers have said in various interviews (mine included), the quality of the show varied wildly from one episode to another. "Brigadoom" was the show at its very best, which was great, and it has since been a fan favorite.

Alas, the achievement of cast and crew here has been overlooked in the general tendency to ignore or put down the show, which has been extreme even compared with other TV space operas, toward which critics and audiences (hardcore fans aside) are famously ungenerous. Still, those writing about such episodes should remember this early moment in the history of television science fiction characters suddenly breaking out into song.

1. It is worth noting that this wasn't Lexx's first musical moment. The second part of the four-part miniseries that comprises series one, "Supernova" (which aired two years before "Brigadoom") also contains a brief but striking musical interlude.

Thursday, January 26, 2012

On the Eureka Paradigm

Popular expectations regarding technological progress in areas from artificial intelligence to fusion, from cloning to space, have commonly been overblown. The response on the part of many is simply to regard futurists' predictions with irony, but this has never seemed a useful position to me. The reality is that we can hardly avoid making guesses about what's to come, given that we are so often obliged to think in the long-term, and to plan, especially where science and technology are concerned. Guessing poorly can have serious implications for public policy (some of which I recently touched on in my Space Review article, "Space War and Futurehype Revisited"). Consequently, instead of irresponsible dismissal of the whole enterprise of prediction, the appropriate response is to try to get better at making predictions, and at responding to predictions appropriately. (Even if the best we can hope for is to get things somewhat closer to right somewhat more often, or simply better position ourselves to cope with the inevitable surprises, this seems to me well worth our while.)

That said, the number of pitfalls in the way of those studying such issues is staggering. Today it is almost impossible to point to any discussion of public issues which is not carried on in a fog of misinformation, and disinformation spread by vested interests and ideologues. The situation is even worse when expert knowledge is crucial (as in virtually any discussion into which science enters), and the uncertainties are large (as when one makes guesses about things that haven't happened yet, exactly the issue here), because of how much more difficult it becomes for even informed observers to make their own judgments. (Think, for instance, of how Creationists succeeded in manufacturing a "debate" over "intelligent design.")

However, this falls far short of exhausting the list, and I would argue that one big stumbling block has been a deeply flawed "folk perception" of science and technology shaped, in large part, by media produced for (and often by) people without expertise in this area. Consider, for instance, what we get in so much pop science journalism. News items about developments in science and technology, just like news items about anything else, are typically intended to be brief and punchy and accessible (read: attention-grabbing and entertaining). Breathless stories about exciting new possibilities fit these criteria far better than critical, nuanced pieces attentive to the obstacles in the way of realizing those possibilities. (Indeed, many of them make it sound as if the possibility is already a reality, as does the questionable title of this otherwise useful piece: "Quantum Keys Let Submarines Talk Securely.")

Such stories skew perceptions, making it seem as if a great many ideas are much more developed than they really are, and the press's lack of a memory worsens matters. After telling readers and viewers about the thrilling innovation that just might change all our lives (with the "might" sometimes in the small print), there is generally no proper follow-up; the story just fades away even as the impression it created remains. The failures lapse into obscurity, while successes are likely to get so loudly trumpeted that it can seem as if the latter are all that exist.

These attitudes may be reinforced by the endless repetition of the claim that we live in an era of unprecedentedly rapid technological progress, making historical precedents irrelevant, and implying the imminent bursting of all the old boundaries. They are reinforced, too, by a tendency to identify "technology" with "information technology" (which goes as far as Google's news aggregator compiling, under the heading of technology, numerous items that are not about technology as such, but rather the financial fortunes of IT firms), which has the effect of making the state-of-the-art in consumer electronics the yardstick of technological progress, a perception which can be very misleading given that other areas (like food and energy production, medicine and transport) have seen much slower change. Raymond Kurzweil's "Law of Accelerating Returns," and the technological Singularity with which it is associated, are a particularly extreme version of this kind of thinking, but this futurehyped, IT-centric outlook is indisputably mainstream, so much so that the correctives offered by observers like Robert J. Gordon, Jonathan Huebner, Bob Seidensticker and Michael Lind have had little traction with Apple brand-worshipping consumers who believe that their new cell phone is the telos of human history.

Still, for all its flaws, it has to be admitted that pop science journalism is far less influential than fiction, and especially television and movies, which appear to play far and away the biggest role in shaping the popular image of science. Last year a poll taken in Australia found that TV and movies were the primary source of information about science for three-quarters of those surveyed, a pattern likely typical for other developed countries. One likely consequence of this is our habituation to the thought of technologies that are only imaginary, with "hard," technologically-oriented, extrapolative science fiction set in the near future likely having the strongest impact on our thought as "rumors of the future" (as suggested by scholar Charles E. Gannon's study Rumors of War and Infernal Machines).

Additionally, science fiction routinely dramatizes the processes of science and engineering, and in the process propagates a particular view of them. The realities of science and engineering get short shrift: that they are specialized, collaborative, often slow-moving activities, whose participants often cope with multiple knotty problems at once, some of them theoretical in nature; that they depend on massive amounts of organization, equipment and money controlled by people who are not scientists (indeed, are often scientific illiterates), and so are susceptible to the vagaries of economics, business, politics and even personal whim; and that, under even the best circumstances, they are subject to fits and starts, to dead ends and flashes in the pan, with projects to develop new technologies often concluding in nearly useless prototypes, on the drawing board, or even at the concept stage.

As an example of what we are much more likely to see, consider the Syfy Channel show Eureka, which centers on an imaginary town in the Pacific Northwest inhabited by a large portion of the United States' scientific elite. Eureka makes some concessions to reality in its depiction of the Global Dynamics corporation running the town, and the involvement of Pentagon functionaries in the affairs of its researchers. However, this is superficial stuff, the episodes mostly playing like twenty-first century "Edisonades," with geniuses equally adept at fields as diverse as astrophysics and molecular biology flitting from one project to the next, many of them undertaken singlehandedly in their garages; casually tinkering with toys as exotic as androids and faster-than-light drives; and during the inevitable crisis, inventing on-the-spot solutions to staggeringly complex problems that somehow always work.

Taken together, all this leaves most people imagining science as nerd-magic, and picturing R & D as a matter of omnicompetent nerd-magicians pulling solutions out of thin air – so that if a problem does need solving all we have to do is get the nerds on it and, voila, it's solved. (And if the nerd-magicians don't work quickly enough, we just have to hector them until they do, the way Sheriff Jack Carter, like many another rugged sci-fi hero, hectors the science types for a fix when there's trouble.) It also leaves audiences thinking that, to use William Gibson's phrasing, "The future is already here – it's just not very evenly distributed" in the literal sense of thinking that every gadget one has ever heard of must be out there, somewhere.

In short, it has them believing in the "Eureka Paradigm" of scientific and technological R & D.

Of course, there are lots of reasons why fiction still depicts science and technology in this way, long after such depictions have lost any credibility they may once have had. A fairly good one is that this simplistic approach is a better fit with dramatic requirements, easier to turn into a compact, intelligible, accessible story in which we get to focus on a few characters as interesting things happen – and to be fair, the point is to entertain, not inform. Yet it is an appalling basis for any serious consideration of how research actually proceeds, and we take such images seriously at our cost.

Sunday, January 22, 2012

Review: Deep Six, by Clive Cussler

New York: Simon & Schuster, 1984, pp. 432.

Clive Cussler's Deep Six (1984) can be thought of as something of a transitional work for the author, occupying a space in between his early, tightly focused, short novels like Raise the Titanic (1976), Vixen 03 (1978) and Night Probe (1981), and the later, sprawling, epic action-adventures which began with Cyclops (1986).

Even though I usually read right through Cussler's books after picking them up (back when I did read them), I started this one a few times before making much headway in it. This had much to do with Cussler's handling of the book's two main plot threads--the first, an investigation of a marine disaster in the northern Pacific, and the second, the mystery following the disappearance of the presidential yacht (with the President, Vice-President, Speaker of the House and the President Pro Tempore of the Senate all aboard). The first mystery engages Pitt and his friends from NUMA exclusively for the first third of the book or so. They only become involved with the second mystery in the book's middle, and play only a minor role in that investigation until the last third of the story, when the connection between the two series of events finally becomes clear.

Filling the gap in between is a great deal of inside-the-Beltway intriguing among officials of the National Security Council and the Secret Service--which at times left me with the impression that I'd put down Cussler's book and picked up one by Tom Clancy instead. As is usually the case with these novels, Cussler's included, the portrait of D.C. struck me as simplistic and inauthentic, devoid as it is of the sausage factory-like quality of real-life politics. This is a Washington without lobbyists and political action committees and revolving doors between industry and government, where politicians who take campaign contributions from shady special interests are "bad apples" and the "power elite" is described as "elected"--in short, a sanitized civics class textbook's version of governance (much as seen in other Cussler novels, admittedly, but more problematic here because of the foregrounding of this part of the story). The foreign politics are equally lacking in nuance, down to the foreign villains, who are, not unexpectedly, one-dimensional clichés that occasionally cross the line into racism, with the unsurprising result that the Soviet strategy comes off as astonishingly clumsy in stark counterpoint to the tactical and technological genius the KGB and its partners display in executing the scheme.

In short, others have done this stuff before and after and in many cases better, and it is poor compensation for what is missing in the earlier parts of the story, where in their limited appearances Pitt and company merely contribute to a couple of underwater searches (the second of them treated rather briefly), and engage in some mostly stationary detective work. We are more than halfway through the book before Pitt has his first brushes with the bad guys, and it is some time after that before he gets up to his usual antics. The result is a story that gets better as it goes along, with the last third providing exactly the kind of thing for which Cussler's readers come to his books, especially in the action-packed finale full of over-the-top heroics, flashy military hardware and creative anachronism as the clock ticks down to disaster. But getting to that point is occasionally a slog.

Mad Men: A Second Look

I recently finished watching the first three seasons of Mad Men.

The second viewing confirmed many of my first impressions, but I have come away with some new thoughts as well.

Where the acclaim is concerned (the show won its fourth Emmy for Best Drama in a row last year), it would seem critics are responding not only to the aesthetic-nostalgic appeal of the show's recreation of a more glamorous-seeming era; or its iconoclastic portrayal of the early 1960s (the echoes of Thomas Frank's The Conquest of Cool and Richard Yates' Revolutionary Road not groundless, but greatly exaggerated); or even its giving the audience the guilty pleasure of vicarious indulgence in un-p.c. behavior while still feeling superior to it.

The truth is that almost everything about the show is a perfect fit with highbrow critics' views on what constitutes Good Drama – the slow pace and the heavy use of indirectness and implication (i.e. subtext) to drop a massive freight of irony on the heads of its mostly unlikeable characters, which might seem like liabilities to many a viewer, are a big plus in their book. The upper-middle class social setting, and the premise's allowing for a great deal of writing about writing, media about media, the positioning of identity, domestic life and suburban dissatisfaction as central themes, are likewise much in line with their tastes. And the association of show creator Matthew Weiner with the last cable drama to win such heaping (over)praise, The Sopranos (which worked in a not dissimilar manner), only helps.

All this makes the show not just a triumph of style over substance, but a reminder that pandering to the snobbery of the upmarket review pages (and the viewers who mindlessly follow their lead) can pay real dividends.

Wednesday, January 18, 2012

SOPA Blackout Protest

I found out about the protest against SOPA (H.R. Bill 3261, innocuously named the "Stop Online Piracy Act") too late to participate properly. So I'm posting this message to indicate my solidarity with the protesters – and offering this link to blackout participant Wikipedia's FAQ on the subject.

Until tomorrow, please regard this site as also blacked out.

This blog will resume its normal operations then.

Friday, January 13, 2012

New Review: Scarecrow Returns, by Matthew Reilly

New York: Simon & Schuster, 2012, pp. 368.

Matthew Reilly's Scarecrow Returns (published last year in his native Australia as Scarecrow and the Army of Thieves) opens with a burst of action as a mysterious "Army of Thieves" captures a Russian island in the Arctic Sea – one which happens to be home to a secret Cold War-era research installation, and the site of a Soviet superweapon now in play. This event marks the beginning of a global crisis as seen by the American and Russian crisis response teams. As luck would have it, Marine Recon Captain Shane Scofield, and his longtime friend and comrade-in-arms "Mother," are with an equipment-testing team nearby – and virtually all that the American government can call on to save the day.

This book is Reilly's first Shane Scofield novel since Scarecrow (2003), almost eight years earlier, and naturally I was looking forward to it. Nonetheless, some aspects of the premise initially worried me.1 For one thing, it suggested that the globe-trotting and mystery-solving I had enjoyed in Scarecrow and the Jack West trilogy had been abandoned in a return to the more static adventures with which Reilly began his career, like Ice Station (1998) and Area 7 (2001). I'd enjoyed those books, but felt he'd since superseded them (Reilly himself has referred to them as the work of Reilly 1.0), and wasn't sure how much more juice he could extract from the older concept he'd already executed several times. Additionally, two decades after the Cold War's end the idea of a Soviet superweapon would seem to have passed its "sell-by" date – much as has long become the case with villains left over from the Third Reich. I also wasn't sure what to make of the "Army of Thieves" who comprised the villains, these seeming to be an especially senseless bunch in comparison with Reilly's previous bad guys, whose agendas, however horrific, at least had a recognizable rationale.

Fortunately, the book exceeded my expectations in all these areas. Like the books of "Reilly 1.0," Scarecrow Returns is a three-way collision between teams of special-forces soldiers at a high-tech facility in a remote, hostile landscape, but Reilly manages to keep the material fresh, and the plot and action unfold with a smoothness that reflects his now lengthy experience in telling this kind of tale. The battles are as readable as any Reilly has written (at least, when read with the aid of the numerous illustrations), while being as grand in scale and over-the-top as readers have come to expect – which is to say, unequaled by any writer working similar territory today. Reilly's particular variant on the trope of the "left-over Soviet superweapon now on the loose" is a good one, and his villain is in line with his predecessors, at least, when we get behind the mask. The novel also benefits from a number of new touches, ranging from a scene-stealing combat robot named Bertie, to a French vendetta against our hero – and a few memorable plot twists (which I won't spoil here). Additionally, cartoonish as Reilly's characters are, they are nonetheless a bit fuller and more nuanced here, and their personalities do have a bearing on the tale.

That is not to say that everything is perfect. Readers demanding meticulous treatment of the technical detail will be irritated by such things as Reilly's depiction of a KH-12 satellite as a signals intelligence platform (its function is in fact optical imaging), and his repeated reference to an SS-23 as an intermediate range ballistic missile (when its 500 kilometer range actually makes it a short-range ballistic missile) – details that could have easily been corrected without requiring the slightest changes to the story. There is an incident in one of the battles (in the "Stadium") where the editing appeared to falter. (It seemed to me that Reilly wrote "trench" when he should have written "walkway" – though I'm less than a hundred percent certain of this, as those of you familiar with his action sequences can understand.) Such nit-picks aside, Reilly's use of his over-the-top plot to explore very real geopolitical issues struck me as less clever this time around, the rationale behind the action comparatively muddled, especially when compared with the almost psychic perceptiveness of the villain. (The fact that the weapon's activation frankly seems unlikely to leave any "winners" on the planet is only one of the reasons for this.)

Still, on the whole it's a satisfying read if you're up for this kind of adventure, and fans of previous books are likely to find it well worth their time. However, given the extent to which events in the previous novels bear on the story in Scarecrow Returns, readers new to the series might want to check out the previous installments (Ice Station, Area 7, and Scarecrow) first.

1. I use the original publication dates here, rather than the dates of their release in North America, my edition excepted.

Thursday, January 12, 2012

Lex Gigeroff, 1962-2011

As many of you already know, Lex Gigeroff, best known as one of the three principal writers on Lexx (as well as a sometime actor on that show, in such memorable roles as Barnabas K. Huffington), died last month, on December 24, at the age of forty-nine.

I had only one exchange with Mr. Gigeroff, when I interviewed him for my retrospective on that series back in 2007, but I found him friendly, witty and helpful, as he has generally been to those who knew him, and there is no question that he will be missed.

As might be expected, social media is one avenue through which these sentiments are being expressed: there is now a Lex Gigeroff Memorial page on Facebook (the page, and the memorial event it mentions, came to my attention only after they passed), as well as a tribute video on YouTube, which fans and well-wishers can check out.

Wednesday, January 11, 2012

Thoughts on W. Somerset Maugham's Ashenden

W. Somerset Maugham is perhaps best known for his books Of Human Bondage (1915) and The Razor's Edge (1944), but students of the spy novel know him by another book, Ashenden: Or The British Agent (1928), a collection of loosely connected stories about the adventures of the titular figure in the service of British intelligence during World War I.

In the course of the narration Maugham touched with surprising frankness on many of the difficulties of turning the spy story into entertainment, as when he observed that
Being no more than a tiny rivet in a vast and complicated machine, [Ashenden] never had the advantage of seeing a completed action. He was concerned with the beginning or the end of it, perhaps, or with some incident in the middle, but what his own doings led to he had seldom a chance of discovering. It was as unsatisfactory as those modern novels that give you a number of unrelated episodes and expect you by piecing them together to construct in your mind a connected narrative (Maugham, 7).1
Rather, making "a picture out of the various pieces of the jigsaw puzzle" was the prerogative of "the great chiefs of the secret service in their London offices" (Maugham, 101).

Maugham offered another thought of the kind in regard to Ashenden's routine as a case officer when he observed that:
Ashenden's official existence was as orderly and monotonous as a City clerk's. He saw his spies at stated intervals and paid them their wages; when he could get hold of a new one he engaged him, gave him his instructions . . . he waited for the information that came through and dispatched it; he kept his eyes and ears open; and he wrote long reports which he was convinced no one read till having inadvertently slipped a jest into one of them he received a sharp reproof for his levity. The work he was doing . . . could not be called anything but monotonous (Maugham, 101).
As Maugham's remark demonstrates, the "tiny rivet" problem has been a big one for writers across the whole history of the spy genre (though it has loomed increasingly large as time has gone on).

Some authors have responded to this reality by contriving ways to put their protagonists at the center of events, so that not only do they get to see a "completed action," but that the action can be thought of as in large part their own, as writers as diverse as John Buchan and John le Carré, Eric Ambler and Ian Fleming, are known to do. (The approach tends to have the author writing in unlikely coincidences or exceptional organizational circumstances; forcing their heroes to become independent operators, whether as hapless outsiders or insiders forced to go rogue; or simply ignoring bureaucratic realities.) Others, more recently, have deemphasized the rivets and instead concentrated on offering a broad picture of that "vast and complicated machine" – as Frederick Forsyth and Tom Clancy have done. (These describe the machine's operations at length, and rather than focusing their narrative on one character, or a few characters, use a large number of viewpoint characters to show a great many aspects of the machine's functioning, so that the vast plot is really the heart of the story, and the national security state the real protagonist.) And on the whole, writers of spy fiction have been far more prone to present their spies acting like detectives investigating a crime or carrying on a manhunt, heist men planning a black bag job, special operations soldiers, or fugitives on the run, than actual case officers in the business of handling agents.

Maugham, however, works within exactly the framework he describes. Unsurprisingly, he eschews the thriller conventions writers like E. Phillips Oppenheim, John Buchan and H.C. "Sapper" McNeile had already popularized. Instead, the intelligence work tends to be a backdrop to other dramas, as with Ashenden's adventure in pre-Revolutionary Russia (in which we see almost nothing of what he is actually doing as a British agent). Reading these stories I was often struck by Maugham's irony and humor (at its best in the episodes involving the "Hairless Mexican" and Russia, with "Love and Russian Literature" especially funny if you have enough familiarity with the context to get the joke). The result is, unsurprisingly, a book that holds up rather better than many contemporaneous classics of the spy genre.

1. The edition I have cited is the following: Maugham, W. Somerset, Ashenden: Or, The British Agent (Mattituck, NY: The American Reprint Company, 199?), pp. 304.
