Wednesday, September 26, 2012

The Return Of Xander Cage?

The revival of Vin Diesel's career following his return to a starring role in the Fast and the Furious franchise in 2009 quickly led to a third Riddick film (shot earlier this year, and likely to be released in 2013), and talk of a continuation of the XXX series. However, for the time being it seems that the proposed XXX3: The Return of Xander Cage is stuck in development hell. The exact reasons for this do not seem altogether clear, but the fact is less surprising than it may appear, and not solely because the second film in the series, 2005's XXX: State of the Union, was a flop. The fact remains that it has already been a decade since the first XXX movie came out, and even were the film to get a green light today, it would almost certainly be 2014 before its release. A dozen years is a long time for a franchise--especially one premised on being an update of an earlier concept. XXX, after all, was sold as "James Bond for a new generation" in its overhaul of the idea of the adventurous and formidable, yet suave and high-living, secret agent who gets the girl(s) and saves the world from the depredations of assorted madmen.

As an attempt at a "neo-Bond," however, XXX had its share of problems. In contrast with other Bond-inspired Hollywood blockbusters, like Indiana Jones (with its archaeological theme, fantasy elements, period setting, and homages to yesteryear's movie serials) and True Lies (with its heavy element of suburban domesticity borrowed from the French film La Totale!), its reworking of the material was superficial. The early part of the film certainly succeeds in establishing a different tone, with an opening scene in which a tuxedo-clad operative's conspicuousness at a Rammstein concert gets him killed, and the antics of its rebellious, extreme sports-loving protagonist. However, once Cage has been drafted into the service of the National Security Agency the innovation becomes almost completely aesthetic--Cage's conspicuously not walking and talking like an Etonian, meetings in dance clubs instead of casinos, fur coats instead of evening wear. Everything else is standard Bond imitation: the mission, the gadgets, the villain, the bad girl who turns out to be a good girl, the assault on the bad guy's lair with the clock ticking away to oblivion, and the final pursuit, all of it much more copied than reinvented, down to Cage winding up at the wheel of a missile-firing car racing down a Czech road with the Russian villain's former girlfriend in the passenger seat. One might add, too, that while the execution of much of this was competent enough, the film displayed neither the flair nor the scale to out-Bond Bond (was the ski chase, for instance, really so "extreme" as to put it in a different class from those already seen in the Bond films?), while some bits were decidedly sub-par. (The villain's plan is especially stupid, all the more so for being explained through reference to "anarchism.")

In short, XXX was a serviceable action film which had some fun with its idea of a Generation X/Y version of Bond, but which fell short of properly realizing that ambition--an ambition which at any rate is already showing its age ten years on. The "extreme" label has from the start seemed to many like a mere matter of marketing rather than substance, and in any case has long since been played out as superlative, gimmick and brand--so much so that it is no longer even the butt of jokes.1 Vin Diesel is now forty-five years old, a fact which has not been an obstacle to his continued participation in the FF films (he appears in Fast Six next year), but which makes him less and less plausible as the star of "the new generation's" version of anything--especially if one considers how much older he would be in further sequels (the prospects for which are naturally a crucial factor in any decision to continue the series). Meanwhile, even the Bond films have stopped trying to be Bond films of the kind XXX took as its starting point. Indeed, rather than the start of some new wave, XXX now looks like the last major attempt by Hollywood to make an actual action movie using the Bond films as a model--virtually everything seen since then conceived as parody, from Cody Banks to Get Smart to last summer's Cars 2--while makers of more serious spy thrillers take their cues from Monsieur Bourne.

The result is that that update would now seem to need a massive update itself. Even if this were feasible, and frankly I'm not sure that it is, it would probably leave the series unrecognizable, with little of the original XXX's concept (already thin stuff to begin with) remaining but a title, a couple of character names, a casting choice or two--and it hardly seems strong enough to survive that. At this point I think it simply isn't going to happen, and that it's going to keep on getting less likely with time, the window in which the project could have looked like a good idea having closed.

1. Stargate fans, for instance, will remember the episode "Wormhole X-Treme!"--which aired way back in 2001, almost a year before XXX came out.

Monday, September 24, 2012

Review: The Man Who Saved Britain: A Personal Journey Into the Disturbing World of James Bond, by Simon Winder

New York: Farrar, Straus & Giroux, 2006, pp. 287.

In writing The Many Lives and Deaths of James Bond I found myself revisiting the vast literature about the character, taking a new look at books with which I had been long acquainted, and checking out newer additions to this corpus for the first time. Of these the book that made the strongest impression on me was Simon Winder's 2006 The Man Who Saved Britain: A Personal Journey into the Disturbing World of James Bond.

Unlike most of the books published about Bond for popular consumption, Winder's book was not put together as a compendium of production history, plot summary, reviews and trivia; rather, it uses the series as a lens for looking at post-war Britain – both the stuff of the history books and his personal, lived experience of it – while offering a range of thoughts on the series itself. At once a piece of film criticism, cultural history and memoir, it has a thesis of sorts running through it all, namely that James Bond offered a British image of power, relevance and glamour in those post-World War II years when Britons keenly felt the lack of these things.

As might be guessed, his doing so many different things in a single volume makes for a loosely structured and unsystematic book best taken as a collection of bits. However, these bits come from an author with a broad vision of the last two centuries of British history, a striking sense of the quirky ways in which the world-historical and the personal can connect, a wonderfully eclectic literary taste and a deep and subtle appreciation for film as an art form. It also happens to be the case that, at least in this telling of it, Winder does seem to have lived a life oddly explicable in the terms of the subtitle ("A Personal Journey into the Disturbing World of James Bond"), supplying him with an anecdote appropriate to just about every wrinkle of the phenomenon he considers. To Winder's credit, his writing is also brisk and colorful and humorous to the point of frequently being laugh-out-loud funny. When reflecting on such matters as how the leading political figures of the interwar era look on film or the bizarrely anachronistic boarding school to which his parents packed him off, the career of actor Walter Gotell or the echoes of World War II in his childhood pastimes, the musical scores of John Barry or the manner in which a younger Winder conducted himself as an international traveler, he deftly and entertainingly combines autobiography with wide-field political, military, social and economic historiography, with results that are frequently as incisive as they are flip.

Still, Winder's take has its limitations. Like many another commentator on things Bondian, he fell in love with the films (and the books) when he was rather young (as a ten-year-old watching Live and Let Die in its first run), and then grew apart from the series, especially as the series grew apart from its own '60s roots. While he remains greatly admiring of much of what he sees in From Russia With Love and Goldfinger, his later reaction against the series is nothing short of blistering. Certainly a sardonic touch is commonly a component of such reflections, while the sociological aspects of his study make a critical take on the material and the history it reflects (the hard facts of what empire involves, the neurosis that sets in when empire goes, the racial attitudes of yesteryear, etc.) natural. However, Winder seems to have gone from the extreme of adulation (the extent of his youthful devotion to The Man With the Golden Gun will astonish even hardcore Bondians) to the opposite extreme in his apparent determination to repudiate his Bond-besotted past (the rapid piling up of one denigrating remark after another regarding just about every aspect of the series is equally astonishing). At the same time, while he has an exceptionally sharp eye for the ways in which the novels and the films appealed to Britons of the '50s and '60s (and to a decreasing extent, after), the reverse side of this seems an obliviousness to the ways in which they could have engaged just about anyone else. The result is a number of bits which don't work. Winder's remarks about such things as how non-Britons respond to the films (that, for instance, Bond's appeal to those of us on this side of the Atlantic is only comprehensible as a matter of his giving Americans a chance to laugh at British delusions) can only be read ironically by those generously disposed toward him, while a handful of his barbs (for instance, his wholesale dismissal of the Bond girls as lacking in sex appeal) can come across as a strained effort at iconoclasm.

Even where the recapitulation of British history is concerned, there is one point where the book struck me as weirdly self-contradicting. Despite the leftishness of his outlook, much of his characterization of postwar Britain (his praise for Attlee and his lack of same for Thatcher aside) takes at face value the right-wing clichés about that history: the '50s as all grim austerity, the '70s as stagflationary disaster, the '80s as a time of national revival. There seems a disconnect here, though rather than some personal quirk of Winder's, I suppose it is simply a testament to how pervasive that version of events has become in the popular consciousness. (It may also be a legacy of Winder's childhood in a Daily Express-reading household in the '60s and '70s, and a matter of the contrast between his early impressions of the world inside that context then and his independent impressions later.)

Still, it would be unfair to linger on these points. If at times Winder seems overeager to provoke, he is much more often thought-provoking. If there are times when he is inconsistent or confusing, he more frequently cuts through the inconsistencies and confusion surrounding his subject. Even when he was at his most unconvincing, I never came close to closing the book, the eccentricities of The Man Who Saved Britain perhaps inseparable from what it gets right, which is considerable, and I dare say, also inseparable from its considerable charm.

Saturday, September 22, 2012

Review: A Dance With Dragons, by George R.R. Martin

New York: Bantam Books, 2011, pp. 1016.

George R.R. Martin's A Song of Ice and Fire has been widely admired for its sweep and scale, its vast and engaging cast of characters, the intricacy of its plotting, and the vividness of its world-building, which is not just full of memorable touches (like the "sky cells" of the Vale), but in the complexity of its politics far more like the medieval world of actual history than we are accustomed to seeing in our epic fantasies. There is, too, its fiercely anti-romantic take on the genre, which can still seem an interesting counterpoint to the dominant fantasy tradition, and which Martin for the most part handles brilliantly. ("Life is not a song," Littlefinger tells Sansa, and Martin proceeds to prove it to her, and everyone else, time and again - in what Sansa really finds when she meets her prince, in the triumphal entry into the city after the Battle of the Blackwater, in the sagas of the lords who set forth to claim their "rightful" crowns and find something else instead.)

As far as I was concerned the first three books - A Game of Thrones (1996), A Clash of Kings (1998), A Storm of Swords (2000) - simply got better and better, and naturally I was eager to read the fourth when I got through them. As Martin made clear, that book ended up being too big to conveniently publish as a single volume, and so he ended up splitting it in two: A Feast for Crows and A Dance With Dragons. Moreover, he decided to have the fourth and fifth books track different sets of characters through the same span of time, with the two sets of threads only reconnecting in the latter part of the fifth book, A Dance With Dragons.

It is now apparent that a major reason why this installment got so large was that Martin turned Cersei Lannister and Brienne of Tarth into major viewpoint characters, while starting major threads concerning events in Dorne and the Iron Islands. Reading these threads, I initially welcomed the increased attention to the Lannisters, but with Tyrion gone and Tywin dead, they proved a less interesting bunch, while Brienne's adventure likewise proved a letdown, admittedly widening our view of this world a bit, but neither really advancing the larger story, nor being interesting enough in itself to make us overlook that fact. The episodes in Dorne and the Iron Islands seemed unnecessary diversions cluttering up the narrative.

Unsurprisingly, the remarkable momentum the story had developed by Storm of Swords (the pace of which was positively giddy at times, as one chapter after another served up shattering twists) does not continue in Feast, the developments described appearing comparatively minor and marginal. Still, while my enthusiasm for the saga had been dimmed somewhat, Martin did end the story with a couple of compelling cliffhangers concerning Cersei and Brienne, and I hoped to see how those threads worked out. I also looked forward to the continuation of the stories of Tyrion, Jon and the rest. Naturally, I went straight to Dragons.

Alas, we only reconnect with the characters whose stories dominated Feast six hundred pages into Dance, and see very little of them in the three hundred pages after that - a disappointment, even considering Martin's advance notice about how this part of the narrative is structured. We do not see either Sansa or Littlefinger, and Jaime rates just one chapter, while Brienne rates only the smallest slice of one (which reveals nothing of how she escaped the noose at the end of the last story, or where she will be going now). Cersei gets only slightly more treatment, rating a mere two scenes (though they are meatier).

Of course, that leaves the storylines totally left out of Feast. However, just as Cersei is less interesting without her family around, so is Tyrion less interesting without the rest of the Lannisters (this group definitely more than the sum of its parts), or even Bronn and Shae, both of whom had earlier made their exits from the story's main stream. The picaresque adventures he has after King's Landing have their moments, but simply lack the charge of earlier parts of his story. Jon fares better, his struggles as a newly minted Lord Commander of the Night's Watch having an intrinsic interest (helped by the secondhand view they offer of Stannis's continuing fight to claim the throne). We also learn something of young Bran's destiny, while Theon Greyjoy reenters the picture, and plays a larger role in subsequent events than one might expect. Unfortunately, Daenerys' time in Meereen increasingly seems a detour along her path to what now looks like her inevitable restoration to the throne - which is also how the plot lines developed in Dorne and the Iron Islands still feel.

The result is that Dance, despite some strong bits, is also overcrowded and bloated, and confirmed my suspicion that we already saw the story's climax in the third volume, leaving us with little to look forward to but the falling action winding everything up. The fact that what was meant to be a single fourth book swelled into two volumes, and the increasing length of time between the release of one book and the next (books two and three each appeared just two years after the preceding volume, while it took another five years for Feast, and another six for Dance, to come along), do not make me optimistic. Unless the pace picks up, it may be 2030 or later before we find out how it all ends up - and many devoted readers might not really care anymore by then. Given the stunning first three volumes, that would be a real shame, and naturally I'm hoping that things will go differently, with Martin and his editor finding a way to wrap up the heart of the story in one surprisingly good volume (or two short ones released very close together) coming out sooner rather than later, and saving the rest of the material for side projects like the tales of Dunk and Egg.

Tuesday, September 18, 2012

Cory Doctorow On Summer Blockbusters

Over at Locus Cory Doctorow recently published a piece on why science fiction movies "drive him nuts," in particular the big-budget blockbusters, with this summer's The Amazing Spider-Man the case in point. There is much here that is familiar, not least such films' combination of stunning visuals with intellectual hollowness (and the gratuitous Star Wars prequel bashing Doctorow throws in), but he does offer some comparatively fresh angles. His more interesting remarks mostly have to do with his take on "How Hollywood Does Science," particularly the tendency to think of science as a magical source of cool stuff rather than a process. As he notes, the labs in the movie look not like places where working people actually do things, but like showcases for finished products, while "The characteristic tasks of science – arguing, staring intently at screens, begging for funding, writing down stuff and revising it, and getting heated up about something cool and unexpected" are absent from the scenes set in the "Science Billionaire's Science Tower."

This removes the whole from anything like a recognizable reality. Granted, as Doctorow notes, one can take it all as opera, "stylized, larger-than-life, highly symbolic work that is not meant to be understood literally," but this leaves him unsatisfied - because it seems to him unjustifiable on the grounds that scientific activity is "visually interesting." I'm personally doubtful that people "staring intently at screens" constitutes compelling cinema; indeed, this has long been a significant problem for computer-themed movies, prompting many an unsuccessful attempt to make depictions of hacking seem more interesting, until ultimately the industry fell back on integrating such activity into more conventionally action-oriented storylines. Still, Hollywood's tendency not just to cut "the dull bits" but to act as if they don't exist has effects beyond irking viewers with a modicum of scientific or technological literacy (like Mr. Doctorow). It also has a massively distorting effect on the way the average non-scientist thinks about the subject - much like the god-like presentation of the "Science Billionaire" in the lobby has had a profoundly distorting effect on economic thought, such that the average person seems to think of the Edisonade not as an outworn Victorian myth, but as a viable basis for a national economy in the twenty-first century.

Monday, September 17, 2012

Paul Kincaid and Last Year's Best

Receiving plenty of attention on the web recently is science fiction critic Paul Kincaid's review of three science fiction anthologies purporting to offer round-ups of the year's best (the annuals edited by Gardner Dozois and Rich Horton, and the Nebula Awards Showcase 2012), published in the Los Angeles Review of Books in early September.

Unfortunately, Kincaid reports the experience as underwhelming. Reading the assembled stories, it seemed to him that for the most part "the genre has become a set of tropes to be repeated and repeated until all meaning has been drained from them," rather than an engagement with the world. And where that repetition is concerned, we appear to have passed the point at which we have "rubbed away anything that was bright and new." The result is that we increasingly get science fiction about science fiction, and especially science fiction as it used to be, "older . . . familiar futures" (as in Elizabeth Bear's "Dolly," an update of '40s-style robotics tales), or science fiction that looks rather more like fantasy ("more and more of the stories shortlisted for the Nebula Awards . . . either overtly fantasy or else indistinguishable from fantasy for all practical purposes"), often because of their presentation of futures which are "magical or incomprehensible" (as in Gavin J. Grant's "Widows in the World"). In either case, the results bespeak the writers' loss of all "real conviction about what they are doing." This seems to him a matter of a loss of "confidence in the future," or at least in the future's comprehensibility.

Just as the stuff of the stories is familiar, so is the stuff of Kincaid's commentary - in part because there is much truth in what he says. Indeed, I have written so much about all this myself that saying anything more about most of his points seems to me superfluous. The exception is his note about incomprehensibility, his remark that
somewhere amidst the ruins of cyberpunk in the 1980s, we began to feel that the present was changing too rapidly for us to keep up with. And if we didn’t understand the present, what hope did we have for the future?
I find that line of thought dubious. As those of you familiar with this blog have likely noted by this point, I (like Robert Gordon, and Bob Seidensticker, and Ha-Joon Chang, and Michael Lind) have been unimpressed by the claims that we live in some era of unprecedentedly deep and rapid change, the "information age" thus far a modest tweaking of the industrial age. What actually seems to me most striking when I reflect on recent decades is how little political, economic and technological change actually occurred, certainly in comparison with the expectations of an earlier and supposedly more timid generation of futurologists and science fiction writers. (Remember world government? The twenty-four hour work week? Space manufacturing? Contrary to many an expectation, the nation-state and the business corporation remain the dominant political and economic actors, people still devote the great majority of their alert, waking hours to work rather than leisure, and we remain very much Earthbound as a species.) In many ways it seems as if we are standing still, and that is what makes problems like climate change seem so frightening - the failure of needed new technologies and public policies to emerge at anything like the speed, or on the scale, necessary for effectively dealing with them. The conscientious can only wish that the reality matched the hype in this respect.

This raises an obvious question: if change is not so rapid as many suggest, then why the continued attachment to the theme? Part of it may simply be the sheer power of the hype, to which science fiction has been especially subject. That fact by itself means that even those who don't share the expectation can still find interest in it - like Charles Stross, who has expressed considerable skepticism about the idea, but continues to write in this vein, as in his latest collaboration with Cory Doctorow, The Rapture of the Nerds (a sample from which you can check out here), the ironic title of which suggests much about the nature of the work.

However, the insistence on the world's incomprehensibility can have yet another function, namely that of a dodge. To throw up one's hands in confusion can be an understandable response to the genuinely intimidating largeness of our problems, but it can also be a convenient way of avoiding the serious social and ethical and political questions those problems raise (our ecological crisis, for example). There certainly seems an incentive to take that route when so many of the obvious responses to such problems - substantive critique of the prevailing orthodoxies, efforts to envision really meaningful alternatives, despair in the absence of such - are regarded as naive, disreputable or simply risky for the career-minded, encouraging the ever-present temptation to self-censor. Postmodernism has always concealed a significant amount of evasion behind its smugly enunciated epistemological doubts, and postmodern science fiction has not been an exception to the pattern. Indeed, the lack of conviction Kincaid finds in the writing is best understood as a parallel to the lack of conviction pervading our broader cultural and political life.

Tuesday, June 5, 2012

In Defense of the Star Wars Prequels

The denigration of the Star Wars prequels – Episode I: The Phantom Menace (1999), Episode II: Attack of the Clones (2002), Episode III: Revenge of the Sith (2005) – has been tediously ubiquitous ever since their release. What has been most striking is not the attacks coming from the expected quarters – those who have always been critical of the series, such as those dubious about the story's mythological underpinnings – but the hardcore fans who cherish the original movies most.

More than most, I tend to find prequels an exercise in the pursuit of diminishing returns, prone to depend excessively on the good will earned by previous works, mining less interesting parts of the tale than what we saw before for what riches remain, and often diminishing the whole in their failure to recapture the magic of the originals. And certainly episodes I-III had their flaws – weaknesses in the acting and dialogue, bits of story which seemed muddled or flat, inconsistencies with the original trilogy. Yet the criticism has long seemed to me excessive to the point of neurosis – and for good reason. Its expression has been so frequent and loud and strident that geek antipathy to those movies has become a pop culture cliché, casually referenced not just in genre-oriented films and shows like Spaced and Stargate Universe, but in more mainstream fare as well. (In the romantic comedy Failure to Launch, for instance, Patton Oswalt's character is identified as not just a fan of Star Wars, but one specifically devoted to the original trilogy, of course.1)

It seems to me that much of the reaction has really been about the audience itself. Between the special place the original trilogy holds in many a heart, the sixteen years that passed between Return of the Jedi and Phantom Menace, and the predictably massive hype preceding the theatrical release of Episode I, expectations rose impossibly high – so that any conceivable movie would have been a letdown for many. And of course, much of the audience had changed during the wait. Those who saw the initial theatrical release of the trilogy as children were adults now – and in many cases, more sophisticated, more critical viewers, who compared the new films with the earlier ones as they recalled them through the rose-colored glasses of cherished childhood memory. (They were certainly less forgiving of weaknesses in acting or dialogue, which were hardly absent in Episodes IV-VI.)

But part of it is also the ways in which the prequels differed from the originals. In place of the clarity and simplicity of episodes IV-VI, there was a comparatively sprawling story with more complex plotting and world-building, and rather more politics along with the mythology. The originals had underdog good guys taking head-on, and beating, an evil empire – while the prequels have the guardians of order fighting an enemy that strikes at them from the shadows, in what is ultimately a tale of tragedy rather than triumph. And in place of the youthful Luke, Phantom Menace put a nine-year-old Anakin at the center of events, while in Clones and Revenge the character was a really difficult adolescent. There is also no question that the lavish use of CGI gave the films a different look.

These are not, as such, bad things. Nor were they necessarily surprises. It was unreasonable for anyone to expect the prequels to simply be more of the same, for a tale of the Republic's fall to be the same in structure and tone and sense as the tale of its restoration, for the story of Anakin to be the same as Luke's, for the look of the films to be identical after nearly two decades of FX wizards upping the ante – the obvious appreciated by surprisingly, dismayingly, few viewers (among them, Scott Mendelson and Timothy Sexton, whose pieces I strongly recommend as a corrective).

Of course, one can still reply that these differences resulted in a less appealing, less compelling product. (The more complex world-building and plotting is a lot easier to do in a novel, or a TV series, than a two-hour movie, for instance, while the premise and the central character of the prequels do not lend themselves so easily to crowd-pleasing stories.) And that there are ways in which the new approach did not wholly succeed. (There is plenty of room for criticism of the films' handling of the Old Republic's politics, for instance.) But all the same, a more sophisticated, nuanced judgment was called for, and I, for one, found much to enjoy in the films beyond their indisputable appeal as spectacle. One thing the narrative cannot be faulted for is a lack of ambition or scope, and the plot did get more engaging as it developed, especially where it concerned Anakin's transformation into Darth Vader, and the slide of the Republic toward disaster. (Indeed, Episode III was justly the best-reviewed of the three films, quite well-received by a good many critics.) In fact, I will go further and say that, like no other Hollywood films since the originals, the prequels gave the big screen an epic science fiction tale (helped by that much-derided CGI, which afforded a much broader view of Lucas's galaxy) which lived up to the romanticism and scale the term "space opera" is supposed to denote.

For all their shortcomings, a general reappraisal is long, long overdue.

1. The relevant bit of dialogue is in an exceptionally stupid and patronizing scene in this exceptionally stupid and patronizing film. Ordinarily this fact wouldn't merit a footnote, but I wouldn't have felt right not clarifying that point, lest this mention be mistaken for an endorsement of this example of awful writing and (where the leads are concerned) really bad casting.

Friday, June 1, 2012

Remembering Babylon 5

Babylon 5 seems to have had a somewhat lower profile in its "afterlife" than might have been expected once upon a time, the show only infrequently discussed or referenced these past few years.

I can think of a number of reasons why this has been the case. One is that it appeared at a time when space opera was fairly plentiful on television (as many as a half dozen such shows in their first runs, compared with zero today). Another is that the franchise has had a fairly low profile since its run ended over a decade ago: its spin-off Crusade lasted a mere thirteen-episode season, the last tie-in novel appeared back in 2001, the 2002 pilot for another spin-off, The Legend of the Rangers, failed to spark enough enthusiasm for a series of its own, the straight-to-DVD The Lost Tales came to a halt after a single installment, and the one reported cinematic project (The Memory of Shadows) never entered production - while the reruns have been absent from American TV for years, to the best of my knowledge. Meanwhile, J. Michael Straczynski's next TV project, Jeremiah, didn't quite set the world on fire, after which Straczynski left TV-land for comics (working on series from The Fantastic Four to Wonder Woman) and feature films (where his recent credits include Changeling, Ninja Assassin, Thor and Underworld: Awakening). And in the years since, much of the attention devoted to the genre has instead gravitated toward the more recent Farscape, Firefly, and of course, the grossly overrated Battlestar Galactica, widely hailed as the definitive anti-Trek.

Still, B5 was consistently the best space opera of science fiction television's recent "golden age," to say nothing of a crucial innovator (no one has handled story arcs quite like that show did), and its achievements are not quite forgotten, with io9 recently looking back at the show's most game-changing episodes.

Wednesday, May 30, 2012

Looking Back: Ray Kurzweil’s 2009

Originally Published In THE FIX, January 1, 2009

Ray Kurzweil’s 1999 classic, The Age of Spiritual Machines, devoted a full chapter to his predictions for the year 2009. Given that this year is already upon us, it only seems appropriate to take a look back at the predictions made for it by a thinker whose influence on science fiction has been so vast.

I will not attempt here to deal with every one of his prognostications, some of which were vague, did not lend themselves to easy verification, or were simply commonplaces that had little to do with the book's area of concentration. Instead I will focus on the relatively unambiguous claims, particularly those having to do with specific developments he assumed to have materialized not as clunky, cranky, buggy, unmarketable prototypes forever "in development," or novelties or luxuries for people with too much money and too little sense, but to have become refined enough and affordable enough to proliferate widely. (Readers of that book will remember that there are many of these, several of which he reasserted in his discussion of the "2010 scenario" in his more recent The Singularity is Near, which appeared in 2005.) Additionally, given that the changes in information and communications technology are the linchpin of everything else here, it would only be fair to focus on those.

It must be noted that some of his anticipations proved at least partially correct, such as his predictions that

• Local area networks and wireless technology would become widespread.

• The downloading of content from the web would increasingly replace the transfer of physical objects.

• There would be increasing convergence between various media via the web.

• The transition from 2-D to 3-D chip technology would be clearly underway.

• A $1,000 PC ($1300 in 2008 dollars) would be capable of 1 trillion calculations per second.

However, this was greatly outweighed by the things he clearly got wrong.

• There would be a significant reduction in the size and weight of personal computers below that of the notebooks of 1999--a banality, except that "significant" in this case means that PCs could now be embedded in clothes and jewelry.

• Rotating memories (like hard drives, CD-ROMs and DVDs) would be on their way out, in favor of "purely electronic" memory.

• Speech-recognition software would make text primarily voice-created, so that keyboards would be increasingly vestigial.

• Personal computers would be equipped with facial recognition hardware and software adequate to let them recognize their owners on sight.

• Print-to-speech and speech-to-text readers would become so capable, compact and affordable as to be standard equipment taken everywhere (and indeed, eliminate many of the handicaps of blindness and deafness in the process).

• Comparable navigational devices, with which the blind could communicate verbally, would enable them to find their way.

• Paraplegics would walk using computer-controlled orthotic devices.

• Long-distance driving would be automated by "intelligent roads."

• Translation software (letting you speak in English and be heard in Japanese, and vice-versa) would be standard in phones.

• "Smart paper"-quality visual displays would be standard in our electronic devices.

• Eyeglasses-based displays would routinely be used.

• Most reading (as in school) would be done off electronic displays, with e-book reader-style devices becoming the most common way of reading print media.

• Speakers would be replaced by miniaturized, chip-based devices generating 3-D sound.

• Not only would "responsive and precise" Language User Interfaces be commonly available, but "intelligent assistants" with "natural-language understanding, problem solving, and animated personalities [would] routinely assist with finding information, answering questions, and conducting transactions."

• Virtual reality, while still primitive, would be a standard consumer item (replacing, for instance, 1990s-style chat rooms), lacking only "surround" tactile environments.

• Autonomous nano-machines would have been demonstrated (though they would not yet be commercially useful).

Inevitably, his guesses about the direct consequences of technological change were off--for instance, his expectation that computing technology would be so powerful and widely available that teachers would effectively become "human resources" workers concentrating on the motivation, psychological well-being and socialization of their students rather than the imparting of knowledge or intellectual training as such.

Naturally, this carried over to larger, more significant mistakes at the big-picture level. Notably Kurzweil argued that, at least in the U.S., the "ten years leading up to 2009 [will] have seen continuous economic expansion and prosperity," including a booming stock market and price deflation, with the continuing tech boom (despite occasional "corrections") leading the way. He also argued that the underclass would be stable in size, and politically neutralized by public assistance and broad general affluence.

This prediction proved to be such utter nonsense that I will not bother to tear it apart here, virtually every word I have ever published about economics attesting to how differently things have turned out. The same goes for his guesses about the future of warfare, when he said that computer and communications security would be the main mission of the U.S. Defense Department; and that when there is violence of a more conventional kind,
Humans are generally far removed from the scene of battle. Warfare is dominated by unmanned intelligent airborne devices. Many of the flying weapons are the size of birds, or smaller.
For the moment, never mind the implicit dehumanization of the enemy, or of the civilians from whom they will not easily be extricated, and consider the sense in which he meant this: that U.S. troops will not be there at the scene of battle. Given the attention lavished on aircraft like the Predator, it may seem that he had something there--but one need only look at the massive and costly U.S. commitment of ground troops to Iraq to see that this claim is total nonsense.

Admittedly, Kurzweil's errors will not surprise those who revel in such errors for their own sake. However, we simply cannot get away from the problem of predicting the future, as pressing issues like climate change and resource depletion remind us. For that reason, it is far more useful to think about why he was wrong rather than wallow in the "futility" of ever trying to think ahead, gratifying as some find this to be.

My assessment is that Kurzweil's predictions had two fundamental weaknesses. One is that he succumbed to "futurehype" about a few key technologies (in particular neural net-based pattern recognition), expecting that they would be much more advanced than they really were. The other is his facile view of the "softer" side of his subject, in particular politics, macroeconomics, and I daresay, ecology, given the role of oil supplies in our current troubles. Both are inextricable from the prevailing "wisdom" of the late ‘90s tech boom that was just about peaking when he was writing.

Of course, I should acknowledge that Richard Dooling, in his recent book Rapture for the Geeks: When AI Outsmarts IQ, was more forgiving in his assessment of some of these same points. Additionally, one can argue that this is only the beginning of 2009, and that a truly fair assessment will have to wait for New Year's Day 2010; or failing this, that he might prove to have been just slightly off where some of his technological predictions are concerned, perhaps pointing to the recent proliferation of e-book readers.

Nonetheless, other predictions seem clearly much further off, like the claims for telephone call translation and virtual reality. And the fact that so much of this change has come along much more slowly than he suggested is itself significant; the acceleration of technological evolution is the very foundation of the technological Singularity to which he has devoted so much ink, and with which he has become so closely identified.

Tuesday, May 22, 2012

Reading Bulldog Drummond

I was recently surprised to find that the Internet Movie Database lists a Bulldog Drummond movie as in development for 2013. There is so little information available about the project that it may just be a "myth of IMDB" (like The Brazilian Job). Even if that is the case, that there should be a myth at all is a reminder of the utter unwillingness of movie-land to completely let go of any franchise.

Until recently reading H.C. "Sapper" McNeile's original novel, Bulldog Drummond (1920), I knew the character only secondhand, from mentions of him in studies of spy fiction I'd read, and from a couple of movies that treated the material rather loosely – 1969's Some Girls Do (the second and last in a series of films that updated the character for the '60s), and 1983's Bullshot Crummond (a parody of the series I found very entertaining despite my limited familiarity with the butt of its jokes, and which seems to me underappreciated).

In that first book, Bulldog is the nickname of Captain Hugh Drummond, a decorated British Army officer of World War I who has since returned to private life as a "sportsman and a gentleman." Quickly getting bored with the leisure his wealth affords him, he decides, initially as something of a lark, to put an ad in the newspaper in which he indicates a hunger for "diversion. Legitimate, if possible; but crime, if of a comparatively humorous description, no objection. Excitement essential."

The ad is soon answered by Phyllis Benton, the tale's damsel in distress and romantic interest, whose father has been drawn into the machinations of one Carl Peterson, and his henchman Henry Lakington. Drummond finds plenty to interest him here, and soon enough, with the help of some of his ex-Army friends, is taking on a plot for a takeover of Britain by left-wing revolutionaries financed by a pack of greedy and vengeful foreigners.

It has often been said that Drummond can be thought of as a proto-James Bond. There is the protagonist's privileged social background, his wartime military experience, his penchant for sport and juvenile behavior (along with his buddies, he seems very much the upper-class hooligan), his knack for getting into trouble and then back out of it again. Like Bond he lives in a comfortable apartment in an upscale London district, tended by loyal personal servants (James Denny, and his wife), and is very particular about his cigarettes (with Virginia tobacco on one side of his case, and Turkish on the other).

There is, too, the feel of his adventures: the international criminal conspiracy headed by an evil genius (Carl Peterson) with an affinity for exotic and deadly pets, weapons and cronies (a cobra, a killer gorilla, knock-out gas), and a gadget-packed lair; his encounters with alluring girls good and bad (besides Phyllis, there is Peterson's daughter Irma); the combination of gentility and murderousness in his confrontations with the arch-villain (fully on display in such scenes as Peterson's visit to Drummond's apartment early in the story); the bad guys who in their arrogance or sadism prefer not to dispose of Drummond immediately, making bouts of unconsciousness and rounds of capture and escape a surprisingly regular feature of his life.1 The course of Drummond's investigations also anticipates Bond's, particularly his habit of baiting the bad guys into revealing bits of information, and his coincidentally eavesdropping on villains at the right time to catch crucial bits of their plans (often, during those periods when they think he's safely locked up). He even picks up an American detective as a helper in Jerome K. Green of the New York Police Department (a proto-Felix Leiter to Drummond's proto-Bond). And when it is all over, Peterson, like Ernst Stavro Blofeld in On Her Majesty's Secret Service (1963), manages to escape from justice at the end of the adventure, ensuring that there will be a next time.

That is not to say that there are not significant differences. Bond is, after all, a professional operative working as part of a massive government organization rather than a restless amateur, and something of a cynic, world-weary, in a way that Drummond shows no sign of being. This attitude carries over to the treatments of politics in their respective adventures. While Fleming's plots were typically right-wing and nationalistic in their conception (so much so that Umberto Eco took them ironically), and the author was not at all shy about expressing his opinions about politics (or anything else), he never comes close to the long-winded, heavy-handed and contempt-filled denunciation of the left, the working class, and intellectuals in general to which Drummond and company devote most of the last tenth of the novel. (One might also add that Bulldog is much less debonair than 007, described by the author as possessed of a "cheerful ugliness," and at any rate is, despite Irma's charms, a one-woman man, marrying Phyllis and ending his first adventure on his honeymoon with her.) Nonetheless, Fleming's debt to Sapper is such a vast one that Julian Symons was substantially right when he described the James Bond novels as "the 'Sapper' pipe-dream converted to the mood of the fifties," and Bond as a "more sophisticated version of Bulldog Drummond" (270) in his classic study of the thriller genre, Mortal Consequences.

Sunday, May 20, 2012

Dirk Pitt Returns?

Recently reviewing the stats for this blog, I was struck by the number of web surfers who found their way to it looking for information about the possibility of Hollywood making more Dirk Pitt movies, whether sequels to Sahara (2005), or a new, "rebooted" series which starts with a clean slate (to judge by the frequency with which search keywords like "dirk pitt movie 2" and "any more dirk pitt movies" turn up in the section titled "traffic sources"). This seems to be mainly because of my earlier post, "Returning to Sahara: The Dirk Pitt Novels on Screen."

That post considered the pitfalls involved in translating Clive Cussler's novel Sahara from the page to the screen, rather than any plans for more Dirk Pitt films, about which I have heard, seen or read absolutely nothing – which is exactly what I would expect. The fact that Sahara was a commercial disappointment ordinarily would not rule out the studios taking another shot. After all, Hollywood has proven nothing short of fanatical about trying to squeeze every penny it can out of any and every established IP, and as fans can attest, the Dirk Pitt novels sure look like obvious source material for big-budget action-adventure films – especially as this franchise remains very much alive, new entries continuing to regularly appear on bestseller lists. Of course, as I pointed out in my previous post, adapting Pitt's adventures to the big screen is trickier than it looks, given how dated some of the plots have become (in light of changes in world politics, and industry sensibilities), but Hollywood has never been averse to seizing on a brand name and a few other select trappings (a character, a gimmick), and dispensing with everything else, and in the process launching a commercially viable film sequence, even as it leaves the purists unsatisfied. (They did it with James Bond, after all. And Jason Bourne. And Star Trek. And others far too numerous to list here.)

However, it is worth remembering that Sahara wasn't just a disappointment, but could be described without any hyperbole whatsoever as one of the biggest flops in movie history – and that this came on top of an earlier production of a Dirk Pitt film (1980's Raise the Titanic) which could be described in exactly the same terms. Indeed, one might say Cussler was lucky to see his series get a second chance, even if it took twenty-five years for this to happen. The way that extraordinary opportunity went has left the material seeming almost cursed – and silly as that may sound, one should not overestimate the rationality of this business. (In fact, when so much money is at stake, the factors to be weighed in the decisions regarding its use are so intangible, and every facet of commerce has come to be steeped in a casino mentality, one should bet on Hollywood's irrationality every time.) Perhaps even more frightening than any such curse, Cussler himself did not manage to have a good relationship with the producers, the result being a protracted, grinding legal battle in the aftermath of the film's financially ruinous shooting and release – and that is something that will frighten the least superstitious of Suits.

All that being the case, it may be a long time before anyone takes another crack at this series, if ever. However, for what it's worth, you can be sure that I will have something to say about any new developments in that story here.

Review: The Origin of Consciousness in the Breakdown of the Bicameral Mind, by Julian Jaynes

Boston: Houghton Mifflin, 1976, 467 pp.

I first encountered Julian Jaynes' ideas about bicameralism through Jack Dann's classic The Man Who Melted five years ago. I enjoyed the novel greatly, but was skeptical of the theory underlying it. Jaynes' theory, after all, holds that consciousness is not something intrinsic to life or humanity, but a cultural creation, which did not emerge until the second millennium B.C. It seemed to me impossible that humanity could have ever been without consciousness, let alone reached an Iron Age level of civilization without it. How, for instance, could the "bicameral" mind have organized such a feat as the construction of the Great Pyramid of Cheops? How, too, could such a theory possibly be reconciled with the indications that even some higher animals (apes, dolphins) possess self-awareness?

Jaynes' book did not fully satisfy me with its answer to the first question, and didn't address the second question at all, but it did make the theory considerably harder to dismiss. In discussing the book, it seems appropriate to start with the author's definition of consciousness--which is not a capacity for sensation and observation, memory, or the ability to learn, think or reason. (Even these last, he argues, are predominantly unconscious processes.) Rather, consciousness is a model of the world around us we carry around in our heads enabling us to introspect, visualize and take decisions in a self-aware manner. Built upon the capacity of language to create metaphors, it consists of selected ("excerpted") details, arranged according to a sense of space and "spatialized" time, and organized in narrative form, with dissonant information reconciled around an image of the self (as seen by oneself and by others)--and on the whole, a relatively small part of our mental life for all its complexity.

Prior to the arrival of consciousness, Jaynes posits the "bicameral" mind as the dominant thought mode. For bicameral persons, the left and right hemispheres of the brain worked rather more separately than is the norm today, with an important aspect of the separation the generation of auditory hallucinations, especially in moments when unconsciously acquired habit was inadequate, and stress high. Their mental life, in fact, is compared by Jaynes to contemporary observations of schizophrenia. A key difference, however, is that this was the norm rather than an exception. Indeed, he suggests society was actually structured hierarchically around such hallucinations, with priest-kings hallucinating the voices of gods (in many cases, originating in predecessors whose voices they continued to "hear" after their passing), and subordinates at each level the voices of the authority above them at such times, a control reinforced by such hallucinatory aids as centrally located houses of worship and ubiquitous depictions of deities--themselves organized as hierarchically as humans were. (This put the conflicting voices that torment so many schizophrenics in line, and as Jaynes has pointed out, this kind of sorting is not unknown among those suffering this condition today).

Jaynes suggests that bicameralism may have enabled the beginnings of civilization by making human beings subject to such a method of control. However, it did not endure. The intrinsic frailty of this system apart, it became less and less effective as polities expanded--as with the empire of Hammurabi. An increased use of writing was one way of compensating, but the activity probably weakened bicameralism as well. The Bronze Age collapse (theorized by Jaynes as having been a result of the Theran catastrophe) knocked over these vulnerable structures, flinging together uprooted peoples forced to reckon with the inappropriateness of earlier modes of conduct, and with each others' differences (which may have led them to think that difference was internal for the first time), in harsh circumstances where deceit (thinking one thing and appearing to do another) had a high survival value. (Jaynes speculates also that there may have been an aspect of natural selection in the process, and that the development of narrative epics like Homer's Iliad contributed to the process as well.)

Over the millennium or so that followed, the conception of our behavior's driving forces was internalized, intellectualized (recognized as mental processes) and synthesized into consciousness as we now know it, which had clearly arrived by the time of such figures as Solon and Pythagoras, whose notions of mind, volition and responsibility are recognizable as similar to those we possess today. Still, a nostalgia for the certainties of divine voices speaking to us personally remained, in the laments of our myths of a "fall" which has distanced us from the gods, in methods of divination which attempt to reach those increasingly distant deities (like auguries, oracles, prophets and possession), and even in "scientism" (the quasi-religious treatment of scientific inquiry).1

In examining the argument over three decades after Jaynes' initial presentation of the theory, it seemed to me that, my initial objections aside, many of the smaller claims on which he has built his larger argument (like the nature of consciousness) remain a subject of dispute. His focus on the eastern Mediterranean region--he looks principally at Mesopotamia, Egypt, Greece and the Biblical Israelites--appears rather narrow. And the evidence available for his examination even of this history was necessarily limited, and full of ambiguities and gaps, from the uncertainty regarding the dating of particular works, to the meanings of key terms, to the question of how representative the works in question really were of how people lived and thought at the time (all of these evident in, for example, his reading of ancient Greek literature). Still, Jaynes is quite aware of the scantiness of some of the material, and the speculative nature of many of his claims, frequently acknowledging that what he presents is but a starting point for a fuller investigation.

Of course, the proof that would really convince a skeptic--proof that a society can actually function this way, let alone proof that this really was the way the species lived--may be unobtainable. However, what Jaynes does with the material available to him is genuinely impressive, the theory at the least fitting a great many facts. In the process, he makes a contribution to the indisputable view that our recognition and expression of abstract thought and subjectivity has grown enormously over these millennia (a view going back at least to G.W.F. Hegel), and a good case that schizophrenia and hallucination have played a larger role in antiquity than we appreciate. In doing so he points the way to a significant part of the human story--as is so often the case with those who dare to offer Big Ideas, even when they do not persuade us completely of their version of the larger picture.

1. As Jaynes notes, in the second millennium B.C., human beings heard the voices directly; in the first millennium B.C., they were only able to access those voices through exceptional individuals, sites and efforts, like prophets and oracles; in the first millennium A.D., even these fell increasingly silent, leaving behind the texts based on them; and in the past thousand years, those texts have seen their authority eroded.

Friday, May 18, 2012

On the Word "Lifestyle": A Postscript

Half a year ago I discussed on this blog the contemporary use - and, much more often, misuse - of the word "lifestyle."

Your standard of living is a matter of how much money you make.

Your quality of life is an evaluation of your general well-being – of which income is only one part, as matters like physical health, leisure and social life enter into it.

A way of life is something you're born into, or adopted into, or adopt yourself. A culture has a way of life, and one shares that way of life by belonging to that culture.

A lifestyle is something an individual picks out of a catalog. It is something few people actually have, and it is not to be confused with any of the others.

Saturday, May 5, 2012

The Anti-Humanism of Battlestar Galactica

Humanity, the reimagined Battlestar Galactica tells us, is Fallen, progress an illusion that dies when the creations of that "flawed creation," Man, finally turn on it – and do so successfully because those flawed human beings failed to understand that peace is just a period of preparation for the next round of war. (The first scene of the show, in fact, depicts the Cylons' strike on the Armistice Station that represents the Colonies' peace overture to them – adding insult to injury.)

Humanity also failed to appreciate not only that the Enemy could not be reasoned or bargained with, but that it had penetrated within, its ruthless agents and hapless dupes undermining us to the point that we had little defense against the external assault that finished civilization off in hours.

Trying to examine the history leading up to it, to actually understand the origins of this situation, is pointless, and even traitorous. ("How this could've happened, why it happened – none of that matters right now. All that does matter is that as of this moment we are at war," Commander William Adama says in his call to arms during the pilot.) And that's not the only way in which too much truth is an inconvenience. In the wake of the disaster, the man who claims the mantle of leadership bolsters his legitimacy with a "useful" lie promulgating false hope (regarding Adama's knowledge of the location of Earth), and the forging of a consensus through appeals to the irrational, and the pressure to conform ("So say we all!").

Those who do not conform, like the dissenter and the radical (e.g. Tom Zarek), repeatedly justify the suspicion and contempt with which they are treated, as does the intellectual (e.g. Gaius Baltar), who is not only the very source of our predicament in his misuse of his technical expertise, but a self-seeking traitor at every turn. The rise of such men to a position of leadership is, of course, ruinous, not least because the masses are so easily led astray. Proving once more that they cannot be trusted to take care of themselves, that they are weak and soft and unwilling to face the Hard Facts of Life, the People grab the first chance to fly from the struggle – foolishly electing Baltar President (and foolishly permitted to do so by Adama and sitting President Laura Roslin when they decide not to falsify the election's results to head off this outcome), and leaving their ships for the soil of New Caprica, with the result that Democracy and Excessive Respect for the Law have led to the exchange of the bitter struggle for the even more bitter subjugation by the Enemy. (The second season, in fact, ends with a shot of Cylons marching down the settlement's single muddy street like the Wehrmacht parading through Paris after its capture in 1940 – an image all the more loaded in the American imagination by Anglo-Saxon perceptions of French arms during World War II – and the next begins with Baltar playing quisling to the Cylon occupiers.)

Naturally, it falls to the soldiers (who despite being the show's principal repositories of virtue are no exceptions to the spectacle of human degradation that is the cast of characters, from the pathetic Saul Tigh to the broken Kara Thrace to the tragic Felix Gaeta) to save the civilians from themselves yet again. However, even their heroics accomplish only so much. The Apocalypse runs its course, after which those who had indulged in the luxuries of the wicked cities finally return to the Land. Nonetheless, it is to be only a matter of time before they begin the same stupid cycle of rebuilding and destruction all over again – while the rather smug "angels" manipulate us toward some unknown end.

This was all taken for "gritty" – which is to say, "realistic." Yet one could more precisely, substantively and usefully describe it all as authoritarian, militarist, xenophobic, obscurantist, anti-rational and anti-modern, and just plain misanthropic, which is to say that it conforms to "reality" as seen only from a very specific part of the political spectrum.1 Such a reading of the show is, if anything, encouraged by the presentation of so much of what happened on the show as relevant to our contemporary politics (through its constant, shamelessly sensationalist evocations of the War on Terror).2

That such comments have not been far more common says a great deal about the political moment in which the show appeared, and what it takes to be "realistic."3 It also says a great deal about our moment that this is the show which has supplanted Star Trek as the template for science fiction television, its considerable influence evident in recent efforts like Stargate: Universe and the reimagined V (which reimagining is in itself a striking story of political inversion). Ken MacLeod has described this condition as well as anyone I can think of: "humanity as a rational and political animal died in 1979, and went to hell." The new Galactica, representative of the anti-humanist polar opposite to what Star Trek (once) stood for, is a story we told ourselves there.4

1. The original Galactica has also been noted for its right-wing politics. Still, many of the changes took it further in that direction, like the emphasis on humanity's nature as fallen, and the treatment of particular characters, like the transformation of Gaius Baltar from a power-hungry politician into a horny, status-seeking scientist with an accent the show's primarily North American audience would regard as foreign. There is, too, the addition of the civilian-military power struggle, which had Adama in a tug of war with President Laura Roslin, presented here as out of her depth, as she had been "only" a teacher earlier – not just a member of a profession held in low regard among Americans (and a lightning rod for the same anti-intellectualism we see in the treatment of Baltar), but one identified with the "liberal establishment" conservatives frequently demonize.
2. The show's first episode "33" had the heroes shooting down a hijacked passenger craft, while later in that same season Cylons were seen blowing themselves up like suicide bombers, captured Cylons got waterboarded, etc. The result was to equate the September 11 terror attacks with the annihilation of a twelve-planet civilization, al-Qaida with the military might of the Cylons, and the context of a handful of refugees in a flotilla of life boats with that of the United States today – comparisons which entail so much exaggeration as to render any analogy useless (while the tone was usually that of FOX News in Space).
3. Admittedly, the show did not adhere to the view described with perfect consistency, at least after the first two seasons. Indeed, some argued that the politics were quite the opposite of those described here in season three. In particular, much was made of the show's protagonists becoming resistance fighters against the occupying Cylons in that season's first episodes, with many reviewers suggesting an equation of the human "insurgents" with present-day Iraqis, while subsequent episodes like "Dirty Hands" and "The Woman King" entered new territory by raising questions of class, labor and bigotry. (Indeed, Jonah Goldberg adored the first two seasons, and hated the later ones, which thoughts he shared in a post pointedly titled "How Politics Destroyed a Great TV Show" – which is to say, how the merest whiff of liberal politics made intolerable to him a show he'd once enjoyed because of its agreement with his world-view.) However, the extent and depth of the changes should not be overstated. They did not alter the basic premises of the series, and in any case, were quickly drowned in the pseudo-religious muddle already moving to the fore.
4. Ironically, the new show's creator, Ronald D. Moore, worked as a producer and writer on the Star Trek franchise, scripting nearly sixty episodes of Star Trek: The Next Generation, Star Trek: Deep Space Nine and Star Trek: Voyager, as well as the films Star Trek: Generations and Star Trek: First Contact.

Wednesday, April 25, 2012

Making Iron Man 3: A Geopolitical Perspective

It now appears that Chinese company DMG will invest nine figures in the production of Iron Man 3 – an unprecedented collaboration between a Chinese business and Hollywood, which will also see DMG distributing the film in China.

I was immediately struck by the irony of the news given not just the fact that Iron Man's most famous antagonist is the Mandarin (as many a comic book fan has already pointed out in the comments pages to various reports of this development), but the story of this particular hero. Tony Stark, after all, is just a Victorian Edisonade protagonist updated for the world of the Cold War-era military-industrial complex – who began his adventures while fighting "Asian Communism" in Vietnam (the Cold War hawkishness of the early '60s Marvel comics being especially pointed here).

Certainly much has changed in world politics in the half century since that tale was first penned. Yet, while some more recent writers (like Warren Ellis) have offered relatively nuanced treatments of the comic, the movies pretty much stuck with the simplicities of the original vision, trading the Cold War for the War on Terror, and Vietnam for Afghanistan, while playing up the idea of Iron Man as an unapologetic embodiment of U.S. military superiority, and military interventionism – which one would certainly imagine to be problematic from a Chinese perspective.1 In the second film, China is even cited as a possible antagonist to the U.S. (when Justin Hammer names it along with Iran and North Korea while speculating about potential foreign buyers of the Iron Man technology). Moreover, there has been reason to think that such politics matter, in the wake of the complications faced by the producers of the recent remake of Red Dawn.2

Of course, it may simply be that business trumps politics in this case. The attraction of this particular business opportunity, and perhaps, a closer relationship with Hollywood over the long-term, may appear too great to resist (especially with a staggering $3 trillion worth of foreign exchange burning holes in that nation's pockets, and the prospect of the movie being partially filmed in China itself). Additionally, while it is hard to think of a major comic book hero that would be more offensive to the sensibility of a country run by a Communist Party than Tony Stark, China's establishment has long since ceased to be Communist in anything but name – and in the post-Mao era in which "To be rich is glorious," and the Chinese Dream looks not unlike the American Dream, it could be that a figure like Stark (a tech industry Gary Stu if ever there was one) enjoys a greater appeal than may seem the case at first glance.

1. For a critical take on the politics of the Iron Man films, check out Cristobal Giraldez Catalan's review of the first movie and Hiram Lee's reviews of both the first and second films.
2. Worried by Chinese disapproval, the producers changed the absurd premise of a Chinese invasion of the U.S. to an even more absurd one in which North Korea appears as the would-be conqueror of the United States – by editing the already-shot film, a procedure I expect will appear clumsy when the film actually hits theaters. (Incidentally, the scenario of a North Korean invasion and occupation of the U.S. was presented in last year's video game Homefront, written by John Milius, director of the original Red Dawn.)

Sunday, April 15, 2012

Remembering the Titanic

I have at times wondered why, a century after the event, the sinking of the Titanic continues to receive as much attention as it does. There have been bigger ships (the supertankers and aircraft carriers aside, there have been bigger liners, like today's colossal Oasis class ships), and deadlier maritime disasters (like the sinking of the Wilhelm Gustloff, which killed more than 9,000, and the Goya, which killed more than 6,000, compared with the 1,500 who died when the Titanic went down). The sinking of another British ocean liner, the Lusitania, just three years later, took nearly as many lives (almost 1,200), and had a far greater significance for the course of world history given its impact on international opinion during World War I.

Part of this seems a reflection of the memorialization of the event, which in turn has made it better known and therefore more likely to be memorialized yet again, as has certainly happened on the movie screen – in 1958's A Night to Remember (adapted by Eric Ambler from Walter Lord's successful 1955 book), in 1964's The Unsinkable Molly Brown (based on the hit Broadway musical), and of course, in 1997, James Cameron's Titanic (one of the biggest blockbusters in history). But this hardly explains why the event was memorialized in the first place, or why the retelling of the story as fiction and nonfiction continues to find an interested audience.

It seems notable that the Titanic was a Belfast-built ship of the British White Star line, bound from Southampton, England for New York, which sank off Newfoundland, Canada – and did so in peacetime, when ships might have been expected to safely cross the ocean, and sailing was associated with pleasure rather than necessity and danger. By contrast, the Gustloff and Goya were German ships sunk by Soviet submarines in the Baltic in the last months of World War II, denying them such resonance for the English-speaking world. At the same time, the absence of the obvious political freight of events like the sinking of the Gustloff, or the Lusitania, from the story of the Titanic; the remembrance of the ship as the epitome of luxury in its era; and the presence of numerous celebrities of the day on its passenger list (members of the Astor and Guggenheim families among them); make it suitable raw material for those inclined to present the human experience as a collection of "stories for the amusement of the comfortable classes" (as Gore Vidal once said of a particular bestselling historian).

Yet, even if one can approach the story in this way, the event also lends itself to an extraordinary range of allegorical readings and commentary. One can see in the disparities of wealth and position among those who journeyed aboard the ship, the collision with the iceberg, the handling of the accident and the aftermath a story of the technological hubris, corporate irresponsibility, and social inequality of an era that in its fundamentals is not much different from our own. (James Cameron, certainly, has highlighted this side of the story in his comments about his film.) At the same time, the 1912 disaster can appear like one of the last moments of the long nineteenth century that has been such an object of nostalgia, and a foreshadowing of the disasters that lay ahead. (Indeed, the first episode of Downton Abbey opened with a telegram regarding the ship's sinking, and the death of the heir to the titular estate.) In this light, the fascination of the Titanic appears a miniature version of the hold that period has on our imaginations – the Victorian-Edwardian era both as we knew it, and as it might have been.
