Wednesday, May 30, 2012

Looking Back: Ray Kurzweil’s 2009

Originally Published In THE FIX, January 1, 2009

Ray Kurzweil’s 1999 classic, The Age of Spiritual Machines, devoted a full chapter to his predictions for the year 2009. Given that this year is already upon us, it only seems appropriate to take a look back at the predictions made for it by a thinker whose influence on science fiction has been so vast.

I will not attempt here to engage with every one of his prognostications, some of which were vague, did not lend themselves to easy verification, or were simply commonplaces that had little to do with the book's area of concentration. Instead I will focus on the relatively unambiguous claims, particularly those concerning specific developments he assumed would have materialized not as clunky, cranky, buggy, unmarketable prototypes forever "in development," or novelties or luxuries for people with too much money and too little sense, but as products refined enough and affordable enough to proliferate widely. (Readers of that book will remember that there are many of these, several of which he reasserted in his discussion of the "2010 scenario" in his more recent The Singularity is Near, which appeared in 2005.) Additionally, given that the changes in information and communications technology are the linchpin of everything else here, it would only be fair to focus on those.

It must be noted that some of his predictions proved at least partially correct, such as his observations that

• Local area networks and wireless technology would become widespread.

• The downloading of content from the web would increasingly replace the transfer of physical objects.

• There would be increasing convergence between various media via the web.

• The transition from 2-D to 3-D chip technology would be clearly underway.

• A $1,000 PC ($1,300 in 2008 dollars) would be capable of 1 trillion calculations per second.

However, this was greatly outweighed by the things he clearly got wrong.

• There would be a significant reduction in the size and weight of personal computers below that of the notebooks of 1999--a banality, except that "significant" in this case meant that PCs could now be embedded in clothes and jewelry.

• Rotating memories (like hard drives, CD-ROMs and DVDs) would be on their way out, in favor of "purely electronic" memory.

• Speech-recognition software would make text primarily voice-created, so that keyboards would be increasingly vestigial.

• Personal computers would be equipped with facial recognition hardware and software adequate to let them recognize their owners on sight.

• Print-to-speech and speech-to-text readers would become so capable, compact and affordable as to be standard equipment taken everywhere (and indeed, eliminate many of the handicaps of blindness and deafness in the process).

• Comparable navigational devices, with which users could communicate verbally, would enable the blind to find their way.

• Paraplegics would walk using computer-controlled orthotic devices.

• Long-distance driving will be automated by "intelligent roads."

• Translation software (letting you speak in English and be heard in Japanese, and vice-versa) would be standard in phones.

• "Smart paper"-quality visual displays will be standard in our electronic devices.

• Eyeglasses-based displays would be in routine use.

• Most reading (as in school) would be done off electronic displays, with e-book reader-style devices becoming the most common way of reading print media.

• Speakers would be replaced by miniaturized, chip-based devices generating 3-D sound.

• Not only would "responsive and precise" Language User Interfaces be commonly available, but "intelligent assistants" with "natural-language understanding, problem solving, and animated personalities [would] routinely assist with finding information, answering questions, and conducting transactions."

• Virtual reality, while still primitive, would be a standard consumer item (replacing, for instance, 1990s-style chat rooms), lacking only "surround" tactile environments.

• Autonomous nano-machines would have been demonstrated (though they would not yet be commercially useful).

Inevitably, his guesses about the direct consequences of technological change were off--for instance, his expectation that computing technology would be so powerful and widely available that teachers would effectively become "human resources" workers concentrating on the motivation, psychological well-being and socialization of their students rather than the imparting of knowledge or intellectual training as such.

Naturally, this carried over to larger, more significant mistakes at the big-picture level. Notably, Kurzweil argued that, at least in the U.S., the "ten years leading up to 2009 [will] have seen continuous economic expansion and prosperity," including a booming stock market and price deflation, with the continuing tech boom (despite occasional "corrections") leading the way. He also argued that the underclass would be stable in size, and politically neutralized by public assistance and broad general affluence.

This prediction proved to be such utter nonsense that I will not bother to tear it apart here; virtually every word I have ever published about economics attests to how differently things have turned out. The same goes for his guesses about the future of warfare, in which he said that computer and communications security would be the main mission of the U.S. Defense Department, and that when there is violence of a more conventional kind,
Humans are generally far removed from the scene of battle. Warfare is dominated by unmanned intelligent airborne devices. Many of the flying weapons are the size of birds, or smaller.
For the moment, never mind the implicit dehumanization of the enemy, or the civilians they will not easily be extricated from, and consider the sense in which he meant this: that U.S. troops will not be there at the scene of battle. Given the attention lavished on aircraft like the Predator, it may seem that he had something there--but one need only look at the massive and costly U.S. commitment of ground troops to Iraq to see that this claim is total nonsense.

Admittedly, Kurzweil's errors will not surprise those who revel in such errors for their own sake. However, we simply cannot get away from the problem of predicting the future, as pressing issues like climate change and resource depletion remind us. For that reason, it is far more useful to think about why he was wrong rather than wallow in the "futility" of ever trying to think ahead, gratifying as some find this to be.

My assessment is that Kurzweil's predictions had two fundamental weaknesses. One is that he succumbed to "futurehype" about a few key technologies (in particular neural net-based pattern recognition), expecting that they would be much more advanced than they really were. The other is his facile view of the "softer" side of his subject, in particular politics, macroeconomics, and I daresay, ecology, given the role of oil supplies in our current troubles. Both are inextricable from the prevailing "wisdom" of the late '90s tech boom that was just about peaking when he was writing.

Of course, I should acknowledge that Richard Dooling, in his recent book Rapture for the Geeks: When AI Outsmarts IQ, was more forgiving in his assessment of some of these same points. Additionally, one can argue that this is only the beginning of 2009, and that a truly fair assessment will have to wait for New Year's Day 2010; or failing this, that he might prove to have been just slightly off where some of his technological predictions are concerned, perhaps pointing to the recent proliferation of e-book readers.

Nonetheless, other predictions clearly seem much further off, like the claims for telephone call translation and virtual reality. And the fact that so much of this change has come along much more slowly than he suggested is itself significant; the acceleration of technological evolution is the very foundation of the technological Singularity to which he has devoted so much ink, and with which he has become so closely identified.

Tuesday, May 22, 2012

Reading Bulldog Drummond

I was recently surprised to find that the Internet Movie Database lists a Bulldog Drummond movie as in development for 2013. There is so little information available about the project that it may just be a "myth of IMDB" (like The Brazilian Job). Even if that is the case, that there should be a myth at all is a reminder of the utter unwillingness of movie-land to completely let go of any franchise.

Until recently reading H.C. "Sapper" McNeile's original novel, Bulldog Drummond (1920), I knew the character only secondhand, from mentions of him in studies of spy fiction I'd read, and from a couple of movies that treated the material rather loosely – 1969's Some Girls Do (the second and last in a series of films that updated the character for the '60s), and 1983's Bullshot Crummond (a parody of the series I found very entertaining despite my limited familiarity with the butt of its jokes, and which seems to me underappreciated).

In that first book, Bulldog is the nickname of Captain Hugh Drummond, a decorated British Army officer of World War I who has since returned to private life as a "sportsman and a gentleman." Quickly getting bored with the leisure his wealth affords him, he decides, initially as something of a lark, to put an ad in the newspaper in which he indicates a hunger for "diversion. Legitimate, if possible; but crime, if of a comparatively humorous description, no objection. Excitement essential."

The ad is soon answered by Phyllis Benton, the tale's damsel in distress and romantic interest, whose father has been drawn into the machinations of one Carl Peterson, and his henchman Henry Lakington. Drummond finds plenty to interest him here, and soon enough, with the help of some of his ex-Army friends, is taking on a plot for a takeover of Britain by left-wing revolutionaries financed by a pack of greedy and vengeful foreigners.

It has often been said that Drummond can be thought of as a proto-James Bond. There is the protagonist's privileged social background, his wartime military experience, his penchant for sport and juvenile behavior (along with his buddies, he seems very much the upper-class hooligan), his knack for getting into trouble and then back out of it again. Like Bond he lives in a comfortable apartment in an upscale London district, tended by loyal personal servants (James Denny and his wife), and is very particular about his cigarettes (with Virginia tobacco on one side of his case, and Turkish on the other).

There is, too, the feel of his adventures: the international criminal conspiracy headed by an evil genius (Carl Peterson) with an affinity for exotic and deadly pets, weapons and cronies (a cobra, a killer gorilla, knock-out gas), and a gadget-packed lair; his encounters with alluring girls good and bad (besides Phyllis, there is Peterson's daughter Irma); the combination of gentility and murderousness in his confrontations with the arch-villain (fully on display in such scenes as Peterson's visit to Drummond's apartment early in the story); the bad guys who in their arrogance or sadism prefer not to dispose of Drummond immediately, making bouts of unconsciousness and rounds of capture and escape a surprisingly regular feature of his life.1 The course of Drummond's investigations also anticipates Bond's, particularly his habit of baiting the bad guys into revealing bits of information, and his coincidentally eavesdropping on villains at the right time to catch crucial bits of their plans (often, during those periods when they think he's safely locked up). He even picks up an American detective as a helper in Jerome K. Green of the New York Police Department (a proto-Felix Leiter to Drummond's proto-Bond). And when it is all over, Peterson, like Ernst Stavro Blofeld in On Her Majesty's Secret Service (1963), manages to escape justice, ensuring that there will be a next time.

That is not to say that there are not significant differences. Bond is, after all, a professional operative working as part of a massive government organization rather than a restless amateur, and something of a cynic, world-weary, in a way that Drummond shows no sign of being. This attitude carries over to the treatments of politics in their respective adventures. While Fleming's plots were typically right-wing and nationalistic in their conception (so much so that Umberto Eco took them ironically), and the author was not at all shy about expressing his opinions about politics (or anything else), he never comes close to the long-winded, heavy-handed and contempt-filled denunciation of the left, the working class, and intellectuals in general to which Drummond and company devote most of the last tenth of the novel. (One might also add that Bulldog is much less debonair than 007, described by the author as possessed of a "cheerful ugliness," and at any rate is, despite Irma's charms, a one-woman man, marrying Phyllis and ending his first adventure on his honeymoon with her.) Nonetheless, Fleming's debt to Sapper is such a vast one that Julian Symons was substantially right when he described the James Bond novels as "the 'Sapper' pipe-dream converted to the mood of the fifties," and Bond as a "more sophisticated version of Bulldog Drummond" (270) in his classic study of the thriller genre, Mortal Consequences.

Sunday, May 20, 2012

Dirk Pitt Returns?

Recently reviewing the stats for this blog, I was struck by the number of web surfers who found their way to it looking for information about the possibility of Hollywood making more Dirk Pitt movies, whether sequels to Sahara (2005), or a new, "rebooted" series which starts with a clean slate (to judge by the frequency with which search keywords like "dirk pitt movie 2" and "any more dirk pitt movies" turn up in the section titled "traffic sources"). This seems to be mainly because of my earlier post, "Returning to Sahara: The Dirk Pitt Novels on Screen."

That post considered the pitfalls involved in translating Clive Cussler's novel Sahara from the page to the screen, rather than any plans for more Dirk Pitt films, about which I have heard, seen or read absolutely nothing – which is exactly what I would expect. The fact that Sahara was a commercial disappointment ordinarily would not rule out the studios taking another shot. After all, Hollywood has proven nothing short of fanatical about trying to squeeze every penny it can out of any and every established IP, and as fans can attest, the Dirk Pitt novels sure look like obvious source material for big-budget action-adventure films – especially as this franchise remains very much alive, new entries continuing to regularly appear on bestseller lists. Of course, as I pointed out in my previous post, adapting Pitt's adventures to the big screen is trickier than it looks, given how dated some of the plots have become (in light of changes in world politics, and industry sensibilities), but Hollywood has never been averse to seizing on a brand name and a few other select trappings (a character, a gimmick), and dispensing with everything else, and in the process launching a commercially viable film sequence, even as it leaves the purists unsatisfied. (They did it with James Bond, after all. And Jason Bourne. And Star Trek. And others far too numerous to list here.)

However, it is worth remembering that Sahara wasn't just a disappointment, but could be described without any hyperbole whatsoever as one of the biggest flops in movie history – and that this came on top of an earlier production of a Dirk Pitt film (1980's Raise the Titanic) which could be described in exactly the same terms. Indeed, one might say Cussler was lucky to see his series get a second chance, even if it took twenty-five years for this to happen. The way that extraordinary opportunity went has left the material seeming almost cursed – and silly as that may sound, one should not overestimate the rationality of this business. (In fact, with so much money at stake, the factors to be weighed in deciding its use so intangible, and every facet of commerce so steeped in a casino mentality, one should bet on Hollywood's irrationality every time.) Perhaps even more frightening than any such curse, Cussler himself did not manage to have a good relationship with the producers, the result being a protracted, grinding legal battle in the aftermath of the film's financially ruinous shooting and release – and that is something that will frighten even the least superstitious of Suits.

All that being the case, it may be a long time before anyone takes another crack at this series, if ever. However, for what it's worth, you can be sure that I will have something to say about any new developments in that story here.

Review: The Origin of Consciousness in the Breakdown of the Bicameral Mind, by Julian Jaynes

Boston: Houghton Mifflin, 1976. 467 pp.

I first encountered Julian Jaynes' ideas about bicameralism through Jack Dann's classic The Man Who Melted five years ago. I enjoyed the novel greatly, but was skeptical of the theory underlying it. Jaynes' theory, after all, holds that consciousness is not something intrinsic to life or humanity, but a cultural creation, which did not emerge until the second millennium B.C. It seemed to me impossible that humanity could have ever been without consciousness, let alone reached an Iron Age level of civilization without it. How, for instance, could the "bicameral" mind have organized such a feat as the construction of the Great Pyramid of Cheops? How, too, could such a theory possibly be reconciled with the indications that even some higher animals (apes, dolphins) possess self-awareness?

Jaynes' book did not fully satisfy me with its answer to the first question, and didn't address the second question at all, but it did make the theory considerably harder to dismiss. In discussing the book, it seems appropriate to start with the author's definition of consciousness--which is not a capacity for sensation and observation, memory, or the ability to learn, think or reason. (Even these last, he argues, are predominantly unconscious processes.) Rather, consciousness is a model of the world around us that we carry around in our heads, enabling us to introspect, visualize and take decisions in a self-aware manner. Built upon the capacity of language to create metaphors, it consists of selected ("excerpted") details, arranged according to a sense of space and "spatialized" time, and organized in narrative form, with dissonant information reconciled around an image of the self (as seen by oneself and by others)--and on the whole, it is a relatively small part of our mental life for all its complexity.

Prior to the arrival of consciousness, Jaynes posits the "bicameral" mind as the dominant thought mode. For bicameral persons, the left and right hemispheres of the brain worked rather more separately than is the norm today, with an important aspect of that separation being the generation of auditory hallucinations, especially in moments when unconsciously acquired habit was inadequate and stress was high. Their mental life, in fact, is compared by Jaynes to contemporary observations of schizophrenia. A key difference, however, is that this was the norm rather than the exception. Indeed, he suggests society was actually structured hierarchically around such hallucinations, with priest-kings hallucinating the voices of gods (in many cases originating in predecessors whose voices they continued to "hear" after their passing), and subordinates at each level the voices of the authority above them at such times, a control reinforced by such hallucinatory aids as centrally located houses of worship and ubiquitous depictions of deities--themselves organized as hierarchically as humans were. (This kept the conflicting voices that torment so many schizophrenics in line; as Jaynes points out, such sorting is not unknown among those suffering the condition today.)

Jaynes suggests that bicameralism may have enabled the beginnings of civilization by making human beings subject to such a method of control. However, it did not endure. The intrinsic frailty of this system apart, it grew decreasingly effective as polities expanded--as with the empire of Hammurabi. An increased use of writing was one way of compensating, but the activity probably weakened bicameralism as well. The Bronze Age collapse (theorized by Jaynes as having been a result of the Theran catastrophe) knocked over these vulnerable structures, flinging together uprooted peoples forced to reckon with the inappropriateness of earlier modes of conduct, and with each others' differences (which may have led them to think that difference was internal for the first time), in harsh circumstances where deceit (thinking one thing and appearing to do another) had a high survival value. (Jaynes speculates also that there may have been an aspect of natural selection in the process, and that the development of narrative epics like Homer's Iliad contributed to the process as well.)

Over the millennium or so that followed, the conception of our behavior's driving forces was internalized, intellectualized (recognized as mental processes) and synthesized into consciousness as we now know it, which had clearly arrived by the time of such figures as Solon and Pythagoras, whose notions of mind, volition and responsibility are recognizable as similar to those we possess today. Still, a nostalgia for the certainties of divine voices speaking to us personally remained, in the laments of our myths of a "fall" which has distanced us from the gods, in methods of divination which attempt to reach those increasingly distant deities (like auguries, oracles, prophets and possession), and even in "scientism" (the quasi-religious treatment of scientific inquiry).1

Examining the argument over three decades after Jaynes' initial presentation of the theory, I find that, my initial objections aside, many of the smaller claims on which he built his larger argument (like the nature of consciousness) remain a subject of dispute. His focus on the eastern Mediterranean region--he looks principally at Mesopotamia, Egypt, Greece and the Biblical Israelites--appears rather narrow. And the evidence available for his examination even of this history was necessarily limited, and full of ambiguities and gaps, from the uncertainty regarding the dating of particular works, to the meanings of key terms, to the question of how representative the works in question really were of how people lived and thought at the time (all of these evident in, for example, his reading of ancient Greek literature). Still, Jaynes is quite aware of the scantiness of some of the material, and the speculative nature of many of his claims, frequently acknowledging that what he presents is but a starting point for a fuller investigation.

Of course, the proof that would really convince a skeptic--proof that a society can actually function this way, let alone proof that this really was the way the species lived--may be unobtainable. However, what Jaynes does with the material available to him is genuinely impressive, the theory at the least fitting a great many facts. He makes a contribution to the indisputable view that our recognition and expression of abstract thought and subjectivity has grown enormously over these millennia (a view going back at least to G.W.F. Hegel), and a good case that schizophrenia and hallucination played a larger role in antiquity than we appreciate. In doing so he points the way to a significant part of the human story--as is so often the case with those who dare to offer Big Ideas, even when they do not persuade us completely of their version of the larger picture.

1. As Jaynes notes, in the second millennium B.C., human beings heard the voices directly; in the first millennium B.C., they were only able to access those voices through exceptional individuals, sites and efforts, like prophets and oracles; in the first millennium A.D., even these fell increasingly silent, leaving behind the texts based on them; and in the past thousand years, those texts have seen their authority eroded.

Friday, May 18, 2012

On the Word "Lifestyle": A Postscript

Half a year ago I discussed on this blog the contemporary use – and, much more often, misuse – of the word "lifestyle."

Your standard of living is a matter of how much money you make.

Your quality of life is an evaluation of your general well-being – of which income is only one part, as matters like physical health, leisure and social life enter into it.

A way of life is something you're born into, or adopted into, or adopt yourself. A culture has a way of life, and one shares that way of life when one belongs to that culture.

A lifestyle is something an individual picks out of a catalog. It is something few people actually have, and it is not to be confused with any of the others.

Saturday, May 5, 2012

The Anti-Humanism of Battlestar Galactica

Humanity, the reimagined Battlestar Galactica tells us, is Fallen, progress an illusion that dies when the creations of that "flawed creation," Man, finally turn on it – and do so successfully because those flawed human beings failed to understand that peace is just a period of preparation for the next round of war. (The first scene of the show, in fact, depicts the Cylons' strike on the Armistice Station that represents the Colonies' peace overture to them – adding insult to injury.)

Humanity also failed to appreciate that not only could the Enemy not be reasoned or bargained with, but that it had penetrated within, its ruthless agents and hapless dupes undermining us to the point that we had little defense against the external assault that finished civilization off in hours.

Trying to examine the history leading up to it, to actually understand the origins of this situation, is pointless, and even traitorous. ("How this could've happened, why it happened – none of that matters right now. All that does matter is that as of this moment we are at war," Commander William Adama says in his call to arms during the pilot.) And that's not the only way in which too much truth is an inconvenience. In the wake of the disaster, the man who claims the mantle of leadership bolsters his legitimacy with a "useful" lie promulgating false hope (regarding Adama's knowledge of the location of Earth), and the forging of a consensus through appeals to the irrational, and the pressure to conform ("So say we all!").

Those who do not conform, like the dissenter and the radical (e.g. Tom Zarek), repeatedly justify the suspicion and contempt with which they are treated, as does the intellectual (e.g. Gaius Baltar), who is not only the very source of our predicament in his misuse of his technical expertise, but a self-seeking traitor at every turn. The rise of such men to a position of leadership is, of course, ruinous, not least because the masses are so easily led astray. Proving once more that they cannot be trusted to take care of themselves, that they are weak and soft and unwilling to face the Hard Facts of Life, the People grab the first chance to fly from the struggle – foolishly electing Baltar President (and foolishly permitted to do so by Adama and sitting President Laura Roslin when they decide not to falsify the election's results to head off this outcome), and leaving their ships for the soil of New Caprica, with the result that Democracy and Excessive Respect for the Law have led to the exchange of the bitter struggle for the even more bitter subjugation by the Enemy. (The second season, in fact, ends with a shot of Cylons marching down the settlement's single muddy street like the Wehrmacht parading through Paris after its capture in 1940 – an image all the more loaded in the American imagination by Anglo-Saxon perceptions of French arms during World War II – and the next begins with Baltar playing quisling to the Cylon occupiers.)

Naturally, it falls to the soldiers (who despite being the show's principal repositories of virtue are no exceptions to the spectacle of human degradation that is the cast of characters, from the pathetic Saul Tigh to the broken Kara Thrace to the tragic Felix Gaeta) to save the civilians from themselves yet again. However, even their heroics accomplish only so much. The Apocalypse runs its course, after which those who had indulged in the luxuries of the wicked cities finally return to the Land. Nonetheless, it is to be only a matter of time before they begin the same stupid cycle of rebuilding and destruction all over again – while the rather smug "angels" manipulate us toward some unknown end.

This was all taken for "gritty" – which is to say, "realistic." Yet one could more precisely, substantively and usefully describe it all as authoritarian, militarist, xenophobic, obscurantist, anti-rational and anti-modern, and just plain misanthropic, which is to say that it conforms to "reality" as seen only from a very specific part of the political spectrum.1 Such a reading of the show is, if anything, encouraged by the presentation of so much of what happened on the show as relevant to our contemporary politics (through its constant, shamelessly sensationalist evocations of the War on Terror).2

That such comments have not been far more common says a great deal about the political moment in which the show appeared, and what it takes to be "realistic."3 It also says a great deal about our moment that this is the show which has supplanted Star Trek as the template for science fiction television, its considerable influence evident in recent efforts like Stargate: Universe and the reimagined V (which reimagining is in itself a striking story of political inversion). Ken MacLeod has described this condition as well as anyone I can think of: "humanity as a rational and political animal died in 1979, and went to hell." The new Galactica, representative of the anti-humanist polar opposite to what Star Trek (once) stood for, is a story we told ourselves there.4

1. The original Galactica has also been noted for its right-wing politics. Still, many of the changes took it further in that direction, like the emphasis on humanity's nature as fallen, and the treatment of particular characters, like the transformation of Gaius Baltar from a power-hungry politician into a horny, status-seeking scientist with an accent the show's primarily North American audience would regard as foreign. There is, too, the addition of the civilian-military power struggle, which had Adama in a tug of war with President Laura Roslin, presented here as out of her depth, as she had been "only" a teacher earlier – not just a member of a profession held in low regard among Americans (and a lightning rod for the same anti-intellectualism we see in the treatment of Baltar), but one identified with the "liberal establishment" conservatives frequently demonize.
2. The show's first episode "33" had the heroes shooting down a hijacked passenger craft, while later in that same season Cylons were seen blowing themselves up like suicide bombers, captured Cylons got waterboarded, etc. The result was to equate the September 11 terror attacks with the annihilation of a twelve-planet civilization, al-Qaida with the military might of the Cylons, and the context of a handful of refugees in a flotilla of life boats to that of the United States today – comparisons which entail so much exaggeration as to render any analogy useless (while the tone was usually that of FOX News in Space).
3. Admittedly, the show did not adhere to the view described with perfect consistency, at least after the first two seasons. Indeed, some argued that the politics were quite the opposite of those described here in season three. In particular, much was made of the show's protagonists becoming resistance fighters against the occupying Cylons in that season's first episodes, with many reviewers suggesting an equation of the human "insurgents" with present-day Iraqis, while subsequent episodes like "Dirty Hands" and "The Woman King" entered new territory by raising questions of class, labor and bigotry. (Indeed, Jonah Goldberg adored the first two seasons, and hated the later ones, which thoughts he shared in a post pointedly titled "How Politics Destroyed a Great TV Show" – which is to say, how the merest whiff of liberal politics made intolerable to him a show he'd once enjoyed because of its agreement with his world-view.) However, the extent and depth of the changes should not be overstated. They did not alter the basic premises of the series, and in any case, were quickly drowned in the pseudo-religious muddle already moving to the fore.
4. Ironically, the new show's creator, Ronald D. Moore, worked as a producer and writer on the Star Trek franchise, scripting nearly sixty episodes of Star Trek: The Next Generation, Star Trek: Deep Space Nine and Star Trek: Voyager, as well as the films Star Trek: Generations and Star Trek: First Contact.

Wednesday, April 25, 2012

Making Iron Man 3: A Geopolitical Perspective

It now appears that Chinese company DMG will invest nine figures in the production of Iron Man 3 – an unprecedented collaboration between a Chinese business and Hollywood, which will also see DMG distributing the film domestically.

I was immediately struck by the irony of the news given not just the fact that Iron Man's most famous antagonist is the Mandarin (as many a comic book fan has already pointed out in the comments pages to various reports of this development), but the story of this particular hero. Tony Stark, after all, is just a Victorian Edisonade protagonist updated for the world of the Cold War-era military-industrial complex – who began his adventures while fighting "Asian Communism" in Vietnam (the Cold War hawkishness of the early '60s Marvel comics being especially pointed here).

Certainly much has changed in world politics in the half century since that tale was first penned. Yet, while some more recent writers (like Warren Ellis) have offered relatively nuanced treatments of the comic, the movies pretty much stuck with the simplicities of the original vision, trading the Cold War for the War on Terror, and Vietnam for Afghanistan, while playing up the idea of Iron Man as an unapologetic embodiment of U.S. military superiority, and military interventionism - which one would certainly imagine to be problematic from a Chinese perspective.1 In the second film, China is even cited as a possible antagonist to the U.S. (when Justin Hammer names it along with Iran and North Korea while speculating about potential foreign buyers of the Iron Man technology). Moreover, there has been reason to think that such politics matter, in the wake of the complications faced by the producers of the recent remake of Red Dawn.2

Of course, it may simply be that business trumps politics in this case. The attraction of this particular business opportunity, and perhaps, a closer relationship with Hollywood over the long-term, may appear too great to resist (especially with a staggering $3 trillion worth of foreign exchange burning holes in that nation's pockets, and the prospect of the movie being partially filmed in China itself). Additionally, while it is hard to think of a major comic book hero that would be more offensive to the sensibility of a country run by a Communist Party than Tony Stark, China's establishment has long since ceased to be Communist in anything but name – and in the post-Mao era in which "To be rich is glorious," and the Chinese Dream looks not unlike the American Dream, it could be that a figure like Stark (a tech industry Gary Stu if ever there was one) enjoys a greater appeal than may seem the case at first glance.

1. For a critical take on the politics of the Iron Man films, check out Cristobal Giraldez Catalan's review of the first movie and Hiram Lee's reviews of both the first and second films.
2. Worried by Chinese disapproval, the producers changed the absurd premise of a Chinese invasion of the U.S. to an even more absurd one in which North Korea appears as the would-be conqueror of the United States – by editing the already-shot film, a procedure I expect will appear clumsy when the film actually hits theaters. (Incidentally, the scenario of a North Korean invasion and occupation of the U.S. was presented in last year's video game Homefront, written by John Milius, director of the original Red Dawn.)

Sunday, April 15, 2012

Remembering the Titanic

I have at times wondered why, a century after the event, the sinking of the Titanic continues to receive as much attention as it does. There have been bigger ships (the supertankers and aircraft carriers aside, there have been bigger liners, like today's colossal Oasis class ships), and deadlier maritime disasters (like the sinking of the Wilhelm Gustloff, which killed more than 9,000, and the Goya, which killed more than 6,000, compared with the 1,500 who died when the Titanic went down). The sinking of another British ocean liner, the Lusitania, just three years later, took nearly as many lives (almost 1,200), and had a far greater significance for the course of world history given its impact on international opinion during World War I.

Part of this seems a reflection of the memorialization of the event, which in turn has made it better known and therefore more likely to be memorialized yet again, as has certainly happened on the movie screen – in 1958's A Night to Remember (adapted by Eric Ambler from Walter Lord's successful book), in 1964's The Unsinkable Molly Brown (based on the hit Broadway musical), and of course, in 1997's James Cameron's Titanic (one of the biggest blockbusters in history). But this hardly explains why the event was memorialized in the first place, or why the retelling of the story as fiction and nonfiction continues to find an interested audience.

It seems notable that the Titanic was a Belfast-built ship of the British White Star line, bound from Southampton, England for New York, which sank off Newfoundland, Canada – and did so in peacetime, when ships might have been expected to safely cross the ocean, and sailing associated with pleasure rather than necessity and danger. By contrast, the Gustloff and Goya were German ships sunk by Soviet submarines in the Baltic in the last months of World War II, denying them such resonance for the English-speaking world. At the same time, the absence of the obvious political freight of events like the sinking of the Gustloff, or the Lusitania, from the story of the Titanic; the remembrance of the ship as the epitome of luxury in its era; and the presence of numerous celebrities of the day on its passenger list (members of the Astor and Guggenheim families among them); make it suitable raw material for those inclined to present the human experience as a collection of "stories for the amusement of the comfortable classes" (as Gore Vidal once said of a particular bestselling historian).

Yet, even if one can approach the story in this way, the event also lends itself to an extraordinary range of allegories and comments. One can see in the disparities of wealth and position among those who journeyed aboard the ship, the collision with the iceberg, the handling of the accident and the aftermath a story of the technological hubris, corporate irresponsibility, and social inequality of an era that in its fundamentals is not much different from our own. (James Cameron, certainly, has highlighted this side of the story in his comments about his film.) At the same time, the 1912 disaster can appear like one of the last moments of the long nineteenth century that has been such an object of nostalgia, and a foreshadowing of the disasters that lay ahead. (Indeed, the first episode of Downton Abbey opened with a telegram regarding the ship's sinking, and the death of the heir to the titular estate.) Indeed, the fascination of the Titanic appears a miniature version of the hold that period has on our imaginations – the Victorian-Edwardian era both as we knew it, and as it might have been.

Monday, April 9, 2012

The 50th Anniversary of the James Bond Film Series

As fans of the James Bond film series know, this is the fiftieth anniversary of the franchise's launch with 1962's Dr. No. Naturally, the fact is being commemorated in manifold ways big and small, from a highly publicized reissue of the films on Blu-Ray, to exhibitions at film festivals from Toronto to Muscat, to retrospectives in the press.

Those offering comment on the subject typically note the series' "reinvention" of itself during its half century of life. However, such reinvention is not unproblematic. Indeed, I have found myself wondering about a question that no one in the entertainment press ever seems to ask: at what point is it time to let an IP go? To stop rebooting and updating and just leave things be and move on to something else?

The answer Big Media offers is "NEVER!" Anything their companies hold, and for that matter, anything with a name recognizable enough to be suitable for the high concept approach, is to be milked for all it can possibly give and beyond, for reasons I've already discussed here many a time. But for those with art rather than money on their minds, this answer is unsatisfactory. Instead I would say that it is time to let go when what is distinctive about the original ceases to be credible or acceptable, when it can only be presented to the audience ironically or apologetically or in exceedingly sanitized form, or with enormous strain.

It seemed to me that this had already happened with the Bond films when I wrote "The End of James Bond?" back in August 2010. Certainly a few stylistic touches remain to the present, like the opening shot through a gun barrel, the John Barry score, and the line "The name is Bond. James Bond." However, at the core of the earlier films there were also the following, rather more substantive elements:

1. A charged context of high-stakes conflict between world powers (like the Cold War, especially its early phase, in which détente had yet to come about, and the British Empire was breaking up but the metropole not yet reduced to the standing of a "normal" country).
2. The particular model of sophistication, machismo and hedonism represented by the central character (black tie settings, hard liquor, tobacco, skill at upper-class pastimes like golf and baccarat, gambling, flings with numerous women).
3. The plot formula, originally developed in Fleming's novels but ultimately perfected by the films, which has Bond up against freakish megalomaniacs with massive resources and global ambitions, and their even more freakish henchmen. The battle typically begins with Bond engaging the villain in games (golf, cards, etc.) in social settings mixing gentility with murderousness, evading the subsequent assassination attempts, becoming involved with girls good and bad, getting captured and escaping from elaborate death traps, and in the climactic confrontation (usually at the villain's over-the-top fortress) saving the world as the clock ticks toward oblivion – typically with the help of an arsenal of gadgets.

Due to the changes in Britain's international position (and the broader international situation) since 1953, 1962 or even 1989, the first of these has become possible only with a retro approach, as in Sebastian Faulks's Devil May Care (2008) - an experiment which has not been repeated in print, and seems an even less likely bet on screen. The second has largely been quashed by changing conceptions of glamour, luxury and the action hero (a guy in a tux no longer making the impression he used to), to say nothing of concessions to feminism and the New Puritanism. The third, deeply worn by heavy use (twenty films between 1962 and 2002, excluding 1967's Casino Royale and 1983's Never Say Never Again, as well as profuse imitation and parody), has also been abandoned in the aforementioned concessions, and more significantly, the pursuit of "gritty realism." Indeed, in their playing off the Bond films the Austin Powers movies (and to a more modest extent, 2002's xXx) differed from the innumerable earlier books, TV shows and films parodying 007 in their presentation of the series' tropes as not merely cartoonish or silly, but as dated – making the datedness of everything from the playboy attitudes to the villains' schemes central to many of their gags.

The result is that this anniversary seems to me rather a hollow one – like that of a marriage in which each spouse has forgotten what they ever saw in the other.

Friday, March 30, 2012

Game of Thrones: Season One

In some respects George R.R. Martin's A Song of Ice and Fire seems an obvious fit with HBO. Their programmers have long gravitated toward material that can be labeled "dark and gritty," and the success of Showtime's The Tudors has suggested an audience willing to watch this kind of thing when the characters wear period costume and live in castles. Martin's fiercely anti-romantic take on the high fantasy genre certainly fit that bill, full of the moral ambivalence and brutality and unsympathetic characters critics love to praise writers for writing - the jokes about "The Sopranos of Middle Earth" not too far off the mark as a description of its intrigues, which are driven not by the machinations of some external Evil impinging on a happy world, but the base ambitions and hungers of high nobility (though north of the wall, at least, there are also the stirrings of a common danger).

Additionally, Martin's novels marginalize the fantasy elements (magic, imaginary creatures and the like), particularly in the series' earlier volumes, making this kind of thing easier for critics and mainstream audiences to take (while relieving some of the pressure on the FX budget, which for the ten-hour series is likely less than a movie studio would bring to bear on a first-rank two-hour film). The fact that it has become routine for television networks to buy the rights to genre novels as a basis for their series (like FlashForward and The Vampire Diaries), and that Martin's books have as big a built-in audience as any recent work of speculative fiction not written for young adults (indeed, Martin was virtually the only non-YA fantasy author on last year's list of the top hundred selling books), doesn't hurt.

Yet, there were plenty of reasons for doubt about how the production would fare. The source novels are not just long but dense, packed with characters and intrigues which are both complicated and complex. My hardback edition of the first runs to 674 pages, not including the 20-page appendix listing the characters by house – and the story only gets bigger and more tangled with each succeeding volume. (Indeed, the fourth book of the series ended up being split into two volumes, each focusing on a different set of characters.) Translating that into a TV series which is both faithful and accessible is not a simple thing. Additionally, much of the tale is told through the eyes of very young characters, some of them children (Arya, Bran), hardly the kind of thing the channel seemed likely to accommodate. I wondered, too, if the show's production values would do Martin's world justice. (Martin himself has recently wondered what the Battle of the Blackwater would look like when it reaches the screen.) And at any rate, HBO's track record with speculative fiction has not been impressive.

On the whole, I was pleasantly surprised. The first episode was admittedly a bit shaky. Exposition-heavy, it felt a bit crowded and disjointed, the milieu thin. However, the storytelling quickly got smoother, and the world of Westeros more satisfactorily developed. I was ambivalent about a few of the tweaks (the softening of Cersei Lannister, the use of Renly Baratheon as a foil for his older brother rather than a younger version), but most of the adjustments - the scenes created from scratch, the alterations of events and the like - aided in clarifying the storyline, better establishing the characters (especially the non-viewpoint characters, who so often come off as opaque in Martin's writing), and holding together a sprawling epic. Certain scenes and bits of action are admittedly scaled down from the novels (like the depiction of Vaes Dothrak and the Twins, and the major battles), but never in a way that compromises the story (though one might wonder about their presentation of the Battle of the Green Fork). And of course, the cast has been deservedly praised (with Peter Dinklage notably picking up an Emmy for his performance as Tyrion Lannister - a rare win for genre television in the acting categories). As a reader of the original novels I came away satisfied, but I think the series would have been accessible and entertaining even if I had come to it without that background, and it has left me looking forward to season two, which starts airing Sunday, April 1.

C. Wright Mills and the Day Job

The sociologist C. Wright Mills concisely described a major problem facing those who must hold day jobs in his classic sociological study White Collar:
Alienation in work means that the most alert hours of one's life are sacrificed to the making of money with which to "live." Alienation means boredom and the frustration of potentially creative effort, of the productive sides of personality. It means that while men must seek all values that matter to them outside of work . . . they must be serious and steady about something that does not mean anything to them, and moreover during the best hours of their day, the best hours of their life (236).
For more about the book, check out my review at my other blog.

Monday, March 26, 2012

The Matarese Circle – Coming to a Theater Near You?

With the success of the Jason Bourne film series, Hollywood has unsurprisingly taken an interest in bringing other Ludlum novels to the screen – including 1979's The Matarese Circle - the book Ludlum published right before Jason Bourne's first adventure.

For the moment, it seems that the project is dead, but it could yet come back from development hell. If it does, the age of the novel (over three decades old now) presents challenges to filmmakers trying to update it for a twenty-first century audience, just like The Bourne Identity did. Indeed, they may be even bigger challenges.

Perhaps the first has to do with the novel's principal characters - American spy Brandon Scofield and Soviet operative Vasili Taleniekov - putting aside an old enmity to fight the titular villains. In this, Circle is very much a Cold War story, rooted in a now-vanished context, the loss of which will mean that putting an American and Russian agent together simply doesn't involve the same drama. Indeed, retaining the Russian nationality of Scofield's ally seems more like inertia than anything else given today's geopolitics.1

Much the same is the case with the novel's two central real-world themes – international terrorism, and corporate power. Terrorism is still topical, and this book doesn't have the disadvantage The Bourne Identity did of being centered on a specific, real-life terrorist, and the highly idiosyncratic strategy (to put it generously) for taking him down Ludlum dreamed up. Yet, the character of terrorism (or at least, what we label as terrorism) has changed, the cosmopolitan, leftish groups that captured headlines in that era, like the Italian Red Brigades, giving way to groups with ethno-religious identifications (in the American imagination, synonymous with Muslim fundamentalism). Aside from affecting nuances of the plot, it makes me wonder what the writers would do with the character of ex-Red Brigade member Antonia.

Corporate power may also be topical – but again, the image of this has also changed after nearly four decades of neoliberal globalization. I suppose Dr. Evil's Number Two said it best in Austin Powers: International Man of Mystery (1997):
"I spent the last thirty years of my life turning this two-bit evil empire into a world-class multi-national. I was going to have a cover story with Forbes. But you, like an idiot, want to take over the world. And you don't even realize that there is no world anymore! There's just corporations!"
When that's how the world appears, when we live in the age of the World Economic Forum at Davos, elaborate plots by corporations to seize political power seem quaint - and even superfluous.

Moreover, the Matarese's rationale for seizing control belongs to another era, the very same one that gave us the original Rollerball (1975). Dubious as they have always been (critiques of corporate influence over politics go back at least to Adam Smith's oft-cited but generally unread Wealth of Nations), the claims of corporations to a technocratic, meritocratic rationality that would represent an advance over the administration national governments supply possessed a shred more credibility then – credibility long since obliterated by the endlessly demonstrated greed of executives unable to think beyond their next bonus, and the pandemic of short-termism that follows from it, with all its fiscal and other consequences; by corporate indifference to environmental degradation, resource exhaustion and not just social justice, but social stability, as a part of their standard operating procedure; by business's continued reliance on politicians who cover corporate economic agendas with retrograde versions of nationalism and religious fundamentalism. (And of course, as even neoliberal cheerleader Thomas Friedman owned in The Lexus and the Olive Tree, "McDonald's cannot flourish without McDonnell-Douglas.")

The recent remake of Rollerball (2003) (which deservedly made io9's recent list of "worst science fiction movie remakes of all time") was a toothless mess, unsatisfying even as a simple action movie, despite its being helmed by the director of Die Hard. It is not a foregone conclusion that The Matarese Circle will be that, but I wouldn't bet against that either.

1. I'm actually reminded of 2009's G.I. Joe: The Rise of Cobra, which for its plot had a lame rip-off of the 1977 Bond movie The Spy Who Loved Me. Like that earlier film, the G.I. Joe movie had for its villain a madman (Destro this time) in an undersea fortress (incidentally, visually reminiscent of Karl Stromberg's aquatic facility in the Bond film) whose plan was to launch stolen weapons of mass destruction (nanite-based warheads, instead of submarine-launched nuclear warheads) at both Moscow and a major American city on the eastern seaboard of the United States (the newer film shifted the target from New York to Washington), then build a new order in the aftermath. The writers even decided to throw in a love story between Duke and the Baroness - a move with no basis in the history of these characters.

Friday, March 23, 2012

Stargate as Star Trek

It has become something of a tradition for new space operas to proudly label themselves an un-Star Trek or an anti-Star Trek. Yet, the only show that actually approached Star Trek-like success was the show that perhaps came closer than any of the others to Star Trek in concept and feel – the Stargate television franchise.1

Like that previous show, it had a team of humans from a military organization (one can think of Star Fleet as at least quasi-military), with a usually stone-faced alien companion (Teal'C, to the original Star Trek's Mr. Spock) venturing out on journeys of exploration, in which they routinely encountered English-speaking humanoids on other planets. As the story was set in the present day Earth was not a utopia – but it certainly appeared more attractive than the rest of the galaxy, where most of the human species lived in theocratic slavery under the iron heel of the principal villains, the Goa'uld, or where these were absent, some other form of backwardness or repression. (More technologically advanced or socially progressive human societies were few in number, and generally lost their luster when closely examined.) One might add, too, that when the show explored Serious Themes, or offered social or political commentary, it was typically by way of the depiction of those other worlds – our current problems, on the screen at least, looking more like Other People's Problems. The result was that, for all its flaws, Earth seemed a beacon of freedom and enlightenment, its condition something to be aspired to rather than transcended. Stargate even used the tropes of some of the original Star Trek's most memorable episodes, like "Mirror, Mirror" (paralleled in the SG-1 episode "The Road Not Taken").

To be sure, Stargate drew on other inspirations. The titular stargate itself can be regarded as a bit of steampunk – a big, solid, steam-snorting machine clearly modeled on that piece of Victorian high-tech, the rotary dial telephone (dialing seven numbers to get an interstellar connection), and evocative of the era of Howard Carter (and Indiana Jones) in its rediscovery in Egypt in the 1920s. The extravagant Goa'uld were Oriental despots right out of Flash Gordon. There was more than a bit of the Roswell mythology in the Stargate program's secrecy, the reverse-engineering of alien technology, and the appearance of "gray" aliens (the Asgard). There was even a bit of "media savvy" Galaxy Quest-like self-parody, with the milestone 100th and 200th episodes of the first Stargate series given over to humor of this kind (centering on the show-within-a-show Wormhole X-treme!), and fan service-y casting in the inclusion of Farscape's Ben Browder and Claudia Black as core members of the cast in the ninth season.

Yet, the Star Trek-like elements (admittedly, used in a rather conservative way) were what held the show together, and it seems notable that the Stargate series which met with the least success – Stargate: Universe – was the one that broke with that pattern, playing rather more like Battlestar Galactica lite in its head games and soap opera and overall tone. It ended after a mere two seasons, but in fairness, this had much to do with the show's second-season move to Monday nights, and the changing priorities of the channel's executives (wrestling, reality TV). For the time being, the reimagined BSG seems likely to enjoy the status of fashionable template for anti-Treks for some time to come. Indeed, TV's next space opera will likely be a new entry in the Galactica saga: Battlestar Galactica: Blood and Chrome. (Alas, even with the dearth of such programming on television now, I can't say I'm really looking forward to it.)

1. Together the three series, Stargate: SG-1 (1997-2007), Stargate: Atlantis (2004-2009), and Stargate: Universe (2009-2011), ran for fourteen years, during which they produced 354 one-hour episodes – figures that, among space operas, are comparable only to Star Trek.

On Star Trek Bashing

Star Trek bashing is, of course, an old pastime among science fiction connoisseurs. Some of this is a genuine response to the artistic limitations of the show. (A significant part of the writing of the original series has not aged well, and its successors also have their clunky bits – while no one can dispute that the genre's cutting edge had already moved beyond it in 1966.) Some of this is likely a reaction to the highly publicized excesses of some of its fans (like the study of the Klingon language). And some of this is the inevitable snobbery toward a franchise which has enjoyed such enormous commercial success as to be almost synonymous with the science fiction genre in the minds of "mundanes" (except for Star Wars, nothing else is comparable), making the limitations, and the fan excesses, all the more galling for those anxious about science fiction's image.

However, much of this has simply been a denigration (often overt, though rarely plain-spoken) of the show's humanism and "utopianism" – the last, a highly loaded and often misused term. Indeed, it might be more useful to discuss the show's setting less in terms of utopianism than what literary critic Patrick Parrinder described as the "scientific world-view," which held the world to be on a self-destructive course, and called for a new order not for the sake of some "fuzzy-minded" ideal, but as humanity's best (and perhaps only) shot at survival. H.G. Wells, a crucial formulator of this view, saw the combination of the nation-state system, the capitalist system and the irrationalities of nationalism and religion in an age of industrialized production and industrialized warfare as exactly that kind of dangerous situation, and called for the establishment of a world government, a socialist economy, and a rationalistic culture in their place.

Wells dramatized his argument in novels like The War in the Air, The World Set Free, and The Shape of Things to Come. In those works he depicted the decadence of the old order, its horrific collapse (predictions of which were, to some extent, validated by the Depression and the world wars), and the building of the new and different order that followed (which in our history did not proceed along the lines hoped for by the proponents of this view). Star Trek is set fairly far along in the history of such a new age, the series depicting a future in which some sort of global (indeed, interstellar) governance, a more humane economics (never labeled socialist, but at the very least post-capitalist), and a more rationalistic and tolerant conception of life has long since trumped the superstition, small-mindedness and bigotry of the past.

Of course the last four decades – this age of unreason, identity politics, ideologically convenient pseudo-science and neoliberal economics – have been a grave disappointment for those espousing such visions, who might well feel that instead of the Federation of Planets, we are turning into a frightening hybrid of the Ferengi and the Cardassians. Yet, the problems that gave rise to the scientific world-view continue to hang over our heads. Indeed, some of them have proven worse than Wells imagined (real-world nuclear arsenals have far more potential to destroy civilization than what he portrayed in The World Set Free), while the list of problems has actually lengthened since his day (as Wells certainly did not anticipate anything like anthropogenic climate change). And there has certainly been a scarcity of alternative ideas for dealing with them, "optimism" (another loaded, misused word) now seeming to consist mainly of dismissing or shrugging off very real problems, or a "faith" that a technical fix will conveniently appear, and conveniently be implemented in some market-friendly way – the techno-libertarianism that has been discredited time and time again. (Think, for instance, of what the first decade of the twenty-first century was expected to look like at the height of the tech bubble.) This leaves just despair on the part of those who regard something better than the present muddle as not merely an ideal but a necessity, gloating on the part of those who never had anything but contempt for their ideas, and the outlook of the "bright-sided" – hardly a healthy state of affairs.

The state of this particular subgenre of science fiction is just a symptom of the problem, not the disease itself, but the more I consider the situation the more it seems to me that not only is there still room for a vision like the one Star Trek presented, but that its presence would be a worthwhile thing, and perhaps even vital. Certainly the genre is richer for the emergence of shows with very different outlooks, like Lexx and Firefly – but we are far past the point at which the original Star Trek's approach can really seem stifling to would-be producers of science fiction television and film. If anything, with Battlestar Galactica now the template for TV space opera, it is humanism, and the hope of a better tomorrow – or even human survival – that is in danger of being stifled (all as the "dark and gritty" approach seems increasingly trite).

Where such drama is concerned, the trick is to make that humanism seem credible. The Star Trek franchise's last two television incarnations (Star Trek: Voyager and Star Trek: Enterprise) didn't quite pull off that feat, and J.J. Abrams' cinematic reboot didn't even try. Indeed, it seems to me that in their own ways, other, more recent writers have made better use of the basic principle, like Iain M. Banks in his Culture novels, or Ken MacLeod's Fall Revolution novels (particularly The Stone Canal and The Cassini Division). Of course, well-known as they are to readers of recent science fiction, they have reached a much more limited audience than the multimedia Star Trek franchise, and neither is a likely bet for a Hollywood production – but they do demonstrate what remains possible, all as the need for alternatives to continued wallowing in our age's low and still falling expectations grows only more pressing.

Saturday, March 10, 2012

The Blockbuster Strategy

Lisa Richwine and Ronald Glover published an interesting article yesterday on Hollywood's gamble of making fewer films, with a greater emphasis on "high-risk, high-reward" tentpoles – like this weekend's John Carter.

There is no mystery as to why Hollywood follows this path. The blockbuster, "high concept" strategy has become the standard operating procedure for Big Media as a whole (stodgy old publishing included), and there are especially pressing reasons for the major U.S. studios to adhere to it in film. After all, it is exactly this kind of film that is Hollywood's strength, because other countries' film production has simply lacked the deep pockets to compete in this area with any regularity. Additionally, the spectacle these movies offer is exactly the kind of product that other branches of the media, like television production, cannot match, and capitalizes most fully on the big screens that are the principal draw of the theatrical experience so crucial to a movie's earnings (as well as offering the best justification for the 3-D surcharges which have done so much to pad them these last few years). These films also travel quite well, routinely pulling in enormous grosses in foreign markets, often more than enough to compensate for a disappointing performance in North America (as was the case with the last Pirates of the Caribbean movie) – in contrast with a great many other Hollywood movies (like Will Ferrell comedies).

Yet, as those in business seem to constantly forget (or to have never learned in the first place), effective demand has its limits, and the market has always been crowded enough with this kind of product to make costly flops routine. Thus far, the hits have proved more than sufficient to keep the studios committed to the strategy despite the losses. Indeed, Richwine and Glover note that Disney recently halved its film output to focus on "franchise films." However, as the budgets continue to go up (no one is shocked by a $250-300 million production budget anymore), and these movies' traditional target audience (young males) seems less and less interested in buying tickets, while DVD sales decline, I doubt this will remain a commercially tenable approach for long. Given that Hollywood seems unlikely to successfully revert to a more diverse output, I wonder if it won't soon come to appear a stumbling dinosaur – the way publishing has been for as long as I can remember.
