Sunday, May 20, 2012

Dirk Pitt Returns?

Recently reviewing the stats for this blog, I was struck by the number of web surfers who found their way to it looking for information about the possibility of Hollywood making more Dirk Pitt movies – whether sequels to Sahara (2005), or a new, "rebooted" series starting with a clean slate – to judge by the frequency with which search keywords like "dirk pitt movie 2" and "any more dirk pitt movies" turn up in the section titled "traffic sources." This seems to be mainly because of my earlier post, "Returning to Sahara: The Dirk Pitt Novels on Screen."

That post considered the pitfalls involved in translating Clive Cussler's novel Sahara from the page to the screen, rather than any plans for more Dirk Pitt films, about which I have heard, seen or read absolutely nothing – which is exactly what I would expect. The fact that Sahara was a commercial disappointment ordinarily would not rule out the studios taking another shot. After all, Hollywood has proven nothing short of fanatical about trying to squeeze every penny it can out of any and every established IP, and as fans can attest, the Dirk Pitt novels sure look like obvious source material for big-budget action-adventure films – especially as this franchise remains very much alive, new entries continuing to regularly appear on bestseller lists. Of course, as I pointed out in my previous post, adapting Pitt's adventures to the big screen is trickier than it looks, given how dated some of the plots have become (in light of changes in world politics, and industry sensibilities), but Hollywood has never been averse to seizing on a brand name and a few other select trappings (a character, a gimmick), and dispensing with everything else, and in the process launching a commercially viable film sequence, even as it leaves the purists unsatisfied. (They did it with James Bond, after all. And Jason Bourne. And Star Trek. And others far too numerous to list here.)

However, it is worth remembering that Sahara wasn't just a disappointment, but could be described without any hyperbole whatsoever as one of the biggest flops in movie history – and that this came on top of an earlier production of a Dirk Pitt film (1980's Raise the Titanic) which could be described in exactly the same terms. Indeed, one might say Cussler was lucky to see his series get a second chance, even if it took twenty-five years for this to happen. The way that extraordinary opportunity went has left the material seeming almost cursed – and silly as that may sound, one should not overestimate the rationality of this business. (In fact, when so much money is at stake, the factors to be weighed in the decisions regarding its use are so intangible, and every facet of commerce has come to be so steeped in a casino mentality, that one should bet on Hollywood's irrationality every time.) Perhaps even more frightening than any such curse, Cussler himself did not manage a good relationship with the producers, the result a protracted, grinding legal battle in the aftermath of the film's financially ruinous shooting and release – and that is something that will frighten even the least superstitious of Suits.

All that being the case, it may be a long time before anyone takes another crack at this series, if ever. However, for what it's worth, you can be sure that I will have something to say about any new developments in that story here.

Review: The Origin of Consciousness in the Breakdown of the Bicameral Mind, by Julian Jaynes

Boston: Houghton Mifflin, 1976, 467 pp.

I first encountered Julian Jaynes' ideas about bicameralism through Jack Dann's classic The Man Who Melted five years ago. I enjoyed the novel greatly, but was skeptical of the theory underlying it. Jaynes' theory, after all, holds that consciousness is not something intrinsic to life or humanity, but a cultural creation, which did not emerge until the second millennium B.C. It seemed to me impossible that humanity could ever have been without consciousness, let alone reached a Bronze Age level of civilization without it. How, for instance, could the "bicameral" mind have organized such a feat as the construction of the Great Pyramid of Cheops? How, too, could such a theory possibly be reconciled with the indications that even some higher animals (apes, dolphins) possess self-awareness?

Jaynes' book did not fully satisfy me with its answer to the first question, and did not address the second at all, but it did make the theory considerably harder to dismiss. In discussing the book, it seems appropriate to start with the author's definition of consciousness--which is not a capacity for sensation and observation, memory, or the ability to learn, think or reason. (Even these last, he argues, are predominantly unconscious processes.) Rather, consciousness is a model of the world around us that we carry in our heads, enabling us to introspect, visualize and take decisions in a self-aware manner. Built upon the capacity of language to create metaphors, it consists of selected ("excerpted") details, arranged according to a sense of space and "spatialized" time, and organized in narrative form, with dissonant information reconciled around an image of the self (as seen by oneself and by others)--and on the whole, for all its complexity, a relatively small part of our mental life.

Prior to the arrival of consciousness, Jaynes posits the "bicameral" mind as the dominant thought mode. For bicameral persons, the left and right hemispheres of the brain worked rather more separately than is the norm today, an important aspect of the separation being the generation of auditory hallucinations, especially in moments when unconsciously acquired habit was inadequate and stress was high. Their mental life, in fact, is compared by Jaynes to contemporary observations of schizophrenia. A key difference, however, is that this was the norm rather than the exception. Indeed, he suggests society was actually structured hierarchically around such hallucinations, with priest-kings hallucinating the voices of gods (in many cases originating in predecessors whose voices they continued to "hear" after their passing), and subordinates at each level hallucinating the voices of the authority above them at such times, a control reinforced by such hallucinatory aids as centrally located houses of worship and ubiquitous depictions of deities--themselves organized as hierarchically as humans were. (This kept the conflicting voices that torment so many schizophrenics in line, and as Jaynes points out, this kind of sorting is not unknown among those suffering the condition today.)

Jaynes suggests that bicameralism may have enabled the beginnings of civilization by making human beings subject to such a method of control. However, it did not endure. Quite apart from the intrinsic frailty of the system, it grew less effective as polities expanded--as with the empire of Hammurabi. An increased use of writing was one way of compensating, but that activity probably weakened bicameralism as well. The Bronze Age collapse (theorized by Jaynes as having been a result of the Theran catastrophe) knocked over these vulnerable structures, flinging together uprooted peoples forced to reckon with the inappropriateness of earlier modes of conduct, and with each other's differences (which may have led them to think of difference as internal for the first time), in harsh circumstances where deceit (thinking one thing and appearing to do another) had a high survival value. (Jaynes speculates also that there may have been an element of natural selection in the process, and that the development of narrative epics like Homer's Iliad contributed as well.)

Over the millennium or so that followed, the conception of our behavior's driving forces was internalized, intellectualized (recognized as mental processes) and synthesized into consciousness as we now know it, which had clearly arrived by the time of such figures as Solon and Pythagoras, whose notions of mind, volition and responsibility are recognizable as similar to those we possess today. Still, a nostalgia for the certainties of divine voices speaking to us personally remained, in the laments of our myths of a "fall" which has distanced us from the gods, in methods of divination which attempt to reach those increasingly distant deities (like auguries, oracles, prophets and possession), and even in "scientism" (the quasi-religious treatment of scientific inquiry).1

Examining the argument over three decades after Jaynes' initial presentation of the theory, I found that, my initial objections aside, many of the smaller claims on which he built his larger argument (like the nature of consciousness) remain a subject of dispute. His focus on the eastern Mediterranean region--he looks principally at Mesopotamia, Egypt, Greece and the Biblical Israelites--appears rather narrow. And the evidence available for his examination even of this history was necessarily limited, and full of ambiguities and gaps, from the uncertainty regarding the dating of particular works, to the meanings of key terms, to the question of how representative the works in question really were of how people lived and thought at the time (all of these evident in, for example, his reading of ancient Greek literature). Still, Jaynes is quite aware of the scantiness of some of the material, and the speculative nature of many of his claims, frequently acknowledging that what he presents is but a starting point for a fuller investigation.

Of course, the proof that would really convince a skeptic--proof that a society can actually function this way, let alone proof that this really was the way the species lived--may be unobtainable. However, what Jaynes does with the material available to him is genuinely impressive, the theory at the least fitting a great many facts. In the process, he makes a contribution to the indisputable view that our recognition and expression of abstract thought and subjectivity have grown enormously over these millennia (a view going back at least to G.W.F. Hegel), and a good case that schizophrenia and hallucination played a larger role in antiquity than we appreciate. In doing so he points the way to a significant part of the human story--as is so often the case with those who dare to offer Big Ideas, even when they do not persuade us completely of their version of the larger picture.

1. As Jaynes notes, in the second millennium B.C., human beings heard the voices directly; in the first millennium B.C., they were only able to access those voices through exceptional individuals, sites and efforts, like prophets and oracles; in the first millennium A.D., even these fell increasingly silent, leaving behind the texts based on them; and in the past thousand years, those texts have seen their authority eroded.

Friday, May 18, 2012

On the Word "Lifestyle": A Postscript

Half a year ago I discussed on this blog the contemporary use – and much more often, misuse – of the word "lifestyle." The distinctions bear repeating:

Your standard of living is a matter of how much money you make.

Your quality of life is an evaluation of your general well-being – of which income is only one part, as matters like physical health, leisure and social life enter into it.

A way of life is something you're born into, adopted into, or adopt yourself. A culture has a way of life, and one shares that way of life by belonging to that culture.

A lifestyle is something an individual picks out of a catalog. It is something few people actually have, and it is not to be confused with any of the others.

Saturday, May 5, 2012

The Anti-Humanism of Battlestar Galactica

Humanity, the reimagined Battlestar Galactica tells us, is Fallen, progress an illusion that dies when the creations of that "flawed creation," Man, finally turn on it – and do so successfully because those flawed human beings failed to understand that peace is just a period of preparation for the next round of war. (The first scene of the show, in fact, depicts the Cylons' strike on the Armistice Station that represents the Colonies' peace overture to them – adding insult to injury.)

Humanity also failed to appreciate that not only could the Enemy not be reasoned or bargained with, but that it had penetrated within, its ruthless agents and their hapless dupes undermining us to the point that we had little defense against the external assault that finished civilization off in hours.

Trying to examine the history leading up to it, to actually understand the origins of this situation, is pointless, and even traitorous. ("How this could've happened, why it happened – none of that matters right now. All that does matter is that as of this moment we are at war," Commander William Adama says in his call to arms during the pilot.) And that's not the only way in which too much truth is an inconvenience. In the wake of the disaster, the man who claims the mantle of leadership bolsters his legitimacy with a "useful" lie promulgating false hope (regarding Adama's knowledge of the location of Earth), and the forging of a consensus through appeals to the irrational, and the pressure to conform ("So say we all!").

Those who do not conform, like the dissenter and the radical (e.g. Tom Zarek), repeatedly justify the suspicion and contempt with which they are treated, as does the intellectual (e.g. Gaius Baltar), who is not only the very source of our predicament in his misuse of his technical expertise, but a self-seeking traitor at every turn. The rise of such men to a position of leadership is, of course, ruinous, not least because the masses are so easily led astray. Proving once more that they cannot be trusted to take care of themselves, that they are weak and soft and unwilling to face the Hard Facts of Life, the People grab the first chance to fly from the struggle – foolishly electing Baltar President (and foolishly permitted to do so by Adama and sitting President Laura Roslin when they decide not to falsify the election's results to head off this outcome), and leaving their ships for the soil of New Caprica, with the result that Democracy and Excessive Respect for the Law have led to the exchange of the bitter struggle for the even more bitter subjugation by the Enemy. (The second season, in fact, ends with a shot of Cylons marching down the settlement's single muddy street like the Wehrmacht parading through Paris after its capture in 1940 – an image made all the more loaded in the American imagination by Anglo-Saxon perceptions of French arms during World War II – and the next begins with Baltar playing quisling to the Cylon occupiers.)

Naturally, it falls to the soldiers (who despite being the show's principal repositories of virtue are no exceptions to the spectacle of human degradation that is the cast of characters, from the pathetic Saul Tigh to the broken Kara Thrace to the tragic Felix Gaeta) to save the civilians from themselves yet again. However, even their heroics accomplish only so much. The Apocalypse runs its course, after which those who had indulged in the luxuries of the wicked cities finally return to the Land. Nonetheless, it is to be only a matter of time before they begin the same stupid cycle of rebuilding and destruction all over again – while the rather smug "angels" manipulate us toward some unknown end.

This was all taken for "gritty" – which is to say, "realistic." Yet one could more precisely, substantively and usefully describe it all as authoritarian, militarist, xenophobic, obscurantist, anti-rational and anti-modern, and just plain misanthropic, which is to say that it conforms to "reality" as seen only from a very specific part of the political spectrum.1 Such a reading of the show is, if anything, encouraged by the presentation of so much of what happened on the show as relevant to our contemporary politics (through its constant, shamelessly sensationalist evocations of the War on Terror).2

That such comments have not been far more common says a great deal about the political moment in which the show appeared, and what it takes to be "realistic."3 It also says a great deal about our moment that this is the show which has supplanted Star Trek as the template for science fiction television, its considerable influence evident in recent efforts like Stargate: Universe and the reimagined V (which reimagining is in itself a striking story of political inversion). Ken MacLeod has described this condition as well as anyone I can think of: "humanity as a rational and political animal died in 1979, and went to hell." The new Galactica, representative of the anti-humanist polar opposite to what Star Trek (once) stood for, is a story we told ourselves there.4

1. The original Galactica has also been noted for its right-wing politics. Still, many of the changes took it further in that direction, like the emphasis on humanity's nature as fallen, and the treatment of particular characters, like the transformation of Gaius Baltar from a power-hungry politician into a horny, status-seeking scientist with an accent the show's primarily North American audience would regard as foreign. There is, too, the addition of the civilian-military power struggle, which had Adama in a tug of war with President Laura Roslin, presented here as out of her depth, as she had been "only" a teacher earlier – not just a member of a profession held in low regard among Americans (and a lightning rod for the same anti-intellectualism we see in the treatment of Baltar), but one identified with the "liberal establishment" conservatives frequently demonize.
2. The show's first episode, "33," had the heroes shooting down a hijacked passenger craft, while later in that same season Cylons were seen blowing themselves up like suicide bombers, captured Cylons got waterboarded, etc. The result was to equate the September 11 terror attacks with the annihilation of a twelve-planet civilization, al-Qaida with the military might of the Cylons, and the situation of a handful of refugees in a flotilla of lifeboats with that of the United States today – comparisons which entail so much exaggeration as to render any analogy useless (while the tone was usually that of FOX News in Space).
3. Admittedly, the show did not adhere to the view described with perfect consistency, at least after the first two seasons. Indeed, some argued that the politics were quite the opposite of those described here in season three. In particular, much was made of the show's protagonists becoming resistance fighters against the occupying Cylons in that season's first episodes, with many reviewers suggesting an equation of the human "insurgents" with present-day Iraqis, while subsequent episodes like "Dirty Hands" and "The Woman King" entered new territory by raising questions of class, labor and bigotry. (Indeed, Jonah Goldberg adored the first two seasons, and hated the later ones, which thoughts he shared in a post pointedly titled "How Politics Destroyed a Great TV Show" – which is to say, how the merest whiff of liberal politics made intolerable to him a show he'd once enjoyed because of its agreement with his world-view.) However, the extent and depth of the changes should not be overstated. They did not alter the basic premises of the series, and in any case, were quickly drowned in the pseudo-religious muddle already moving to the fore.
4. Ironically, the new show's creator, Ronald D. Moore, worked as a producer and writer on the Star Trek franchise, scripting nearly sixty episodes of Star Trek: The Next Generation, Star Trek: Deep Space Nine and Star Trek: Voyager, as well as the films Star Trek: Generations and Star Trek: First Contact.

Wednesday, April 25, 2012

Making Iron Man 3: A Geopolitical Perspective

It now appears that the Chinese company DMG will invest nine figures in the production of Iron Man 3 – an unprecedented collaboration between a Chinese business and Hollywood, which will also see DMG distributing the film in China.

I was immediately struck by the irony of the news given not just the fact that Iron Man's most famous antagonist is the Mandarin (as many a comic book fan has already pointed out in the comments pages to various reports of this development), but the story of this particular hero. Tony Stark, after all, is just a Victorian Edisonade protagonist updated for the world of the Cold War-era military-industrial complex – who began his adventures while fighting "Asian Communism" in Vietnam (the Cold War hawkishness of the early '60s Marvel comics being especially pointed here).

Certainly much has changed in world politics in the half century since that tale was first penned. Yet, while some more recent writers (like Warren Ellis) have offered relatively nuanced treatments of the comic, the movies pretty much stuck with the simplicities of the original vision, trading the Cold War for the War on Terror, and Vietnam for Afghanistan, while playing up the idea of Iron Man as an unapologetic embodiment of U.S. military superiority and military interventionism – which one would certainly imagine to be problematic from a Chinese perspective.1 In the second film, China is even cited as a possible antagonist to the U.S. (when Justin Hammer names it along with Iran and North Korea while speculating about potential foreign buyers of the Iron Man technology). Moreover, there has been reason to think that such politics matter, in the wake of the complications faced by the producers of the recent remake of Red Dawn.2

Of course, it may simply be that business trumps politics in this case. The attraction of this particular business opportunity, and perhaps of a closer relationship with Hollywood over the long term, may appear too great to resist (especially with a staggering $3 trillion worth of foreign exchange burning holes in that nation's pockets, and the prospect of the movie being partially filmed in China itself). Additionally, while it is hard to think of a major comic book hero who would be more offensive to the sensibility of a country run by a Communist Party than Tony Stark, China's establishment has long since ceased to be Communist in anything but name – and in the post-Mao era in which "To be rich is glorious," and the Chinese Dream looks not unlike the American Dream, it could be that a figure like Stark (a tech industry Gary Stu if ever there was one) enjoys a greater appeal than may seem the case at first glance.

1. For a critical take on the politics of the Iron Man films, check out Cristobal Giraldez Catalan's review of the first movie and Hiram Lee's reviews of both the first and second films.
2. Worried by Chinese disapproval, the producers changed the absurd premise of a Chinese invasion of the U.S. to an even more absurd one in which North Korea appears as the would-be conqueror of the United States – by editing the already-shot film, a procedure I expect will appear clumsy when the film actually hits theaters. (Incidentally, the scenario of a North Korean invasion and occupation of the U.S. was presented in last year's video game Homefront, written by John Milius, director of the original Red Dawn.)

Sunday, April 15, 2012

Remembering the Titanic

I have at times wondered why, a century after the event, the sinking of the Titanic continues to receive as much attention as it does. There have been bigger ships (supertankers and aircraft carriers aside, there have been bigger liners, like today's colossal Oasis-class ships), and deadlier maritime disasters (like the sinking of the Wilhelm Gustloff, which killed more than 9,000, and the Goya, which killed more than 6,000, compared with the 1,500 who died when the Titanic went down). The sinking of another British ocean liner, the Lusitania, just three years later, took nearly as many lives (almost 1,200), and had a far greater significance for the course of world history given its impact on international opinion during World War I.

Part of this seems a reflection of the memorialization of the event, which in turn has made it better known and therefore more likely to be memorialized yet again, as has certainly happened on the movie screen – in 1958's A Night to Remember (adapted by Eric Ambler from Walter Lord's successful book), in 1964's The Unsinkable Molly Brown (based on the hit stage musical), and of course, in James Cameron's Titanic (1997), one of the biggest blockbusters in history. But this hardly explains why the event was memorialized in the first place, or why the retelling of the story as fiction and nonfiction continues to find an interested audience.

It seems notable that the Titanic was a Belfast-built ship of the British White Star Line, bound from Southampton, England for New York, which sank off Newfoundland, Canada – and did so in peacetime, when ships might have been expected to safely cross the ocean, and sailing was associated with pleasure rather than necessity and danger. By contrast, the Gustloff and Goya were German ships sunk by Soviet submarines in the Baltic in the last months of World War II, denying them such resonance for the English-speaking world. At the same time, the absence of the obvious political freight of events like the sinking of the Gustloff, or the Lusitania, from the story of the Titanic; the remembrance of the ship as the epitome of luxury in its era; and the presence of numerous celebrities of the day on its passenger list (members of the Astor and Guggenheim families among them); all make it suitable raw material for those inclined to present the human experience as a collection of "stories for the amusement of the comfortable classes" (as Gore Vidal once said of a particular bestselling historian).

Yet, even if one can approach the story in this way, the event also lends itself to an extraordinary range of allegories and comments. One can see in the disparities of wealth and position among those who journeyed aboard the ship, the collision with the iceberg, the handling of the accident and the aftermath a story of the technological hubris, corporate irresponsibility, and social inequality of an era that in its fundamentals is not much different from our own. (James Cameron, certainly, has highlighted this side of the story in his comments about his film.) At the same time, the 1912 disaster can appear like one of the last moments of the long nineteenth century that has been such an object of nostalgia, and a foreshadowing of the disasters that lay ahead. (Indeed, the first episode of Downton Abbey opened with a telegram regarding the ship's sinking, and the death of the heir to the titular estate.) Indeed, the fascination of the Titanic appears a miniature version of the hold that period has on our imaginations – the Victorian-Edwardian era both as we knew it, and as it might have been.

Monday, April 9, 2012

The 50th Anniversary of the James Bond Film Series

As fans of the James Bond film series know, this is the fiftieth anniversary of the franchise's launch with 1962's Dr. No. Naturally, the fact is being commemorated in manifold ways big and small, from a highly publicized reissue of the films on Blu-Ray, to exhibitions at film festivals from Toronto to Muscat, to retrospectives in the press.

Those offering comment on the subject typically note the series' "reinvention" of itself during its half century of life. However, such reinvention is not unproblematic. Indeed, I have found myself wondering about a question that no one in the entertainment press ever seems to ask: at what point is it time to let an IP go? To stop rebooting and updating and just leave things be and move on to something else?

The answer Big Media offers is "NEVER!" Anything their companies hold, and for that matter, anything with a name recognizable enough to be suitable for the high concept approach, is to be milked for all it can possibly give and beyond, for reasons I've already discussed here many a time. But for those with art rather than money on their minds, this answer is unsatisfactory. Instead I would say that it is time to let go when what is distinctive about the original ceases to be credible or acceptable, when it can only be presented to the audience ironically or apologetically or in exceedingly sanitized form, or with enormous strain.

It seemed to me that this had already happened with the Bond films when I wrote "The End of James Bond?" back in August 2010. Certainly a few stylistic touches remain to the present, like the opening shot through a gun barrel, the John Barry score, and the line "The name is Bond. James Bond." However, at the core of the earlier films there were also the following, rather more substantive elements:

1. A charged context of high-stakes conflict between world powers (like the Cold War, especially its early phase, in which détente had yet to come about, and the British Empire was breaking up but the metropole not yet reduced to the standing of a "normal" country).
2. The particular model of sophistication, machismo and hedonism represented by the central character (black tie settings, hard liquor, tobacco, skill at upper-class pastimes like golf and baccarat, gambling, flings with numerous women).
3. The plot formula, originally developed in Fleming's novels but ultimately perfected by the films, which has Bond up against freakish megalomaniacs with massive resources and global ambitions, and their even more freakish henchmen. The battle typically begins with Bond engaging the villain in games (golf, cards, etc.) in social settings mixing gentility with murderousness, evading the subsequent assassination attempts, becoming involved with girls good and bad, getting captured and escaping from elaborate death traps, and in the climactic confrontation (usually at the villain's over-the-top fortress) saving the world as the clock ticks toward oblivion – typically with the help of an arsenal of gadgets.

Due to the changes in Britain's international position (and the broader international situation) since 1953, 1962 or even 1989, the first of these has become possible only with a retro approach, as in Sebastian Faulks' Devil May Care (2008) - an experiment which has not been repeated in print, and seems an even less likely bet on screen. The second has largely been quashed by changing conceptions of glamour, luxury and the action hero (a guy in a tux no longer making the impression he used to), to say nothing of concessions to feminism and the New Puritanism. The third, deeply worn by heavy use (twenty films between 1962 and 2002, excluding 1967's Casino Royale and 1983's Never Say Never Again, as well as profuse imitation and parody), has also been abandoned amid the aforementioned concessions, and more significantly, the pursuit of "gritty realism." Indeed, in playing off the Bond films, the Austin Powers movies (and to a more modest extent, 2002's xXx) differed from the innumerable earlier books, TV shows and films parodying 007 in presenting the series' tropes as not merely cartoonish or silly, but as dated, with the datedness of aspects of the series, from the playboy attitudes to the villains' schemes, central to many of their gags.

The result is that this anniversary seems to me rather a hollow one – like that of a marriage in which each spouse has forgotten what they ever saw in the other.

Friday, March 30, 2012

Game of Thrones: Season One

In some respects George R.R. Martin's A Song of Ice and Fire seems an obvious fit with HBO. The network's programmers have long gravitated toward material that can be labeled "dark and gritty," and the success of Showtime's The Tudors has suggested an audience willing to watch this kind of thing when the characters wear period costume and live in castles. Martin's fiercely anti-romantic take on the high fantasy genre certainly fits that bill, full of the moral ambivalence and brutality and unsympathetic characters critics love to praise writers for - the jokes about "The Sopranos of Middle Earth" not too far off the mark as a description of its intrigues, which are driven not by the machinations of some external Evil impinging on a happy world, but by the base ambitions and hungers of the high nobility (though north of the Wall, at least, there are also the stirrings of a common danger).

Additionally, Martin's novels marginalize the fantasy elements (magic, imaginary creatures and the like), particularly in the series' earlier volumes, making this kind of thing easier for critics and mainstream audiences to take (while relieving some of the pressure on the FX budget, which for the ten-hour series is likely less than a movie studio would bring to bear on a first-rank two-hour film). The fact that it has become routine for television networks to buy the rights to genre novels as a basis for their series (like Flash Forward, and The Vampire Diaries), and that Martin's books have as big a built-in audience as any recent work of speculative fiction not written for young adults (indeed, Martin was virtually the only non-YA fantasy author on last year's list of the top hundred selling books), doesn't hurt.

Yet, there were plenty of reasons for doubt about how the production would fare. The source novels are not just long but dense, packed with characters and intrigues both complicated and complex. My hardback edition of the first runs to 674 pages, not including the 20-page appendix listing the characters by house – and the story only gets bigger and more tangled with each succeeding volume. (Indeed, the fourth book of the series ended up being split into two volumes, each focusing on a different set of characters.) Translating that into a TV series which is both faithful and accessible is not a simple thing. Additionally, much of the tale is told through the eyes of very young characters, some of them children (Arya, Bran), hardly the kind of thing the channel seemed likely to accommodate. I wondered, too, if the show's production values would do Martin's world justice. (Martin himself has recently wondered what the Battle of the Blackwater would look like when it reaches the screen.) And at any rate, HBO's track record with speculative fiction has not been impressive.

On the whole, I was pleasantly surprised. The first episode was admittedly a bit shaky - exposition-heavy, it felt crowded and disjointed, the milieu thin. However, the storytelling quickly got smoother, and the world of Westeros more satisfactorily developed. I was ambivalent about a few of the tweaks (the softening of Cersei Lannister, the use of Renly Baratheon as a foil for his older brother rather than a younger version of him), but most of the adjustments - the scenes created from scratch, the alterations of events and the like - aided in clarifying the storyline, better establishing the characters (especially the non-viewpoint characters, who so often come off as opaque in Martin's writing), and holding together a sprawling epic. Certain scenes and bits of action are admittedly scaled down from the novels (like the depiction of Vaes Dothrak and the Twins, and the major battles), but never in a way that compromises the story (though one might wonder about the presentation of the Battle of the Green Fork). And of course, the cast has been deservedly praised (with Peter Dinklage notably picking up an Emmy for his performance as Tyrion Lannister - a rare win for genre television in the acting categories). As a reader of the original novels I came away satisfied, but I think the series would have been accessible and entertaining even had I come to it without that background, and it has left me looking forward to season two, which starts airing Sunday, April 1.

C. Wright Mills and the Day Job

The sociologist C. Wright Mills concisely described a major problem facing those who must hold day jobs in his classic study White Collar:
Alienation in work means that the most alert hours of one's life are sacrificed to the making of money with which to "live." Alienation means boredom and the frustration of potentially creative effort, of the productive sides of personality. It means that while men must seek all values that matter to them outside of work . . . they must be serious and steady about something that does not mean anything to them, and moreover during the best hours of their day, the best hours of their life (236).
For more, check out my review of the book at my other blog.

Monday, March 26, 2012

The Matarese Circle – Coming to a Theater Near You?

With the success of the Jason Bourne film series, Hollywood has unsurprisingly taken an interest in bringing other Ludlum novels to the screen – including 1979's The Matarese Circle - the book Ludlum published right before Jason Bourne's first adventure.

For the moment, it seems that the project is dead, but it could yet come back from development hell. If it does, the age of the novel (over three decades old now) presents challenges to filmmakers trying to update it for a twenty-first century audience, just like The Bourne Identity did. Indeed, they may be even bigger challenges.

Perhaps the first has to do with the novel's principal characters - American spy Brandon Scofield and Soviet operative Vasili Taleniekov - putting aside an old enmity to fight the titular villains. In this, Circle is very much a Cold War story, rooted in a now-vanished context, the loss of which means that putting an American and a Russian agent together simply doesn't involve the same drama. Indeed, retaining the Russian nationality of Scofield's ally seems more like inertia than anything else given today's geopolitics.1

Much the same is the case with the novel's two central real-world themes – international terrorism, and corporate power. Terrorism is still topical, and this book doesn't have the disadvantage The Bourne Identity did of being centered on a specific, real-life terrorist, and the highly idiosyncratic strategy (to put it generously) for taking him down Ludlum dreamed up. Yet, the character of terrorism (or at least, what we label as terrorism) has changed, the cosmopolitan, leftish groups that captured headlines in that era, like the Italian Red Brigades, giving way to groups with ethno-religious identifications (in the American imagination, synonymous with Muslim fundamentalism). Aside from affecting nuances of the plot, it makes me wonder what the writers would do with the character of ex-Red Brigade member Antonia.

Corporate power may also be topical – but again, the image of this has also changed after nearly four decades of neoliberal globalization. I suppose Dr. Evil's Number Two said it best in Austin Powers: International Man of Mystery (1997):
"I spent the last thirty years of my life turning this two-bit evil empire into a world-class multi-national. I was going to have a cover story with Forbes. But you, like an idiot, want to take over the world. And you don't even realize that there is no world anymore! There's just corporations!"
When that's how the world appears, when we live in the age of the World Economic Forum at Davos, elaborate plots by corporations to seize political power seem quaint - and even superfluous.

Moreover, the Matarese's rationale for seizing control belongs to another era, the very same one that gave us the original Rollerball (1975). Dubious as they have always been (critiques of corporate influence over politics go back at least to Adam Smith's oft-cited but generally unread Wealth of Nations), the claims of corporations to a technocratic, meritocratic rationality that would represent an advance over the administration national governments supply possessed a shred more credibility then – credibility long since obliterated by the endlessly demonstrated greed of executives unable to think beyond their next bonus, and the pandemic of short-termism that follows from it, with all its fiscal and other consequences; by corporate indifference to environmental degradation, resource exhaustion and not just social justice, but social stability, as a part of their standard operating procedure; by business's continued reliance on politicians who cover corporate economic agendas with retrograde versions of nationalism and religious fundamentalism. (And of course, as even neoliberal cheerleader Thomas Friedman owned in The Lexus and the Olive Tree, "McDonald's cannot flourish without McDonnell-Douglas.")

The recent remake of Rollerball (2002) (which deservedly made io9's recent list of "worst science fiction movie remakes of all time") was a toothless mess, unsatisfying even as a simple action movie, despite being helmed by the director of Die Hard. It is not a foregone conclusion that The Matarese Circle will turn out the same way, but I wouldn't bet against it either.

1. I'm actually reminded of 2009's G.I. Joe: The Rise of Cobra, whose plot was a lame rip-off of the 1977 Bond movie The Spy Who Loved Me. Like that earlier film, the G.I. Joe movie had for its villain a madman (Destro this time) in an undersea fortress (incidentally, visually reminiscent of Karl Stromberg's aquatic facility in the Bond film) whose plan was to launch stolen weapons of mass destruction (nanite-based warheads, instead of submarine-launched nuclear warheads) at both Moscow and a major American city on the eastern seaboard of the United States (the newer film shifted the target from New York to Washington), then build a new order in the aftermath. The writers even decided to throw in a love story between Duke and the Baroness - a move with no basis in the history of these characters.

Friday, March 23, 2012

Stargate as Star Trek

It has become something of a tradition for new space operas to proudly label themselves an un-Star Trek or an anti-Star Trek. Yet, the only show that actually approached Star Trek-like success was the one that came perhaps closer than any of the others to Star Trek in concept and feel – the Stargate television franchise.1

Like that previous show, it had a team of humans from a military organization (one can think of Star Fleet as at least quasi-military), with a usually stone-faced alien companion (Teal'c, to the original Star Trek's Mr. Spock), venturing out on journeys of exploration in which they routinely encountered English-speaking humanoids on other planets. As the story was set in the present day, Earth was not a utopia – but it certainly appeared more attractive than the rest of the galaxy, where most of the human species lived in theocratic slavery under the iron heel of the principal villains, the Goa'uld, or where these were absent, under some other form of backwardness or repression. (More technologically advanced or socially progressive human societies were few in number, and generally lost their luster when closely examined.) One might add, too, that when the show explored Serious Themes, or offered social or political commentary, it was typically by way of the depiction of those other worlds – our current problems, on the screen at least, looking more like Other People's Problems. The result was that, for all its flaws, Earth seemed a beacon of freedom and enlightenment, its condition something to be aspired to rather than transcended. Stargate even used the tropes of some of the original Star Trek's most memorable episodes, like "Mirror, Mirror" (paralleled in the SG-1 episode "The Road Not Taken").

To be sure, Stargate drew on other inspirations. The titular stargate itself can be regarded as a bit of steampunk – a big, solid, steam-snorting machine clearly modeled on that piece of Victorian high-tech, the rotary dial telephone (dialing seven numbers to get an interstellar connection), and evocative of the era of Howard Carter (and Indiana Jones) in its rediscovery in Egypt in the 1920s. The extravagant Goa'uld were Oriental despots right out of Flash Gordon. There was more than a bit of the Roswell mythology in the Stargate program's secrecy, the reverse-engineering of alien technology, and the appearance of "gray" aliens (the Asgard). There was even a bit of "media savvy" Galaxy Quest-like self-parody, with the milestone 100th and 200th episodes of the first Stargate series given over to humor of this kind (centering on the show-within-a-show Wormhole X-treme!), and fan service-y casting in the inclusion of Farscape's Ben Browder and Claudia Black as core members of the cast in the ninth season.

Yet, the Star Trek-like elements (admittedly, used in a rather conservative way) were what held the show together, and it seems notable that the Stargate series which met with the least success – Stargate: Universe – was the one that broke with that pattern, playing rather more like Battlestar Galactica lite in its head games and soap opera and overall tone. It ended after a mere two seasons, but in fairness, this had much to do with the show's second-season move to Monday nights, and the changing priorities of the channel's executives (wrestling, reality TV). For the time being, the reimagined BSG seems likely to enjoy the status of fashionable template for anti-Treks for some time to come. Indeed, TV's next space opera will likely be a new entry in the Galactica saga: Battlestar Galactica: Blood and Chrome. (Alas, even with the dearth of such programming on television now, I can't say I'm really looking forward to it.)

1. Together the three series – Stargate: SG-1 (1997-2007), Stargate: Atlantis (2004-2009), and Stargate: Universe (2009-2011) – ran for fourteen years and produced 354 one-hour episodes, figures with which only Star Trek compares.

On Star Trek Bashing

Star Trek bashing is, of course, an old pastime among science fiction connoisseurs. Some of this is a genuine response to the artistic limitations of the show. (A significant part of the writing of the original series has not aged well, and its successors also have their clunky bits – while no one can dispute that the genre's cutting edge had already moved beyond it in 1966.) Some of this is likely a reaction to the highly publicized excesses of some of its fans (like the study of the Klingon language). And some of this is the inevitable snobbery toward a franchise which has enjoyed such enormous commercial success as to be almost synonymous with the science fiction genre in the minds of "mundanes" (except for Star Wars, nothing else is comparable), making the limitations, and the fan excesses, all the more galling for those anxious about science fiction's image.

However, much of this has simply been a denigration (often overt, though rarely plain-spoken) of the show's humanism and "utopianism" – the last, a highly loaded and often misused term. Indeed, it might be more useful to discuss the show's setting less in terms of utopianism than what literary critic Patrick Parrinder described as the "scientific world-view," which held the world to be on a self-destructive course, and called for a new order not for the sake of some "fuzzy-minded" ideal, but as humanity's best (and perhaps only) shot at survival. H.G. Wells, a crucial formulator of this view, saw the combination of the nation-state system, the capitalist system and the irrationalities of nationalism and religion in an age of industrialized production and industrialized warfare as exactly that kind of dangerous situation, and called for the establishment of a world government, a socialist economy, and a rationalistic culture in their place.

Wells dramatized his argument in novels like The War in the Air, The World Set Free, and The Shape of Things to Come. In those works he depicted the decadence of the old order, its horrific collapse (predictions of which were, to some extent, validated by the Depression and the world wars), and the building of the new and different order that followed (which in our history did not proceed along the lines hoped for by the proponents of this view). Star Trek is set fairly far along in the history of such a new age, the series depicting a future in which some sort of global (indeed, interstellar) governance, a more humane economics (never labeled socialist, but at the very least post-capitalist), and a more rationalistic and tolerant conception of life has long since trumped the superstition, small-mindedness and bigotry of the past.

Of course the last four decades – this age of unreason, identity politics, ideologically convenient pseudo-science and neoliberal economics – have been a grave disappointment for those espousing such visions, who might well feel that instead of the Federation of Planets, we are turning into a frightening hybrid of the Ferengi and the Cardassians. Yet, the problems that gave rise to the scientific world-view continue to hang over our heads. Indeed, some of them have proven worse than Wells imagined (real-world nuclear arsenals have far more potential to destroy civilization than what he portrayed in The World Set Free), while the list of problems has actually lengthened since his day (as Wells certainly did not anticipate anything like anthropogenic climate change). And there has certainly been a scarcity of alternative ideas for dealing with them, "optimism" (another loaded, misused word) now seeming to consist mainly of dismissing or shrugging off very real problems, or a "faith" that a technical fix will conveniently appear, and conveniently be implemented in some market-friendly way – the techno-libertarianism that has been discredited time and time again. (Think, for instance, of what the first decade of the twenty-first century was expected to look like at the height of the tech bubble.) This leaves just despair on the part of those who regard something better than the present muddle as not merely an ideal but a necessity, gloating on the part of those who never had anything but contempt for their ideas, and the outlook of the "bright-sided" – hardly a healthy state of affairs.

The state of this particular subgenre of science fiction is just a symptom of the problem, not the disease itself, but the more I consider the situation the more it seems to me that not only is there still room for a vision like the one Star Trek presented, but that its presence would be a worthwhile thing, and perhaps even vital. Certainly the genre is richer for the emergence of shows with very different outlooks, like Lexx and Firefly – but we are far past the point at which the original Star Trek's approach can really seem stifling to would-be producers of science fiction television and film. If anything, with Battlestar Galactica now the template for TV space opera, it is humanism, and the hope of a better tomorrow – or even human survival – that is in danger of being stifled (all as the "dark and gritty" approach seems increasingly trite).

Where such drama is concerned, the trick is to make that humanism seem credible. The Star Trek franchise's last two television incarnations (Star Trek: Voyager and Star Trek: Enterprise) didn't quite pull off that feat, and J.J. Abrams' cinematic reboot didn't even try. Indeed, it seems to me that in their own ways, other, more recent writers have made better use of the basic principle, like Iain M. Banks in his Culture novels, or Ken MacLeod's Fall Revolution novels (particularly The Stone Canal and The Cassini Division). Of course, well-known as they are to readers of recent science fiction, they have reached a much more limited audience than the multimedia Star Trek franchise, and neither is a likely bet for a Hollywood production – but they do demonstrate what remains possible, all as the need for alternatives to continued wallowing in our age's low and still falling expectations grows only more pressing.

Saturday, March 10, 2012

The Blockbuster Strategy

Lisa Richwine and Ronald Glover published an interesting article yesterday on Hollywood's gamble of making fewer films, with a greater emphasis on "high-risk, high-reward" tentpoles – like this weekend's John Carter.

There is no mystery as to why Hollywood follows this path. The blockbuster, "high concept" strategy has become the standard operating procedure for Big Media as a whole (stodgy old publishing included), and there are especially pressing reasons for the major U.S. studios to adhere to it in film. After all, it is exactly this kind of film that is Hollywood's strength, because other countries' film production has simply lacked the deep pockets to compete in this area with any regularity. Additionally, the spectacle these movies offer is exactly the kind of product that other branches of the media, like television production, cannot match, and capitalizes most fully on the big screens that are the principal draw of the theatrical experience so crucial to a movie's earnings (as well as offering the best justification for the 3-D surcharges which have done so much to pad them these last few years). These films also travel quite well, routinely pulling in enormous grosses in foreign markets, often more than enough to compensate for a disappointing performance in North America (as was the case with the last Pirates of the Caribbean movie) – in contrast with a great many other Hollywood movies (like Will Ferrell comedies).

Yet, as those in business seem to constantly forget (or to have never learned in the first place), effective demand has its limits, and the market has always been crowded enough with this kind of product to make costly flops routine. Thus far, the hits have proved more than sufficient to keep the studios committed to the strategy despite the losses. Indeed, Richwine and Glover note that Disney recently halved its film output to focus on "franchise films." However, as the budgets continue to go up (no one is shocked by a $250-300 million production budget anymore), and these movies' traditional target audience (young males) seems less and less interested in buying tickets, while DVD sales decline, I doubt this will stay a commercially tenable approach for long. Given that Hollywood seems unlikely to successfully revert to a more diverse output, I wonder if it won't soon come to appear a stumbling dinosaur - the way publishing has been for as long as I can remember.

Wednesday, March 7, 2012

The Cancellation of Terra Nova

Terra Nova has now traced most of the trajectory of big-budget science fiction shows on the major American television networks (especially FOX), from a launch with great fanfare and unrealistic expectations – to the ignominious end when the Suits lose their nerve and cancel it. Now we are in the post-cancellation phase where there is still talk of the show being continued on another channel, something that very rarely happens.

I don't get the sense that many will miss Terra Nova the way they did Dark Angel, Firefly or Terminator: The Sarah Connor Chronicles, though. Yes, the show had a promising concept and appealing visuals, for which it won a great deal of praise, but many also complained of lackluster characters and the overall quality of the writing. The complaints seemed a bit exaggerated at times (other shows, not really much better, attracted far less complaint of the kind), but they were certainly not baseless, and are easier to understand when one considers those who complained most loudly – generally those who follow science fiction television most closely.

Despite its futuristic setting, reliance on classic, "hard" science fiction tropes (like travel to parallel universes) and flashy visual effects, the makers of Terra Nova gave short shrift to the tastes of "hardcore" science fiction fans. The show was actually rather light on world-building, extrapolation and general conceptual development compared with the television space operas with which its premise invites comparison. It made no effort to appeal to the considerable portion of this fan base which is intertextually inclined, the way that, for instance, Heroes and Sanctuary, two of the more significant hits of recent years, did. Even worse from their perspective was the fact that it was a family show, quite far removed from the dark, edgy treatment of its themes many of those who'd looked forward to the show expected, given its dystopic image of the twenty-second century, and the ways in which that milieu had presumably marked its characters.

Unsurprisingly, "bland" was a word I frequently encountered in discussions of the show in the blogosphere, and I suspect that the complaints were all the more pointed because, Terra Nova having been intended as lighter, family-friendly fare, the weaknesses of the writing were not obfuscated in the ways to which we have become accustomed – the cheap button-pushing that looks like intellectual, political or dramatic daring to superficial viewers; the obnoxiousness for obnoxiousness's sake on the part of the dramatis personae so often mistaken for a "courageous" willingness to present unlikable characters (while their verbal abuse of each other is praised as sharp and witty dialogue); the head games that can make a show's lapses in logic or coherence instead appear to trusting viewers like part of some intriguing mystery that will be satisfactorily solved later; the soap opera-like subplots which distract the audience from a story going nowhere by fixing their attention on such questions as who is sleeping with whom (or trying to); and the fan service that makes watchers more forgiving of the flaws that do come to their attention. (The remake of Battlestar Galactica offered plenty of all these – from its exploitation of current issues like the War on Terror and the culture war, to its muddled religiosity, to its sexuality – with the result that many a fan was shocked, shocked, when the finale proved that the show's story had never been anything but a pile of nonsense. Lost, too, had plenty of head games and soap opera, and likewise let many a viewer down when it was all over.)

In all this, Terra Nova was comparable in its feel and its fate to the genre shows the Big Four made in the mid-'90s before virtually abandoning this territory to syndication and cable - like Earth 2 and Seaquest (which also had an especially large budget, and the Steven Spielberg name attached) - and a reminder of the dilemma facing producers of genre shows on the major networks. On the one hand, the hardcore fans looking for innovative science fiction are not numerous enough to warrant their special attention, forcing them to look beyond these to a more general audience (like the families with dinosaur-loving kids the network was clearly shooting for with Nova).1 All too typically they end up pleasing neither group, alienating the genre fans without really winning over the more casual viewers (small-screen spectacle can do only so much from week to week, as Entertainment Weekly's Darren Franich points out), and so – launch with fanfare, disappointment, cancellation. That cycle was a bit quicker and more intense in the case of Nova, I suppose, because the execs keep their creatives on shorter leashes, and because after the booms of the 1990s and 2000s, genre fans have become more demanding, further distancing them from those larger audiences the programmers hope to capture.

Still, for all its faults I did think Terra Nova was improving as it went along, the way many science fiction series do on their way to being good or even great - including Star Trek: The Next Generation, which made so much else possible. I never expected anything quite so consequential for genre TV from this show, even if it did get the chance that I was fairly sure FOX wouldn't give it – but I would certainly give a second season a look if one were to materialize.

1. I calculated this at 10-15 million for the United States in my article "The Golden Age of Science Fiction Television: Looking Back at the 1990s."

Monday, March 5, 2012

The Irrelevance of Oscar Night?

I remember neoconservative Robert Kaplan lamenting in his essay "The Dangers of Peace" that the "Academy Awards ceremony has achieved a status akin to a national holiday." Now the story many an entertainment journalist tells is that the affair seems decreasingly relevant to our pop cultural life – and some seem to fear, already irrelevant. There is, of course, some of the Hollywood press's usual exaggeration of such matters, as David Poland demonstrates over at Movie City News, but there is at least some basis for these concerns. The show's viewership has been declining for years, with the young appearing especially uninterested – the attitude paralleling what seems to be their increasing (and for Hollywood, worrying) disinclination to go to the movies. (Indeed, the producers were so concerned with their loss of interest that last year they took some pains to win them back, in part by having Anne Hathaway and James Franco host the ceremony – a move which proved a complete failure at elevating this demographic's viewership.)

However, it seems to me less surprising that viewers are losing interest in the ceremony than that they ever took much interest in the first place. I myself have only watched the ceremony beginning to end once – way back in 1995. I haven't even tried to do that since, even through the years in which I vaguely hoped to (someday) pen a screenplay that would be deemed worthy of an Oscar nomination, and read Variety every week as part of my study of the craft – and now that I think about it, didn't see anything at all odd about this.

I suppose I didn't find it all that interesting a show. After all, the thing drags on for four hours, and for all the energy and humor even a Billy Crystal brings to the stage, tends to be a rather staid affair. It's also awfully predictable. The press buzzes for weeks in advance about the front-runners, though anyone who's followed the business for any length of time hardly needs its speculations to hazard a good guess about how it will all play out; there's a tedious predictability about the whole process of distributing the awards. Certain themes, certain characters, are simply much more likely to receive the Academy's accolades. (I think of the scene from Bowfinger in which Kit rants about how he needs to play a mentally handicapped slave, or the "Tearjerker" episode of American Dad.) Best Supporting Actor/Actress Oscars, while normally given to worthy actors, also function as bones thrown to performances and movies of sorts not typically recognized, because they are too funny, or too popular, or worst of all, too "fan boy" – the last type of movie particularly likely to win big in the technical categories, but to be slighted when the awards for writing, acting and directing are distributed. And so on and so forth.

So why not just read about the winners in the paper (or on the web) the next morning? I'd think. But people watched anyway. I suppose it was the romance of the cinema, the "cult" of the movie star, and the sense that Hollywood was the very center of the entertainment world, with the Academy Awards ceremony the biggest night of the year for all three. Alas, the mystique of all these has suffered badly in recent years.

The Romance of the Cinema
Some of the cinema's loss of its magic would seem to be due to the change in the nature of the moviegoing experience. Of course, a considerable element of nostalgia is an undeniable part of such arguments. Yet, there seems something ritualistic, dream-like, even quasi-religious, about people gathering in a dark chamber to be entranced by a flickering image the size of a wall. That was diminished as movie palaces gave way to multiplexes, and seeing a movie became an occasional detail in the course of a day at the mall rather than a destination in itself, with the viewing marred by a pre-movie battering with ads, and then, during the movie itself, the cell phone conversations of gadget-addicted oafs. It is diminished further still when instead of a theater screen you see a movie on a TV in your well-lit living room, with all the distractions of a household about you and a finger on the pause button, or worse still, on the even smaller screen of a laptop or handheld device you watch on a crowded bus or train during your daily commute to work.

These newer ways of watching a film are today far more characteristic of how we go about that activity. In the 1930s and 1940s a clear majority of the population of the U.S. (65 percent) was at the movies each week, which worked out to the average American going to the movies more than thirty times a year.1 By contrast, the average American went four to five times a year through the 1980s, 1990s and 2000s, to go by per-capita ticket sales. Additionally, the time they spend in the theater on those rarer occasions has been reduced by the changed practices of exhibitors (like the abandonment of such parts of the experience as pre-movie cartoons, shorts, newsreels and double-features). The result is that they spend only a small fraction of the time they used to looking at movie screens, perhaps ten hours a year – much less time than they now spend in front of a television or computer screen in a typical week, or even day (where, among other things, they watch the vast majority of the movies they take in).

It might be added that the places where we watch our movies, and the technologies we watch them with, are not the only ways in which the experience has changed. We are far less likely to experience a film as a shared cultural moment, the movie that "everyone" sees increasingly a rare thing now – the distance between highbrow and lowbrow, between old and young, and persons split by innumerable other cultural and demographic divides ever greater. And of course, those diminished moments seem ever more fleeting, as movies come and go ever more quickly. Even an "event" film is likely to be "#1" at the box office for one week, or two, before attention has shifted to the next not-so-big thing.

And film as a whole has given ground to other media, the differences with which have been blurred by more than the smallness of the screens on which we watch them. Those who said in the 1950s that television spelled the end of cinema spoke prematurely, but they were right in identifying it as a very formidable, perhaps in the long run unbeatable, competitor, one that has long been closing the gap with movie production in technical accomplishment and thematic ambition. In the process it has stolen an increasing portion of film's thunder artistically and commercially, a point evident even in video sales (DVD having made television shows a force here such as they never were in the day of VHS). At the same time, one might wonder if the moments which saw the medium of film revolutionized – new ground broken, or something presented to us that we really hadn't seen before – have not grown scarcer, and owed much more to the genius of the FX artists who have topped themselves year after year in films like Jurassic Park, the Lord of the Rings trilogy, and Avatar than to anything else. Where other aspects of cinematic storytelling are concerned, the medium seems increasingly backward-looking and self-referential, with its remakes and reboots and homages and other kinds of recycling, its movies about movies and the people who make them, when film doesn't seem to be turning into something entirely different (as Jonathan McCalmont suggested in his review of Transformers 3, subtitled "The Ambivalence of the Metallic Sublime").

Indeed, in an age in which iPods are treated like life support devices, iPads and laptops are always in hand, and cell phones are never off, I would suggest that film is for many (whom one suspects increasingly find the devotion of their full attention to a two-hour narrative a strain) just one more source of sound and image – like YouTube videos of low-IQ amateur stuntmen – on which to draw for the highly personalized multimedia blitz in which they swaddle themselves during their waking hours.

The Cult of the Movie Star
One might add to the above the familiar lament that the stars of today fall short of their predecessors in the day of the Dream Factory. Once again, there tends to be a heavy element of nostalgia in such talk, but there still seems to be something to it nonetheless.

Certainly stars in the old molds, stars like the big box office draws of the studio era, seem at the least an improbability today. In their day stars were held up as exemplars of some contemporary, yet archetypal, ideal, in a way that seems to have been possible only in the hothouse of a more genteel screen-world with very particular attitudes toward such matters as social class and gender. The machismo of Clark Gable and Errol Flynn and Humphrey Bogart, the sophistication of Cary Grant and David Niven, the innocent sweetness of Audrey Hepburn and Marilyn Monroe, all existed in that context – and became impossible when it disappeared. The social signifiers out of which a distinctive, appealing screen image might be built up seem ever-shakier things, while the ideals they would convey have grown far less certain. (Take, for instance, the idea of gentlemanly sophistication. As Jeremy Black observed in The Politics of James Bond, there has long since ceased to be "a secure basis" for its representation, while it has been increasingly unclear that this is a desirable thing for, say, an action hero to display.) At the same time, the gritty, the crude, the scatological have become pervasive, and entail indignities incompatible with that larger-than-life, better-than-life quality stars are supposed to have (like being on the butt end of toilet humor).

Whatever the prospects for reconciliation between old-fashioned stardom and these changed cultural expectations, the classical idea of the star seems irreconcilable with the reality that today's biggest names tend to make their names by appearing in the very productions (indeed, series of productions) most prone to subordinate, even overwhelm, plot, character – and star – with concept and spectacle. Consider, for instance, Shia LaBeouf as he shares the screen with Autobots and Decepticons, or the number of actors whose biggest movies have them playing an already established, iconic pop cultural figure likely bigger than they can ever hope to be – like Christian Bale in his Batman films. The result is that while it is often lamented that the age of "high concept" has meant fewer good roles for women, it may have meant fewer good roles for anyone in the sense of a role giving an actor a chance to really make their mark in, let alone dominate, a major film – as a star is supposed to do. Stardom seems irreconcilable, too, with the fact that, just as our taste in subject and style is more fragmented, it is tougher than ever to find actors with anything approaching a universal appeal, the Big Names we have instead reminding us of just how divided we are. (Think, for instance, of how men and women tend to respond to Sarah Jessica Parker and Megan Fox, respectively.)

The changing character of publicity doesn't help. The hypertrophied portion of the media living off of public interest in the doings of the rich and famous – E!, TMZ and the rest of the multimedia paparazzi machine – has overloaded us with celebrity gossip-themed infotainment. And much of this constant coverage presents stars and would-be stars as slobs and oafs and worse as we see them going into or coming out of rehab or cop cars or courthouses, or watch "experts" minutely scrutinize the signs of their plastic surgery and their weight fluctuations and their emotional outbursts. There is, too, the Great Leap Into the Sewer of the reality TV revolution, which has turned a whole category of skuzzy, scummy snots into "stars." Meanwhile "real" (i.e., movie) stars routinely get reality shows of their own – on their way down, the indignities of their fall no longer taking place out of sight, but instead served up to their fans in a desperate grasp at another fifteen minutes of fame. That abundance of unflattering material not only makes it much harder to support some carefully cultivated persona, but is a constant reminder that stardom is itself a show – and while this was just as much the case circa 1930, it seems we have grown too conscious of the fact for any amount of postmodern reveling in superficiality for superficiality's sake to fully repair the damage to this delicate illusion.

What remains of old Hollywood-style glamour is, far more than before, shared with the worlds of music and fashion, the allure of which has often eclipsed Tinseltown's luster. The '90s was the heyday of the supermodel, but something of the fascination with their industry endures in ways large and small. Much like the Oscars, the Victoria's Secret Fashion Show is an annual television event, broadcast on a major network in prime time. (At a mere one hour in length, offering lingerie models and that aura of luxurious hedonism the brand's promotion excels at conveying, it is rather a different approach to entertainment – which, despite drawing just a fraction of the Oscars' viewership, was judged to have had very good ratings last year.) Meanwhile, Christy and Linda and Cindy and the rest of the faces of haute couture from the supermodel's glory days are still on magazine covers and television screens (in commercials selling anti-aging products they credit with keeping them forever young), while Carla offers another reminder of that era every time news cameras are pointed at the First Lady of the French Republic.

This, too, adds to the difficulties in the way of attaining the larger-than-life, better-than-life quality expected of the star, even with the biggest and most sophisticated of publicity machines squarely behind them. Unsurprisingly, stars attain it less and less often, with the result that the "A-list" is in decline, and bankability is fading as a meaningful concept (as Matthew Belinkie suggests in his post "The End of Movie Stars?"). Indeed, it often seems to me that the function the casting of stars serves in the moviemaking process is less about capitalizing on their draw than about convincing the audience, through the inclusion of well-known (and therefore costly) performers, that a particular film is indeed a major production deserving of their time and money.

Thinking of all this I remember hearing the common refrain among British monarchists that Britain "needs" the royal family because it "doesn't have a Hollywood of its own." I've always found this an unbelievably stupid thing to say, for a multitude of reasons, above all the claim that celebrity worship is some deep-seated human need which can all by itself justify feudal political anachronisms. But it's also the case that even Hollywood isn't what it used to be.

The Centrality of Hollywood
These developments reinforce each other, the diminished status of film making movie stars seem smaller, and the weakening of stars' celebrity changing the way in which audiences respond to film. At the same time, even where film is concerned, Hollywood seems less and less the center of the entertainment universe.

Certainly where the U.S. box office is concerned, during the whole decade of the 2000s only one foreign-language film (excepting Mel Gibson's The Passion of the Christ and Apocalypto) grossed more than $100 million, only five more than $20 million, and only seventeen more than $10 million, according to Peter Knegt of Indiewire. This makes for an average of less than two a year hitting the $10 million mark, with the total from all these films typically accounting for well under 1 percent of the revenue from ticket sales. Moreover, even these movies tend to have been made by directors with some Hollywood experience (like Ang Lee with Crouching Tiger, Hidden Dragon, and Guillermo del Toro with Pan's Labyrinth), with actors known to American audiences from their appearances in Hollywood movies (like Chow Yun-Fat and Michelle Yeoh in Crouching Tiger) – the foreign films that make it even this far generally those that seem least foreign to Americans. And of course, where global moneymakers are concerned, the U.S. still remains far and away Number One, year in, year out, at home and abroad.

And yet, there are ways in which Hollywood is weaker than it appears. There are sizable markets where domestic productions have cut into Hollywood's share of the annual lists of top-earning movies in the past decade, like Japan and South Korea – a fact that has drawn surprisingly little commentary. It also tends to be forgotten that while Hollywood continues to have a near-monopoly on the megabudget blockbusters, a great deal of American production does not travel very well – like Will Ferrell's comedies. (The movie widely regarded as his biggest hit, Talladega Nights: The Ballad of Ricky Bobby, earned over 90 percent of its ticket sales domestically, and much the same has been the case with his other hits Anchorman: The Legend of Ron Burgundy, Blades of Glory and Step Brothers.)

Additionally, it seems that the traditionally insular American market, so much more accustomed to exporting rather than importing pop culture, has become more permeable. It is not only the case that British imports have long been a routine part of what Americans see on their screens, or that today American and Canadian production are increasingly difficult to distinguish from one another. Spanish-language networks like Univision and Telemundo beam product from across Latin America into a great many American homes, and MHz Networks brings far more diversity than that to cable viewers. The profile of the Hong Kong film industry that produced Bruce Lee, John Woo, Jackie Chan and Jet Li has fallen, but Japanese anime and the Korean Wave and Bollywood are strong presences – more cult than mainstream admittedly, but still a part of the landscape. Equally, the originals of the foreign movies Hollywood remakes so frequently are easier to access – typically on cable and video, so that a focus on ticket sales understates their impact, which is not wholly confined to snobbish intellectuals, hardcore fanboys or immigrant communities. This all makes Hollywood seem a smaller place, one that often comes off as self-important and even provincial in its attitudes – not least in the way the whole of production from outside the English-speaking world is typically segregated into a single Best Foreign Language Film category at the Oscars (a far cry from the cosmopolitanism of the Cannes Film Festival). The more sophisticated viewers, those most likely to be appreciative of film as art form and tradition, are also those most likely to see Hollywood's limitations.

A Generation Gap
Every section of the filmgoing audience has been affected by these changes, but these attitudes are more evident among younger viewers than the rest. It is they whose experience of film is least connected with the theater, they who make the fullest use of the new media technologies.2 (Senior citizens love their cell phones as much as everyone else, but I generally don't see them watching videos on such devices.) It is they whose take on stardom and celebrity most fully reflects the age of high-concept, tasteless gag and reality TV. (Indeed, a crucial reason why newer actors are failing to make it as stars may be that this demographic simply can't be sold on the concept.) And it is they who are most prone to look beyond America's movieland not just for their images of glamour, but their entertainment.

It is worth noting, too, that younger filmgoers have virtually no cinematic memory, despite their unprecedentedly easy access to old movies via video, cable and the web. Walk into a college classroom and mention just about any film from before the last decade, for instance – let alone the stuff that used to run on AMC back when those letters stood for "American Movie Classics" – and with very few exceptions you will get blank stares. (I have even had this experience in university classrooms that I knew had more than their fair share of film majors.) Not only do they not share the nostalgia of (mostly) older film fans; they have little consciousness of exactly what it is for which older filmgoers feel such nostalgia.

Consequently, the Oscars seem just one more awards show in a whole season of them, all minutely covered in the press – and a comparatively remote one at that in its standards and mechanisms of selection, next to less prestigious but more audience-oriented – and youth-oriented – presentations like the People's Choice Awards, or the MTV Movie Awards. In many a case, younger viewers haven't even heard of the movies up for the little statues, simply because they don't see those sorts of films, and those among them who have artier sensibilities are likely to feel that the Academy overlooks their particular favorites (as Julie Gray confesses has been her experience). And all of these awards shows put together appear just one part of the ever-more visible life of the worldwide media-industrial complex (every other quarter of which has its own awards shows, from TV commercials to video games).

None of this is to say that the end of film, or the end of Hollywood, or even the end of the Oscars as an annual event, is at hand. Rather it means they are part of a scene in which cultural influence is more widely distributed; in which film, Hollywood and the Oscars remain, but share their prestige, their power to excite, their claim on audience affections with other media. Additionally, given that the trends that have already wrought the change – like the fragmentation and globalization of media, and the increasing ubiquity of media itself – seem likely only to continue progressing (or even accelerate) in the years to come, it seems that this diffusion and complexification will only continue as time goes by.

1. The data on movie attendance has been taken from Michelle Pautz, "The Decline in Average Weekly Cinema Attendance," Issues in Political Economy 11 (2002). Accessed at http://org.elon.edu/ipe/pautz2.pdf. The figures on per-capita ticket sales were calculated from data from Box Office Mojo.
2. John Anderson observed that on many occasions during this year's ceremony, "participants reflected on movie going as something they remembered fondly from their childhoods. They might have been talking about the Civil War."
