Sunday, September 25, 2011

New Review: The Politics of James Bond: From Fleming's Books to the Big Screen, by Jeremy Black

Westport, CT: Praeger, 2001, pp. 227.

I initially became familiar with Jeremy Black through his historical writings, which include a number of excellent works (like his critique of the state of military historiography, Rethinking Military History, which I reviewed at my other blog). It later came as a pleasant surprise that he'd written a study of the James Bond series. Black examines both the novels (the original works by Ian Fleming, as well as the series' continuation by Kingsley Amis, John Gardner and Raymond Benson) and the films (focusing on the EON series, but also offering briefer mentions of 1967's Casino Royale and 1983's Never Say Never Again) as these stood at the time of writing (early 1999).

As the title indicates, the politics of the series' evolution is the main theme of that examination. Black certainly does the race/class/gender stuff fashionable in academic literary criticism, but his writing in this area is better grounded than most such studies, and virtually free of obfuscating jargon – in part, I suspect, because as a historian rather than a professional critic he is less prone to the failings of literary theorists. He also has a strong sense of the appeal of the works, never forgetting that they are more than objects in which to root around for ever subtler instances of yesteryear's prejudices. His background as a historian also makes him particularly well-positioned to deal with the evolving geopolitics of Britain's position during the period under study.

Still, there are ways in which the book could have been stronger. There are a number of points where I felt a subject called for fuller treatment, from Fleming's own personal background (certainly a significant influence here) to the scripting of The Spy Who Loved Me (during which the question of how to approach the period's politics was a major issue). Additionally, any book about a still-continuing series like the Bond novels and films dates quickly. The fact that the book's analysis ends prior to the release of the last two Pierce Brosnan-starring Bond films, and the "reboot" of the series when Daniel Craig replaced him in the role (and for that matter, the contemporaneous real-world political changes such a work might have commented on, like the War on Terror), means that it leaves a significant amount of territory uncovered.

The result is a book that falls short of being definitive, but which is still well worth the while of anyone interested in the evolution of the franchise over its nearly sixty years of existence.

Wednesday, September 14, 2011

The 2011 Summer Movie Season

With Labor Day behind us, the summer movie season is over.

Once more, the published statistics present a contrast between rising revenues and stagnating attendance, with the last two seasons regarded as the weakest since the late 1990s (1997 and 1998 having been similarly lackluster).

The backlash against 3-D (and 3-D surcharges) seems to be continuing, even as individual 3-D movies (like the last Harry Potter and third Transformers films) succeed with audiences. There would also seem to be signs of dissatisfaction with the endless parade of sequels and reboots, though the biggest of them did well – the aforementioned Harry Potter and Transformers films, and, at least when considered internationally, the fourth Pirates of the Caribbean movie, each made more than a billion dollars. Perhaps there are signs of fatigue with superheroes as well: three of the four superhero movies this summer enjoyed respectable grosses, but nothing comparable to the Spider-Man and Iron Man series, or The Dark Knight – though Thor was an exceptionally strong overseas performer, earning almost 60 percent of its $450 million take internationally.

The slow summer may also be indicative of the limited salability of the "retro" science fiction that was so abundant this past summer (like the science fiction Western Cowboys & Aliens, the World War II-set Captain America, the 1960s-set X-Men: First Class, and the 1970s-set Super 8). (Indeed, the Box Office Guru has speculated that the plenitude of this material might have been a factor in the poor turnout among the traditional target audience for the summer blockbuster, young males, hardly a solid audience for period pieces.)

However, it is also the case that hesitant moviegoers know they will be able to rent or stream the films now in theaters in a matter of months – in the comfort of their own homes, and rather more cheaply than at the local multiplex, which are not small things. The unpleasantness of much of what goes along with the movie-going experience, like the massive, draining assault of commercials and previews viewers are subjected to before the film they actually came to see begins, and the company of the mental defectives who won't turn off their cell phones, offers plenty of reason to keep away. So do the ever-rising prices of tickets, transport and concessions (the last universally recognized as long having been obscene). Especially in tough economic times (and remember, it's not that we're headed for another recession; it's that we never really left the last one), consumers are more conscious of the gap between the cost of that trip and the pleasure it actually affords, especially in a season in which the fare is so much less exciting to them than to the "high concept"-obsessed Suits apparently incapable of reading the scripts they greenlight.

Still, for all of the angst over the numbers, this would hardly seem to be a turning point, especially when one extends the consideration from the summer season to the year as a whole. Despite all of the changes in the media market during the last three decades (the explosion in the number of channels, the proliferation of videotapes and discs, the advent of the Internet), annual ticket sales have consistently averaged 4-5 per capita in North America. The ups and downs of the film business these last few years are safely within that range, while the increasing profitability of the global market reduces the risk attendant on churning out the next $300 million sequel no one asked for. Naturally, there is ample reason to think the crassness and stupidity seen in 2011 will continue for a good long while to come, even if the current downturn proves to be a deep and meaningful shift in the market.

Monday, August 8, 2011

A Revolution of Falling Expectations: Whither the Singularity?

By Nader Elhefnawy
Originally published in the New York Review of Science Fiction 23.9 (May 2011), pp. 19-20.

Science fiction readers and writers began the first decade of the twenty-first century preoccupied by the prospect of a technological Singularity, a race toward tomorrow at such a speed that even the very near future was held to be unknowable – a vision epitomized for me by the stories of Charles Stross's Accelerando cycle. It seems we ended it preoccupied by "retro" science fiction in various forms, looking not to what lies ahead of us, but to what earlier generations may have thought was ahead of them, and the different ways history may have unfolded since then, viewed with more or less irony or nostalgia – steampunk, dieselpunk, and all the rest – even as the Singularity enters the literary mainstream in novels like Gary Shteyngart's Super Sad True Love Story.

Maybe that is how we will look back on the Singularity one day, the same way that we now look back at flying cars and personal jetpacks. I sometimes think that day is not far off. After all, the last ten years have seen a profound lowering of expectations since the mid- and late 1990s when technological optimism hit a recent peak. Those were the years when futurists like Vernor Vinge, Hans Moravec and Ray Kurzweil seemed to be handing down cyber-prophecies from on high, and finding a wide receptive audience for the claims most dramatically related in Kurzweil's 1999 book The Age of Spiritual Machines. Their ideas evoked anxiety as well as hope, and they drew challenges as well as enthusiastic acclaim, but this line of thought was certainly fecund and vibrant.

That this should have been so seems unsurprising in retrospect. The late '90s saw the extension of the "digital age" to whole new classes of consumers as use of the personal computer, the cell phone and the Internet exploded, while the American economy achieved growth rates not seen in a generation. There was, too, the associated shrinkage of the budget deficit, the reports of plunging crime rates, the apparent irrelevance of so many of the problems prominent in the "declinist" rhetoric fashionable before 1995. The worries about deindustrialization that filled national debate for several years seemed especially passé, Rust Belts and trade deficits irrelevant. As the boom was held to demonstrate, economics was about information, not the making of clunky, old-fashioned things.

More than anything that actually happened, however, there was the euphoric promise of much, much more to come. In The Age of Spiritual Machines Kurzweil predicted that by 2009 we would have self-driving cars traveling down intelligent highways, or when we preferred not to go out, immersive virtual reality in every home providing an almost complete substitute. He predicted "intelligent assistants" with "natural-language understanding, problem solving, and animated personalities" routinely assisting us "with finding information, answering questions and conducting transactions." He predicted that we would put away our keyboards as text became primarily speech-created; that our telephones would have handy translation software that would let us speak in one language and be immediately heard in another; that paraplegics would walk with orthotic devices, the blind put away their canes in favor of speech-commanded electronic navigational devices, and the deaf simplify their lives with equally capable speech-to-print technology. To top it all off, he predicted that working molecular assemblers would be demonstrated in the lab, laying the groundwork for an economic revolution unlike all those that preceded it.

In those heady days the stock market was going up like a rocket (anyone remember James Glassman and Kevin Hassett's Dow 36,000?), restless employees fantasized about early retirement as their savings accounts swelled (or even quitting their jobs to become day traders), and would-be entrepreneurs eyed cyberspace as a bold new frontier (where it seemed like a company consisting merely of a web site really could be worth a billion dollars). The idea that prosperity would go hand in hand with ecological sanity enjoyed great currency, since the information economy was supposed to make natural resources much less important, something that was easier to believe with oil going for $10 a barrel. Besides, if worse came to worst and the world got a little too warm for our liking, we could just have our nanites eat up all the pesky carbon molecules trapping excess heat in our atmosphere, and spit them back out as the buckytubes with which we would build the skyhooks launching us on a new age of space exploration.

In short, all the rules of the game that made economics a dismal science seemed to have been repealed, and we were all supposed to be bound for globalized cyber-utopia. Yet the tech boom did not go on forever – far from it. Consumers did get new toys, courtesy of Apple. But there's no denying that the iPods and iPhones and iPads to which so many consumers thrill are a far cry from self-driving cars and immersive VR (which Kurzweil has since quietly and cautiously deferred from 2009 to the 2020s), let alone Eric Drexler-style "engines of creation."

Indeed, it would seem that the developments of the last decade represent little more than a refinement of the technologies that arrived in the '90s, rather than a prelude to those wonders that have yet to arrive, or to endless good times. Not only was the bull market on which so many hopes were pinned really and truly gone after the tech bubble burst, but that disappointment set the stage for the even bigger crisis in which we remain mired, resulting not from an overexcited response to some new technology, but from speculation over that most old-fashioned of goods, real estate. After it was all over, the first decade of the 2000s saw only anemic GDP growth in the United States – a mere 1.5 percent a year when adjusted for inflation. (That's about a third lower than we managed in the much-maligned 1970s, and comparable to the 1930s.)

The picture was even worse than the growth numbers suggested, our deeper problems never having gone away. Debt and deindustrialization clearly had not been rendered irrelevant, as our swelling trade deficit and the plunging value of the dollar showed, and neither had natural resources. Despite the mediocre rate of national economic growth, commodities prices shot up, way up, oil hitting $150 a barrel. Americans grumbled at the expense of paying more at the pump – but in less fortunate parts of the globe, the result was nothing short of a crisis as tens of millions were pushed over the line into hunger, and deadly food riots made the headlines. And while the 1990s had wars, terrorism and fears about weapons of mass destruction, the September 11 attacks and the subsequent length, intrusiveness, divisiveness and cost in blood, treasure and much else of the wars waged in their aftermath have been a rude and powerful shock. (The changing experience of commercial flying, a relatively small matter when all things are considered, by itself seems enough to refute the image of the world as a place of borderless convenience.)

There's no shortage of ideas for fixes, but in an echo of the history of many a past troubled society that failed to halt its decline, those ideas remain largely talk, the rate of actual progress maddeningly glacial, or even negative. This is not to say that we don't expect change, however. We do expect change, plenty of it. But we expect ever less of it to be benevolent, or even to include much that can be thought of as exciting. Pundit Michael Lind has gone so far as to dub the first half of the twenty-first century "The Boring Age." Given what we were promised, The Boring Age is necessarily an Age of Disappointment. Given the challenges facing us, the failure to grow and innovate seems all too likely to make the Boring Age also an Age of Failure and an Age of Catastrophe – and not the kind of ultra-high-tech catastrophe that those credulous toward but anxious about a Singularity (like Bill Joy) have in mind, but the far more banal ends we have dreaded so much more for so much longer. (The word "Malthusian" comes to mind.)

The claims by writers like Lind still go against the conventional wisdom, of course. A great many people are very enthusiastic about the implications of the latest consumer products, and dismissive of the failure of the grander promises to arrive. Still, I can't help being struck by how the production of original claims and arguments for the technological Singularity supposed to signify such change has fallen off since the tech boom years. Surveys of technological futurology like defense analyst Peter Singer's Wired for War increasingly seem to go over and over the same, aging list of authors and works (Kurzweil, Vernor Vinge, Hans Moravec, and the rest), rather than presenting the new ideas and new advocates one might expect if this line of thought were more deeply influential. Indeed, I have found the vigor of the critics of Singularitarianism far more striking as of late, with talk of a backlash against the Singularity recently prominent even within science fiction, the word going from selling point to liability in book titles. In place of techno-optimism, Paolo Bacigalupi's portraits of eco-catastrophe seem more representative of our moment.

Of course, social, economic and political trends may not be the whole story. Thesis invites antithesis, and there is no question that the associated tropes have been heavily used for quite a while now, so that even as good novels continue to be written about the theme of runaway technological change sweeping us toward the realization of new dreams (and new nightmares), a case can be made for letting it lie fallow a while for purely artistic reasons. Still, looking back, science fiction seems to have been more vibrant in boom times, less so in periods of bust – perhaps because the genre flourishes when utopian hopes are in tension with the inevitable dreads, rather than surrendering the field to them. Just as history and economics may lean toward cyclical theories in bleak times, such doldrums seem conducive to gloomy scenarios of thwarted progress, or escape into outright fantasy, in our speculative fiction.

Even without going so far as that, with the Singularity rebuffed we are shoved away from our most extreme visions of a world transformed. The alternatives being less radical, and most of them quite amply treated by earlier writers, they cannot but appear old-fashioned, and as a practical matter their handling tends to result in a recycling of yesteryear's ideas. (Even Vinge wasn't able to get away from old standards like nuclear war and Gunther Stent-like "golden ages" in his 2007 seminar "What if the Singularity Does NOT Happen?") The temptation and tendency to look backward rather than forward is all the stronger since a self-consciously retro approach is all about the relationship of the present with the past. Whether the purpose is to take refuge from a frightening, alienating or unpromising present (or future) in a romanticized past, imagine other directions we might have taken (or try and understand the one we did take), search where we came from for clues to where we may be going, or simply savor the aesthetic and other pleasures of creative anachronism, the pull of stories about a modified yesteryear is powerful indeed under such circumstances.

Still, even if the Singularitarians prove to be wrong about particular theories (like Kurzweil's "law of accelerating returns") or the precise timing of particular innovations (like the prediction that mind uploading will be practical circa 2045), they may yet be right about the broader direction of our technological development. It is worth remembering that artificial intelligence research has been an area of repeated booms and busts, the raising of high hopes followed by crushing disappointments that give way to a renewal of optimism some time later, for at least half a century now, AI spring following AI winter. It is worth remembering, too, that Singularitarian thinking did not begin with Vinge, Kurzweil and company, going back as it does not only through the whole history of science fiction, but serious thinkers like I.J. Good and J.D. Bernal, and even before them.

For as long as there has been a human condition, human life has been defined by the struggle against its limits, and the aspiration to transcend them. A straight line can be drawn from the builders of the splendid tombs of antiquity (if not their unrecorded, nameless predecessors), through the alchemists' pursuit of the philosopher's stone, to the Renaissance humanism celebrated in Pico della Mirandola's "Oration on the Dignity of Man," down to the debate among William Godwin, the Marquis de Condorcet and Thomas Malthus, to the Russian Cosmists who were the first to articulate a truly posthuman vision, and, only after many more evolutions, the Singularitarians who made such a splash in the 1990s. It seems inconceivable they will be the last to espouse the idea, or to claim that the Big Moment is close at hand.

In short, that more basic idea is more likely on the back burner than on its way out. A comeback is all but certain, both inside and outside the genre, and perhaps sooner than we can guess.

Saturday, July 9, 2011

Of Mary Sue and Gary Stu

The term "Mary Sue" (and its male counterpart, "Gary Stu") typically refers to idealized characters that seem to function principally as wish-fulfillment fantasies, implying one-dimensionality, and a narrowed range of dramatic possibility.1 It is also generally regarded as pejorative, and the proclamation of a character as a Sue or a Stu widely taken for an attack.

Still, Sues and Stus are pervasive in fiction, across the media spectrum. After all, a writer putting a version of themselves, but better, to paper is likely to be well inside Sue/Stu territory. The same goes for a writer creating an idealized figure with which they hope their reader will identify in such a way. A writer setting out to create a character who is a role model, a "positive counterstereotype," or anything else of the kind has to be very careful indeed if they don't want to cross that line.

One may go so far as to say that the Sue/Stu is a default mode for fiction, so standard in some stories that we barely notice the Sue-ness. After all, what does the average television show offer? Attractive people saying polished, scripted things, usually while wearing nice clothes and sitting in handsomely furnished rooms. The world of the TV professional--the doctors and lawyers who give the impression virtually no other kind of work exists--tends to be a fantasy of competence and glamour, with most of the ethical rough edges rubbed off. The TV physician or surgeon is always brilliant, and always deeply concerned with the care of the patient; the worst that can be said of them is that they're cranky or eccentric, like John Becker (Becker) or Perry Cox (Scrubs) or Gregory House (House), and even that crankiness is often presented as part of their excellence--for instance, in their demanding the best from their subordinates, or their willingness to buck the system to save a life. From the deadpan Law & Order to the strutting, speechifying (and surprisingly musical) inanities of David E. Kelley-produced dramedies like Boston Legal, the TV attorney is little different, and usually better dressed, their well-cut suits rather more becoming than surgeons' scrubs.

It's not a real doctor or lawyer viewers are seeing, but the fantasy version of what they do and how they live--and it may well be that some write their Mary Sues and Gary Stus defensively, to forestall the "minority pressure" (to use Ray Bradbury's term) that would descend on them if an influential group (such as doctors and lawyers happen to be) were to be offended by a more realistic depiction.

Leave aside the upper middle-class vision of life predominant in TV drama for a moment, and consider the action hero instead. Most of these have been the author's Gary Stu. Robert E. Howard had Conan, Ian Fleming had James Bond, Clive Cussler had Dirk Pitt, and Tom Clancy had Jack Ryan and Mr. Clark. Ex-Navy SEAL Richard Marcinko was his own Gary Stu, picking up where his autobiography left off with a series of action-adventure novels--and even a first-person shooter--in which Richard Marcinko himself is the star. Such characters are smarter, stronger, tougher, braver and generally better than ordinary people because they have to be; they wouldn't survive what their authors put them through otherwise. (It takes a cartoon character to make it in a cartoon world.) Very often they have ridiculously large portfolios of skills, because it's the only way the writer can keep them at the center of the plot, moving it along, rather than having to cede the spotlight to other characters.

In fact, outright geniuses in all fields are a dime a dozen in such fiction, and their intellectual feats--the lightning-fast hacks of computer systems they've never seen before; the immediate improvisation of effective on-the-spot solutions to world-ending crises; the overnight, single-handed creation of futuristic inventions that in real life defy the scientific-industrial complexes of the most powerful countries on Earth--bear about as much relationship to the capabilities and accomplishments of real-life geniuses as the physical feats of comic book superheroes do to those of the greatest athletes. (I suspect few appreciate the latter point, though. Unbelievable as it may seem, the idiocies of Eureka's scripts reflect a widespread view of how scientific R & D actually goes, and there are actual working engineers who lament that their employers will never give them the chance to be Reed Richards.)

In short, Mary Sues and Gary Stus do have their weaknesses, especially when judged by a standard of realism and character drama, but they certainly have their uses and their places, which is why writers write them, and why so many readers gravitate to them (or at least, to stories featuring them). Unfortunately, writers don't always recognize those uses and places, and make a botch of things by placing them in an inappropriate context--or simply by writing the character badly.

The truth is that a good Sue/Stu is just as tough to write as a more realistic character, and perhaps even tougher. It's not easy to make such characters likable, let alone relatable. It's not easy to sustain suspense when the hero's unbeatable and indestructible. And all too often, the execution ends up lacking subtlety or nuance. The writer pushes things too far. We end up with a character who is not just a talented winner, but someone who is talented at everything, always right, always gets the last word, always one-ups everyone else, always gets away with everything, is always being fawned over for their abilities and general wonderfulness--and it's quite a feat to keep this from getting wearisome in a hurry, one that exceeds the skill of many an author. In other cases the writer makes the character a surrogate for their nastier impulses. We end up with a braggart, a jerk, a bully or worse, and so we sympathize, and empathize, with the people who have to put up with them instead of our ostensible protagonist.

Nonetheless, even in the best of cases, the plain and simple truth where fantasy is concerned is that we don't all have the same fantasies. Quite the contrary, others' fantasies very often touch our insecurities, our bitter memories of rejection or failure, our deepest, pettiest resentments and antipathies; we can experience someone else's wish-fulfillment as an assault on our own wishes, and even our own being (and it must be admitted, a Sue or Stu is sometimes conceived as exactly that, the wishes to be fulfilled by them not pleasant for others, or the Other). And at the same time, we can be irrationally offended when someone criticizes our favorite Sues and Stus.

In principle, there's no reason why one can't enjoy their own Sues and Stus, recognize the right of others to the same, and still be annoyed by particular fantasy characters, still call characters what they are, and respond appropriately to genuinely pernicious Sues and Stus. Alas, the elevation of gut feelings to moral principles by identity politics, and all the irrationality and hypocrisy and viciousness that go with this, have made the unavoidable wrangling about Mary Sues and Gary Stus far more bitter than it has to be, so that in many a situation one hesitates even to verbalize the identification. (Call it the "politics of personal fantasy.") It's yet another way in which real life is unkind to fantasy--and makes unalloyed fantasy a thing one shares at one's own risk.

1. The 1974 short story "A Trekkie's Tale," a parody of such idealized characters in Star Trek fan fiction, is the source of the term.

Monday, March 28, 2011

Fantastique Unfettered #1, Winter 2010

Reviewed By Nader Elhefnawy

Originally published at The Future Fire Reviews, March 24, 2011

M-Brane Press, the publisher of the science fiction magazine M-Brane SF, launched a fantasy counterpart to that publication last year, Fantastique Unfettered. Fantastique Unfettered (or FU), under the editorship of Brandon H. Bell, has as its stated purpose the "publication of well-written, compellingly readable, original stories of fantasist fiction," both short fiction and poetry, "unfettered by traditional copyright," so that all its content carries a Creative Commons CC BY-SA license.

The authors appearing in the premier, Winter 2010 issue of the publication (which has eleven short stories and three poems in its 140 pages) offer a wide range of approaches and settings. Perhaps exemplary in this respect is the story to which the issue's cover art is devoted, Michael J. Shell's "The Death of a Soybean," which presents an off-the-wall alternate version of the Manhattan Project and World War II. More a uchronia than an alternate history, "Soybean" surreally scrambles the events of our timeline rather than exploring a counterfactual scenario, with J. Robert Oppenheimer just a Los Alamos security guard who happens to be eccentrically preoccupied with an idea called "nuclear fission," and a femme fatale physicist with the unlikely name of Maladi scheming, seducing and killing her way to fame, fortune and a place in scientific history.

Offering a nightmare complement to Shell's noirish dream is Kaolin Fire's "The Aetheric God," in which a young technician named Asher spends his days building steam-men for his employer, "Chief Technician" Father Isaiah. He spends his nights hiding in the cathedral's library, desperately burying himself in its books to try and quiet "the voice of God within his head" calling for Asher's mutilation and destruction, a crisis that soon enough moves out of his head and into the physical world. Going in a sharply different direction from either is Alan Frackelton's "A Blessing From the Blind Boy," the story of a disgruntled gaucho named Juan Hernandez who burglarizes the mansion of his ruthless landowner employer somewhere (and somewhen) in twentieth-century Latin America, putting Hernandez's young son Ramon at the center of a cycle of revenge, loss and longing. In a lighter, more fanciful vein, Rochita Loenen-Ruiz's "Breaking the Spell" (a reprint from Philippine Speculative Fiction 4) has for its protagonist a little girl who becomes fascinated with the miniature world her father keeps under a bell jar. While her father's fairy tales never ring true for her (she is "determined not to kiss a prince"), entry into that little world becomes the object of her own fairy tale quest.

However, in contrast with such exoticism, the issue on the whole favors contemporary contexts, and, compared with the world-changing (and rather nihilistic) events of Shell's story or the intense confrontation with the supernatural of Fire's, subtler uses of speculative elements inside quieter, more personal stories. The descriptor that came to mind when I read Frank Ard's "Small Fish in the Deep Blue," the story of a love triangle between a man, a mer-man and a woman, is "slipstream." Others incorporate surreal intrusions into what might otherwise be a realist narrative, as in Mary J. Daley's "The Book of Barnyard Souls," in which a young farm girl named Kalee receives nightly visits from the souls of deceased animals; Natania Barron's "Without a Light," in which a sixth-grade teacher in a small town starts an affair with a mysterious colleague; and Elizabeth Creith's "Five Oak Leaves," where a man encounters a young changeling girl living on the street. In Anna Manthiram's "Boris," a meditation via fortune cookie-like clothing tags on the titular character's involvements with various women; Christopher Green's "Holding Hands," in which a Vietnam veteran encounters a girl he left behind at thirteen many years later in his wife's ballet studio; and Michael J. Deluca's "The Driftwood Chair," in which a man roams the beach trying to cope with the loss of a love, it is possible to blink and miss the speculative touch.

By and large the sensibility is 'literary,' and the quality is high (the two, of course, not always the same thing), virtually all the stories assembled here working, though to different degrees and in different ways. "Death of a Soybean" succeeds on the strength of its pacing and strangeness, Fire's "The Aetheric God" on the nightmarish force of the telling. The poems offer similar grandiosity, particularly Bruce Boston's rich, dark, chaotic "The Time Traveler Leaves History Behind" and Alexandra Seidel's glittering "In Babel." Daley's touching "Barnyard Souls" is the most emotionally resonant story in the volume, though the pieces by Frackelton, Creith and Green also succeed on this level.

That combination of quality and variety means that Fantastique Unfettered #1 offers something for many different tastes, in what seems to me a very promising start for the new publication.

This review is released under a Creative Commons Attribution-ShareAlike license, (c) Nader Elhefnawy. You are free to republish this review anywhere you like, so long as you give attribution to the author and to The Future Fire and keep this license text intact in any copy.

Tuesday, February 22, 2011

Review: The Sound of No Hands Clapping: A Memoir, by Toby Young

Cambridge, MA: Da Capo Press, 2006, pp. 274.

Toby Young's second memoir (the first of course being How to Lose Friends and Alienate People) picks up after Toby's first book has made it into print. It happens to catch the eye of a Hollywood producer, who offers him a crack at adapting a biography of a self-destructive '70s-era record producer into a screenplay for a movie he wants to make. (The producer, identified to the reader only as the anonymous "Mr. Hollywood," is operating on the theory that if Toby succeeded in making himself seem likable in How to Lose Friends, then he ought to be able to do the same with the life story of the man in the biography.) Meanwhile a different set of producers comes around to talk about making his first book into a movie. Once again, he's in America going after the Big Time, albeit on the West Coast rather than the East.

It would be a mistake to approach this book as Toby's Big Hollywood Adventure, however. No Hands Clapping is really a chronicle of his life after the events of the previous book, with his screenwriting experiences just the connecting thread running through the narrative. Along with the work's comparative looseness, it doesn't help that the freshness of the reader's first encounter with him is gone, or that he's lost some of his edge.

Toby still has ambitions, but the fire's not as hot as when he hopped across the Atlantic to work for Graydon Carter. The tensions and conflicts that did so much to make his first book interesting to me--the contrast between his intelligence and education and his starstruck hunger for glamour, his frustrated lust for the good life, the clash between his rather shallow goals and his parents' accomplishments and values--are far less evident here, having largely run their course the last time around. Toby's settling down at the end of the last book is a significant part of that, and as might be guessed, his life as a whipped (but mostly content) boyfriend, husband and father coping with domestic and mid-life crises is rather less entertaining than his earlier laddishness. He's still quite good at "losing friends and alienating people," but it starts to feel like a role he's enjoying.

The result is that this all seems more like an anticlimax to the history that made him an "icon of defeat" than a fully satisfactory follow-up. Still, I found chunks of the book almost as funny as How to Lose Friends, and while perhaps less insightful than the previous volume of his memoirs, it is still peppered with memorable observations by Young and the other "characters" in his story. Many of the best relate to his newfound domesticity, and as might be expected, to the unbelievably bloated, shambling workings of what he learns to call "The Business" of film and television (admittedly, an even more overexposed subject than the glossy, Condé Nast-style magazine). One of my favorites is his observation that it takes only thirty seconds for the aspirants who make it to stardom to move from the view that it's all a "crapshoot" to believing that it's "talent" and talent alone that put them at the top of the heap--a reminder that, contrary to the conventional wisdom of a society which chooses to see economic history as nothing but Horatio Alger stories and Edisonades, one's level of success tends to be inversely correlated with one's understanding of how the System one inhabits really works.

However, the most provocative aspect of the book is unintended, namely what it suggests about our rhetoric of failure. Toby had chances of a kind others can scarcely dream of, thanks to his father's prominence, his familial and personal connections, and some astonishing strokes of luck, like those phone calls which started his adventures in both New York and Hollywood. (You can't be "fired from virtually every paper on Fleet Street" without being hired by them first, after all.)

By contrast, most of those who chase dreams of stardom and come to think of themselves as failures never had anything like those chances. Instead they waited for breaks that never came, their failure definable in negative terms as a lack of success--one more reflective of the way the odds work against everyone, and of the closed-off character of the businesses they try to enter, than of their having been given a proper chance and blown it, as Young did again and again. This made his failures genuine failures in a way that theirs were not, while ironically putting him in a position to make a business out of telling the story of his woes--a business that has given him something not unlike the stardom he sought in the first place, a far cozier and rosier position than is enjoyed by most of those who would call themselves "successful."

Alas, I suspect few readers will really appreciate that distinction, without which this story would not have been possible.

I suspect the author of No Hands Clapping doesn't appreciate it either.

Tuesday, February 15, 2011

Returning to Sahara: The Dirk Pitt Novels On Screen

Clive Cussler's Dirk Pitt series has had an exceptionally unfortunate history on the big screen, each of the two attempts to film one of his novels--1980's Raise the Titanic and 2005's Sahara--ending up a notoriously expensive critical and commercial flop.1 Raise the Titanic has been blamed for killing ITC Entertainment, while Sahara resulted in a loss of over a hundred million dollars for the producer, as well as a spectacular legal battle that, as far as I can tell, has continued to this day.2 The latter is strongly connected with the more active role Cussler was given as part of the terms on which the movie was made--terms he claimed the producers never honored.

Looking back, it seems that there was never any question of the script closely following the book. Sahara's images of a Third World country where everyone is either a villain or an anonymous victim, its scenes in which Africans turn cannibal and attack foreign tourists, in which Pitt threatens to bury a Malian antagonist with bacon in his mouth, its climax in which a handful of Western heroes hole up in an old colonial fort and fight off a siege by vastly more numerous native soldiers they kill by the hundred, before their rescue by the cavalry (literally, a U.S. Army cavalry unit)--one doesn't have to be trained in post-linguistic-turn literary theory, or be more than ordinarily given to approaching popular fiction as "cultural text," to see that these elements could have been problematic for a twenty-first century audience (a point highlighted by the film's producers when discussing Cussler's input on the scripts).3

The plotline about a United Nations commando team reflected early '90s anticipations that the United Nations would be a more powerful, independent entity after the Cold War's close--anticipations that have long since become passé. The heavy weaponry on the Calliope probably seemed a bit over the top, like something out of pre-reboot James Bond--and perhaps a bit pricier than the producers wanted to go--much like the siege at the end (an idea that had already been used quite heavily in movies during those years, as with the Lord of the Rings and Matrix series, though frankly it would have made for a good set piece). Ditto for the plotline about the end of Abraham Lincoln's life (the principal bit of historical-archaeological interest the novel had).

All that makes Sahara seem an unlikely candidate for adaptation, but this is the sort of thing that only seems obvious when one has actually read the material. The more casual glance that likely decided the issue simply noted that the Dirk Pitt novels were a bestselling series of globe-trotting action thrillers, a natural enough object of Hollywood's interest, especially with the surprising success of the Jason Bourne series--the film versions of which also jettisoned most of the stuff of the books--apparently encouraging the tendency to seize on such work.4 And there certainly were some reasons to think the Pitt novels would be worth a shot. The fast-paced plots and rapid-fire action of Cussler's novels are very cinematic in feel. The element of historical mystery that is a prominent feature of the series probably looked like a significant plus at the time. (After all, these were the years when Dan Brown became a full-blown pop cultural phenomenon, and the National Treasure films became the biggest success of Nicolas Cage's career.) And Sahara, which certainly had these two traits going for it, likely seemed easier to adapt than some of Cussler's other books. Its plot is a bit more grounded than lost continent tales like Atlantis Found (2000), for instance. It doesn't require nearly so much updating as the apartheid-era Vixen 03 (1978) or Cold War intrigues like Deep Six (1984) and Cyclops (1986). And for all the political difficulties mentioned above (perhaps more obvious when one starts thinking seriously about the conversion from page to screen), adapting it remained less awkward than Night Probe! (1981), with its plot about the United States struggling with Britain for possession of Canada, or the Sinophobia- and xenophobia-laden storyline of Flood Tide (1998).

And so it seemed like a good idea at the time--just as bad ideas usually do when people get it into their heads to act on them.

1. Those who have not seen the film might want to check out a fair review of it at the Den of Geek, written retrospectively just last year.
2. The film has in fact been taken as an object lesson in Hollywood's mismanagement of large budgets--the production budget doubling from $80 to $160 million (in part, because of script problems), and the distribution costs coming to a preposterous $81 million more. The Los Angeles Times published a special report on the matter including exceptionally detailed figures for expenses (specifying everything from the $102,884 spent on walkie-talkies, to the $48,893 Matthew McConaughey's personal chef received in compensation--more than Rainn Wilson got for playing Rudi Gunn--to bribes to various Moroccan officials).
3. Alas, such elements are common throughout Cussler's work, perhaps more so than in most thriller fiction--as with the Japanophobia of Dragon (1990) or the treatment of immigrants as instruments of a Chinese plot for the conquest of America in Flood Tide (1998)--but his books still seem like a font of citizen-of-the-world cosmopolitanism next to the writings of John Ringo, Thomas Kratman and a good many others who have come to prominence in the past decade. Say what you will, this is certainly not a case of "now we know better."
4. The Bourne Identity (2002) dropped the plotline about 1970s-era international terrorism and the hunt for Carlos the Jackal--both long passé by that point, with that era's groups all but vanished and Carlos himself sitting in a French prison. (Since the third book, The Bourne Ultimatum, was "round two" for the Bourne-Carlos fight, the 2007 film also used a plot developed from scratch.) The element of jet-set sophistication, and the edgier aspects of Jason and Marie's relationship (like his kidnapping her at gunpoint), were similarly dropped. The result, in my view anyway, was pretty thin stuff, since the writers didn't really bother to replace what they removed.

Thursday, February 10, 2011

Reflections on the Dirk Pitt Series

I first encountered Clive Cussler circa 1993 through Sahara, which was then his latest novel. I went right through it, subsequently tracked down every novel Cussler had published before it, and went on to follow his career closely for a good many years afterward. I enjoyed virtually all of his books, a good many of the earlier ones nearly as much as Sahara (especially 1986's Cyclops and 1988's Treasure).

With their big, exciting plots encompassing wild historical theories and high-stakes geopolitical games, their abundance of gimmicks (like cool high-tech toys), their plenitude of over-the-top, cinematic action, their unflappable James Bondian protagonist and colorful James Bondian villains, and their sense of humor, I found Cussler's novels a lot of fun--and also, a model for the kind of story I was then aspiring to write. (Matthew Reilly would be an even closer fit in many respects, but he hadn't even published his first book yet.)

The inelegant, often cliché-ridden prose, the one-dimensional characters--these things only started mattering to me later on, by which time the author's tendency to repeat himself to diminishing returns would have been enough to reduce my interest, even at a less discriminating point in my history as a reader. Of course, this may have been inevitable. As of this writing, Dirk Pitt's adventures have been in print for thirty-seven years, and despite the use of a rather large bag of tricks, Cussler's hero has aged enough that Pitt's grown-up kids now get in on the action. Far from letting go gracefully, or at least taking a break every now and then, Cussler kept the books coming despite his declining enthusiasm for them (Cussler himself confessing to boredom with his creation in at least one interview).

Indeed, Cussler stepped up production sharply--through a turn to literary sharecropping on a massive scale.1 Where action thriller writers are concerned, only Tom Clancy compares with Cussler's prolific involvement in "co-authored" books to which his principal contribution seems to be his brand name. This includes not only every Dirk Pitt novel to follow 2001's Valhalla Rising (the byline of which is shared with his son Dirk), but the NUMA Files series cowritten with Paul Kemprecos, the Oregon Files novels written with Craig Dirgo and Jack DuBrul, the Isaac Bell novels (cowritten with Justin Scott from the second book on) and the Fargo Adventures written with Grant Blackwood (who also "co-authored" Clancy's latest, Dead or Alive)--some twenty-five novels in five series, a considerable output even before counting his forays into nonfiction (like the two-book Sea Hunters series, also cowritten with Dirgo).2 At this point co-authored books account not only for virtually all of Cussler's output in the past decade, but for a majority of the books he has published in his four-decade career (27 of 46 books of all sorts).3

That others keep on buying the books, being disappointed, complaining online and then buying the next book, again and again and again (enough to keep Cussler on the bestseller lists), astonishes me. Rather than inspiring nostalgia, the whole phenomenon strikes me as a depressing sign of the times.

NOTES
1. From the 1970s through the first half of the 1990s, Cussler published roughly one novel every two years (twelve from 1973 to 1994), where now two to three new books appear bearing the Cussler name annually--from about half a book a year to two and a half, a staggering rise of some 400 percent.
2. As might be expected, I found the 1999 NUMA Files novel Serpent--the first of the coauthored novels--lackluster, and while I still read Atlantis Found (2000) and Valhalla Rising, 2003's Trojan Odyssey was simply unreadable, and I haven't bothered to return to the Cussler brand since.
3. The release of two more co-authored books (the Oregon Files novel The Jungle, the Fargo Adventure The Kingdom) has already been announced for this year, further increasing their share of the total (to 29 of 48). This should be considered a low estimate, however, given that not all "co-authored" books are announced as such.

Tuesday, February 1, 2011

Why I Can't Stand The Big Bang Theory

Last summer I watched a handful of episodes of The Big Bang Theory all the way through, then gave up on the show entirely, not just irritated by what I watched, but astonished by the show's success (at the time, anyway).

I might as well start with the characters. As anyone who has seen an episode knows, the four scientists at the heart of the story--Leonard Hofstadter (Johnny Galecki), Sheldon Cooper (Jim Parsons), Howard Wolowitz (Simon Helberg) and Rajesh Koothrappali (Kunal Nayyar)--are not just stereotypes, but very, very annoying. Of course, characters with annoying quirks can be engaging. Monk's Adrian Monk is a perfect example, his combination of strengths and weaknesses making him compelling and endowing him with a genuine humanity, as well as affording the writers abundant inspiration for comedic material. Jim Parsons's Emmy-nominated Sheldon, by contrast, never does anything more than grate, and the same goes for his friends (to varying degrees).

Of course, this is fairly standard. Treatments of "geeks" in American film and television tend to feel like they were written by the kinds of people who beat up geeks when they were growing up (if they ever did grow up). They are caricatures, and generally not good ones. Good caricature begins with an apprehension of reality, while pop culture depictions of everyone from the gifted child to the veteran scientist are caricatures of caricatures tending toward the grotesque. At the same time, the casting has often left me with the impression that the roles exist mainly to provide work to well-connected but awkward, uncharismatic (and frankly, unattractive) actors.

Big Bang is no exception to the pattern, either in the writing or the acting, all too predictably reusing, recycling and running right into the ground all the hoariest geek stereotypes and clichés--stuff that was tired in seventh grade--starting with the field of study all four of the scientists have in common, physics. After all, the greenest Hollywood hack knows that if you want to overawe an unsophisticated viewer, just throw in the word "physicist," preferably with the prefix "astro" or the word "quantum" in front of it. The average viewer may not be able to explain exactly what physics deals with, may believe that the Theory of Relativity is a moral stance and think Albert Einstein's scientific contribution was his personal invention of the atomic bomb, but they realize that physics is "hard," especially because it "has math in it," and are therefore intimidated by anyone who can deal with it, so that where sheer intellectual one-upmanship is concerned, everything else is regarded as second-rate.1

Of course, pandering to widely held stupidities has never been an obstacle to popular success, and certainly there's always been an audience happy to laugh at nerds. (Remember, this show laughs at them, not "with" them.) Intellectuals like scientists may be an "elite" of sorts, but when push comes to shove, they are a rather powerless group, making them easy marks for everyone from comedians to demagogues. (As Morris Berman asked in his brilliant book The Twilight of American Culture, "Can you imagine, in this country, a TV program along the lines of Cheers that ridiculed wealth instead of intelligence?") With anti-intellectualism running particularly hot in the last decade (as it usually does when national politics takes such turns), it may be no surprise that CBS scored a hit with a show founded on this kind of humor. And while I've never been a particular fan of Chuck Lorre's work (Cybill, Dharma & Greg, Two and a Half Men), there does seem to be a sizable audience which clearly enjoys it.

This being the case, what really surprises me is the critical acclaim the show enjoys, and in particular the enthusiasm from the proudly self-described geeks one would expect to see right through it, to recognize that it isn't the smart or geek-friendly viewing it has been taken for. I suppose that part of it may be that instead of being relegated to supporting roles, the geeks are the core of the cast this time, and like members of a marginalized minority group they're simply excited to see such a representation of themselves on screen, even when it's an unkind one. Part of it, too, seems to be that the show's writers are relatively literate when it comes to both the science and the pop culture (indeed, I imagine many a joke goes over the average viewer's head), which is rare enough that some will regard it as an oasis of relatively intelligent viewing for that reason alone.

I say relatively. For nuanced geek-culture comedy I generally find myself having to look across the ocean--to Britain for Spaced, or better still, to Japan for manga and anime like Genshiken (the first series, at any rate; I didn't much care for the second) or Welcome to the N.H.K. ("masterpiece" is the word that comes to mind). As it happens, they largely dispense with the scientists, but that's partly the point: there is geek culture outside physics and computer science (as club president Madarame explains about himself and his cohorts, "We're not techie otaku . . ."), and anyway, The Big Bang Theory, despite its title, isn't really about physics or physicists, even to the modest extent that the crime show Numb3rs is about mathematics and mathematicians. And these other shows are much, much better stuff than anything we're likely to get out of American network television.

NOTES
1. There seems to be a widespread and deep-rooted view that there is a hierarchy of intellectual endeavor, with the physical sciences (of which physics is queen) and related engineering specialties (especially those including the words "rocket," "nuclear" or "computer") on top, ahead of the life sciences (though the word "neuro" has a cachet comparable to "astro" and "quantum" in physics; "molecular" is good too), the life sciences ahead of the social ones, and the humanities at the bottom--a view that the show routinely acknowledges (and if anything reinforces, whether intentionally or unintentionally). A noteworthy example: in the episode "The Bad Fish Paradigm," Sheldon insists to Penny that a former girlfriend of Leonard's who possessed a Ph.D. in French Literature is not a "brainiac" because "for one thing she was French, and for another it was literature."

Tuesday, January 11, 2011

Review: Wizardry & Wild Romance: A Study of Epic Fantasy, by Michael Moorcock

Austin, TX: MonkeyBrain, Inc., 2004, pp. 206.

I first encountered Michael Moorcock's classic study of epic fantasy many years ago, when I had only a faint idea that he was a Big Name in speculative fiction (picked up mainly from my readings in Brian Aldiss and David Wingrove's Trillion Year Spree), and not much idea as to why. I'd tried to read some of his Jerry Cornelius stories, which didn't work for me. (I wasn't a fan of the cut-up technique, and still am not.) At the same time, I'd read relatively little fantasy, and had some hope the book would help me get a handle on the bigger picture. I didn't find it very useful to that end, and didn't do much more than skim it before setting it aside.

The fact that I didn't find the book of much interest then doesn't say anything about its worth, but it does say a great deal about its conception, as I realized when rereading it later on. It was never intended as an introduction to the genre, the kind of thing someone in a hurry would read to get a clear image of the field. Moorcock does not offer a history of the genre, or a survey, though there is plenty of history, and quite a bit of breadth in the examination. Nor is the book unified by an overarching argument or concern. Instead Moorcock is sounding off on a selection of themes in a collection of chapter-long essays covering, respectively, the genre's origins (chapter one); the use of landscape in epic fantasy fiction (chapter two); the depiction of heroes and heroines (chapter three); the presence of wit and humor, and the lack thereof (chapter four); the escapist aspect of much of the genre, as well as the political implications of this kind of writing (chapter five, which is essentially a reprint of his essay "Epic Pooh," also the chapter's title); and finally, the evolution of the genre's expressions across the media spectrum (chapter six, "Excursions and Developments").

As might be expected, the breadth and depth of Moorcock's reading, and his range of reference, are staggering. While enriching these discussions, this can also be a bit of an obstacle after the first chapter, since he makes little effort to meet a less widely read reader halfway in the later parts of the discussion. As China Mieville acknowledges in the introduction to the new edition,
Reading Michael Moorcock's history of literary fantasy is like walking an immense, brilliantly stocked library, through which you don't know the way, following a librarian who walks briskly, nodding and pointing at various books as he goes.
"Very interesting that one," he says, and "Ah, some good things in there," and you're desperate to stop and examine the volumes . . . but your custodian walks too quickly, and you can only stare back at them as you run to keep up (11).
The result is that Wizardry & Wild Romance is best approached as a polemic (or a collection of polemics) addressed by one expert to other experts. It is never impenetrable in the way that academic literary criticism so often can be--Moorcock's writing is far too lucid for that--but as Mieville notes, the fleetness with which it moves among a great many items can be frustrating.

Readers should also keep in mind that this study was first drafted in the 1970s, and even the latest edition strongly reflects that. Despite some revisions to the main text, there are no references to any fantasy fiction actually written after 1985. (A reader hoping to see something about Robert Jordan or J.K. Rowling, for instance, will come away disappointed.) At the same time, many of the authors he references are relatively obscure now. (In fact, I had never heard of most of the writers he praises most highly until I picked up this book.) Additionally, the moment colors his outlook in significant ways. As historian Nick Tiratsoo notes, this was a period of hysteria in Britain about the "respectable classes losing control," with much hyperbole about labor militancy, uppity young people with "common accents," and other sorts of things I first gleaned from reruns of Are You Being Served? and The Fall and Rise of Reginald Perrin a very long time ago--and Moorcock's opposition to that kind of reactionary thinking is strongly present, be it in his suspicion of rural nostalgia and escapism, or other sorts of "comfort food for the comfortable."1 (Indeed, he can be particularly harsh regarding the impulses and attitudes he sees under the surface of the wishes the genre so often speaks to--especially those for a "simpler" world in which a young man armed with gritty determination and cold steel can win himself an empire, or our heroes venture off to kill villains we are not even meant to try and understand in order to restore an "idyllic" status quo.)

Still, even at its most dated and partisan the book succeeds in both entertaining and informing. Wizardry offers plenty of fascinating background. It points the reader to a great many authors, most of whom they would be unlikely to encounter elsewhere, and Moorcock's eye for their strengths and weaknesses, quirks and accomplishments, is uncommonly sharp. The book is also packed with insights well worth reading even three decades on, from his views on the transient appeal of so many particular works of fantastic literature, to the role a well-crafted landscape can play, to the underappreciation and frequent misunderstanding of comedy's place in the genre. (Even the political commentary is still relevant, given that the conservative resurgence of the 1970s pretty much set the world on the political course it has followed these last several decades.)

Even when I didn't agree with the author's arguments or his feelings--after all these years I'm still more sympathetic to escapism and comfort food than he is, rather less enthusiastic about Modernist and Postmodernist approaches to literature, and think what people tend to call "grown-up" fiction is overrated, to name but a few points--I always understood the reasoning behind them. Indeed, it all made me wish Moorcock had penned a proper follow-up. Naturally, anyone with a serious intellectual curiosity about the genre, or simply looking to broaden their sense of what epic fantasy is and can be, should make a place for it on their reading list--though those brand new to the field might put off getting to it until they've acquired more familiarity with the subject.

NOTES
1. Interestingly, Tiratsoo noted that those dismayed by the decade's trends frequently expressed those feelings in a nostalgia for the eighteenth century; imaginative flight to romanticized, pre-industrial, rural spaces; and less pleasantly, "retribution," violent and otherwise--with obvious implications for the ways in which fantasy fiction was being received. Nick Tiratsoo, "'You've Never Had It So Bad': Britain in the 1970s." In Tiratsoo, ed., From Blitz to Blair: A New History of Britain Since 1939 (London: Weidenfeld & Nicolson, 1997), pp. 187-188. Those looking for a bit of quick background about both the genuine difficulties and overblown alarmism of the period can check out this article from the September 30, 1974 issue of Time magazine, melodramatically titled "Will Democracy Survive?"

Friday, January 7, 2011

"Was 2010 the Worst Year for Movies Ever?"

Continuing in the vein of "2010 in review," Moviefone's Gary Susman offers his take on the year's films, and it is far from celebratory. Of course, measured in dollars and cents, this wasn't a bad year at all for Hollywood, which took in over $10 billion in the North American market. Still, the figures reflect rising ticket prices and the high take of 2009's Avatar, which actually pulled in most of its money in the early months of 2010. This compensated for a rather long string of disappointments. They were especially concentrated (and commented upon) in the early part of the summer season, but after that the flops never stopped coming.

Looking for an explanation, Susman points to the plethora of reboots, remakes and sequels; "warmed-over" '80s nostalgia; stalled star vehicles; failed new franchises; and the rush toward 3-D movie experiences.

Of course, except for 3-D, none of this may seem to represent a significant change from the norm in recent decades. Stars have never been bulletproof, just about always enduring flops as well as appearing in hits, and at any rate, most of the talk about actors' bankability confuses correlation with causation. Nor is the crashing and burning of a would-be franchise a new development; that problem too would seem to be as old as the idea of a film series. Indeed, these points hardly seem to be worth discussing. Besides, reboots, remakes, sequels, prequels and the like have been mainstays of the box office ever since there was a box office. (In 1983, for instance, eight of the top twenty movies fell into this category.1) And before warmed-over '80s nostalgia, there was warmed-over '50s, '60s and '70s nostalgia. (Remember the #1 hit of 1985, Back to the Future, and the career trajectory of Oliver Stone in that decade? The big-screen spin-offs of shows like The Fugitive, The Flintstones and Mission: Impossible, the return of John Travolta and Pam Grier to starring roles, the comeback of the martial arts movie in the '90s?)

Still, an objective look at the numbers does show an increased propensity for retreads over the last decade. In the 1980s and 1990s, the average was closer to three or four of the top twenty grossers in a given year (3.9 for 1980-1989 and 3.7 for 1990-1999, respectively). From 2000 to 2009 the average was twice that high, with seven to eight (7.5) of each year's top twenty commonly falling into the category. By my count, six of the top ten and nine of the top twenty movies of 2010 are retreads (while the category accounts for at least twenty-one of the top one hundred).2

Additionally, as heavily as the 1990s mined the 1970s, the last few years may have worked the '80s still more aggressively, as the propensity for retreads suggests. (Where Travolta made his comeback in new movies with little relation to his earlier screen image, Stallone paved the way for his by reviving his '80s-era franchises--Rocky and Rambo--with himself in the starring role, and finally The Expendables, an "homage" to exactly the kind of action movie Rambo epitomized. And certainly the '90s had no equivalent to the success of the Transformers franchise.)

In short, while these approaches are far from new, Hollywood has relied on them even more heavily than before.

That said, after a string of flops like this year's, it's traditional for observers to predict big changes in how Hollywood will do things--changes that will see the Suits concede creative freedom to the Artists--and Susman is true to that tradition. Still, the pipeline movies must pass through is a long one, and even if the system were to do a one hundred and eighty degree turn today in its green-lighting of new projects, it would be 2013 before a full year's slate of movies might look substantially different. Besides, the truth is that, with inflated expectations the norm (even a $400 million global gross is "low" enough to stall or even kill a franchise), disappointment is a way of life, and it takes more than a few bumps in the road to shift the characteristic operational style of the bloated, bureaucratized studios. The causes of the preoccupation with retreads, spin-offs and the like have very deep roots, not least in
the ever-bigger gamble involved in gigantic and still-growing budgets, shortening theatrical runs, ever-more fickle attendance at theaters, and the ever-louder pop cultural cacophony which a project needs to get above to be seen or heard, something easier to do with an already-established IP.
It would also be a very great mistake to underrate the intrinsic appeal of "high concept" for a company run the way the studios are, or of the added control over the creative process that comes with assigning someone to work on a studio-owned IP, compared with the challenges involved in dealing with a new artist bearing a new concept. Likewise, it would be a mistake to overlook Hollywood's global orientation, which helps drive its present tendencies. The mid-budget dramas and comedies in which Susman sees renewed studio interest tend not to travel as well abroad as the glossy, spectacle-heavy blockbusters that remain Hollywood's strength, and the foreign grosses on those make a lot of difference. (Even a mediocre performer like Prince of Persia nearly quadrupled its income with the help of those receipts, bumping its $90 million take in the U.S. to a $335 million global total--which leaves a sequel highly unlikely, but is certainly enough to make the nine figures laid out for the budget tolerable, especially when DVD sales, broadcast rights and the like are added in later. By contrast, the ballyhooed The Social Network did little more than match its domestic earnings.)

My guess is that the Suits are far more likely to dig in their heels and push the product they are most comfortable with all the harder, the critics be damned. Accordingly, it seems far more likely to me that there will be a convergence between (somewhat) chastened expectations and some lucky strike that will renew Hollywood's always incredible self-satisfaction . . . until the next run of disappointments, the inevitable talk of the studios changing their ways, and the repetition of the whole stupid pattern all over again.

In short, if there's going to be change, it will probably be in the direction of still more retreads. Bet on more nostalgia, too, though my guess is that the passion for the '80s will give way to one for the '90s any year now.

Remembering the '90s far better than I do the '80s, I'm already aghast at the thought.

NOTES
1. In 1983, Return of the Jedi was #1, the Bond movies Octopussy #6 and Never Say Never Again #14, the Dirty Harry movie Sudden Impact #7, the Saturday Night Fever sequel Staying Alive #8, Superman III #12, Jaws 3-D #15, the remade Scarface #16, and Psycho II #20. The data used in this article all comes from the yearly listings of the Box Office Mojo web site.
2. The count goes even higher if one includes remakes of films made overseas, like Edge of Darkness, Death at a Funeral and Dinner for Schmucks, as well as the new versions of previously filmed stories like Alice in Wonderland and Robin Hood.

Thursday, December 16, 2010

Review: Super Sad True Love Story: A Novel, by Gary Shteyngart

New York: Random House, 2010, pp. 334.

The protagonist of Gary Shteyngart's bestselling novel Super Sad True Love Story is Leonard Abramov, the Queens-born son of Jewish Russian immigrants from what was then Leningrad. At this point in his life, Lenny is middle-aged, unhandsome, awkward, neurotic and hopelessly out of tune with the times. Only his possession of a relatively good job (he's a Life Lovers Outreach Coordinator for the Post-Human Services division of the Staatling-Wapachung Corporation, essentially a salesman hawking life extension to rich customers), and his friendships with a few somewhat more favored people, save him from being written off as a total loser.

At the start of the novel Lenny is winding up an unprofitable year in Europe and beginning a romance with Eunice Park, a young Korean-American woman he meets in Rome, a recent college graduate being prodded toward law school by her immigrant parents. Eunice is confused and at times floundering, and her sense of their relationship is irreconcilable with that of an older lover desperately trying to retain a tenuous hold on her affections.

And making their relationship more tenuous still, external obstacles to their being happy together, going far beyond the usual, very quickly start to loom large: their relationship unfolds in a near-future United States which, under the leadership of the Bipartisan Party (and the de facto dictatorship of neoconservative Defense Secretary Rubinstein), is fighting a war in Venezuela and on the verge of economic collapse. Indeed, the situation is already so severe that even the near-annihilation of civil liberties fails to stop a mounting tide of internal disorder.

These are not the most original of characters, and their relationship may not be the most original of situations. There is much about them that will annoy a good many readers by the end of the story (especially if the reader has ever had to deal with people like them). Yet, they both rang true for me, and held my interest throughout.

Additionally, and more surprisingly, I was genuinely impressed with the world-building, which was extensive, innovative, often zany and smoothly integrated into the narrative. As might be expected from the premise, it contained much satirical caricature, but at the same time seemed eerily, depressingly plausible in its essentials. At its best it reads like Bruce Sterling's writing about the near future, but with his Davos Man libertarian-conservatism replaced by a critical take from the left.

Moreover, Shteyngart successfully interweaves the big picture with the personal tale of Lenny and Eunice, Big Events impinging on their all-too-familiar Little Story in ways large and small, at many points making what could easily have been cliché (not least, the treatment of the immigrant experience) something worthwhile. Especially significant, the book uncannily captures in Eunice the voice of the consumerist, texting-addicted young adult who can't get through a conversation without repeatedly saying "Whatever" (and can't get through a book, period). The generation gap between her cohort and their elders (sometimes, their elders by only a slender margin) is a difference of epochs, the dividing line between the eras of Johannes Gutenberg and Steve Jobs.

Shteyngart shows as well as tells this in his switches back and forth between Abramov's viewpoint, related in entries in an old-fashioned diary, and Eunice's exchanges of text messages with her family and friends, as well as in the couple's exchanges with each other. Lenny's old-fashioned bookishness is constantly a source of embarrassment and anxiety between the two, and at times even a wall. In one of the book's more memorable scenes, Lenny tries to read to her from Milan Kundera's The Unbearable Lightness of Being, hoping to share a book that meant a great deal to him, and the experience turns out to be, well, unbearable.

Fortunately, Shteyngart's novel never is. Despite its switches of viewpoint, its stylistic experimentation, and its density of concept and allusion, it is hugely readable, a quick read as well as a satisfying one. In fact, while I've often felt that the genre's opinion leaders are too quick to embrace well-established "mainstream" authors who try their hand at science fiction, I would be disappointed not to see Super Sad True Love Story get some recognition at awards time next year.

Tuesday, December 14, 2010

Reflections on Stieg Larsson's The Girl With the Dragon Tattoo

The success of Stieg Larsson's Millennium trilogy is exactly the kind of hit that makes me suspicious--a book showered with acclaim by critics who are none too clear about the reasons for that acclaim, an apparent pop cultural phenomenon which does not fit the accustomed pattern, nor appear to have broken that pattern by filling some undiscovered niche.1 The success of Larsson's books seemed all the more remarkable given that the U.S. is such a weak market for translations (or even for stories about other lands), and there has certainly been nothing to indicate a special openness to Swedish imports in particular--neither an upsurge of interest in Swedish culture as such, nor a sudden preoccupation with Sweden in foreign affairs.

Naturally, I looked into the issue, reading a few reviews. Every so often someone mentions the interest of Lisbeth Salander, Watson to protagonist Mikael Blomkvist's Holmes, as many a critic has put it (though frankly I think it's the other way around). However, she struck me as a bag of clichés--the pierced, tattooed rebel-punk hacker (which decade is this?), the brilliant detective whose symptoms of autism are the basis of her talent as well as her weakness (meet Gil Grissom, Temperance Brennan, Adrian Monk, and that's just on television), the Bohemian freelance investigator (I did say she was Holmes to Blomkvist's Watson, didn't I?).

Eventually, I gave in and read the first book (The Girl With the Dragon Tattoo) for myself, in the hope of answering the question of the books' appeal.

The book certainly has its limits from a commercial perspective. It's marketed as a thriller, but while there's certainly a "story of detection," involving dark deeds and an element of danger, the Millennium trilogy's first installment is no adrenaline novel. The investigation is very slow to get going, the very skeptical hero expecting to find nothing for over a hundred pages after taking the job, and sure enough not getting a handle on even the first clue until the midpoint of the 465-page narrative. The story's short on action, too. (Blomkvist doesn't even get shot at until after page three hundred.) When he's unmasked, the killer is no Hannibal Lecter but a stock figure as banal as his crimes are hideous, his personal share of the family's very heavy baggage as ugly and awful as one might expect given the circumstances, but hardly anything readers of the genre would not have seen before--something that can be said of just about anything else in the story.

Admittedly, Larsson shows some innovation, and reasonable skill, in tying the pursuit of the serial killer together with a sweeping family history and a financial crime story, one developed with a fair bit of sophistication (though I felt there was an element of wish-fulfillment in the conclusion). In a few instances there are touches many a reader likely finds compelling in a retro way. The element of dynastic epic we get here, the protagonist who bed-hops adroitly and guiltlessly--this is stuff we got a lot more of in the '70s. (Indeed, Daniel Craig will probably bed more women in this film franchise than he did as 007.) Perhaps reflecting Larsson's own experiences, there is an edge to the predicaments and compromises Blomkvist faces as a journalist. Salander also turned out to be a more engaging creation than I expected. Many authors can't resist the temptation to turn a character like this one into a Mary Sue (as with the aforementioned, extremely tiresome Temperance Brennan), but Salander's combination of strengths and vulnerabilities, the marks left by her condition and society's incomprehension of it, make her more complex and interesting than that, the novel's characterization of her both sharp and sensitive. (Most of the time, anyway. There is a point late in the book when Salander's bag of tricks suddenly seems implausibly large.) The treatment of Salander's hacking is also a cut above the commonplace depictions of the activity as a black nerd-magic useful for moving the plot over and around any obstacle. Finally, while Larsson's is not the most visceral or swiftly paced writing, the narrative flows smoothly enough. Put another way, he knows how to keep readers turning the pages.

Where marketing the book in the U.S. is concerned, the element of financial crime may be a bit more intriguing to the general readership amid an economic crisis which has seen plenty of it, and, as Charles McGrath noted in a lengthy article in the New York Times, the books introduce American readers to
a Sweden that is vastly different from the bleak, repressed, guilt-ridden images we see in Ingmar Bergman movies and from the design-loving Socialist paradise we imagine whenever we visit Ikea . . . [which is instead] a country that turns out to be a lot like our own.
The fact that so much of the story seems familiar, particularly what is unpleasant in it, may be exactly the point. There's probably no small amount of Schadenfreude in that, given the way American attitudes toward Europe echo the country's own culture wars (awfully ironic given Larsson's own politics), but this moves product all the same.

Still, the book's strengths, exoticisms and timing notwithstanding, I can't help but feel that the heights it has attained in the United States (and worldwide) represent a triumph of marketing more than anything else. Ultimately, the book is a bestseller because it is a bestseller (as it was in Europe before reaching the U.S.). In any event, the situation reverts to normal when one considers that Hollywood is going for a remake rather than a theatrical release of the Swedish films already made out of the books, despite this rather low-key material's questionable appropriateness to a high-concept production. Indeed, the questionable stylistic fit of the flashy David Fincher to Larsson's writing (so that I actually wonder if he hasn't been brought in specifically to add an element of flash); Daniel Craig's uncertain record in selling major releases outside the Bond franchise; Hollywood's well-known profligacy with budgets and the grosses studios are forced to expect to justify that profligacy (which make even a $400 million take a potentially franchise-ending disappointment in many cases); and the tendency toward diminishing returns on sequels and remakes (keep in mind that much of the intended audience will have seen the Swedish film versions first) all make this an unlikely franchise.

NOTES
1. Patrick Anderson of the Washington Post does better than most, but likewise falls short of answering my questions.

Friday, October 8, 2010

On the New York Times Bestseller List . . .

On last Sunday's New York Times hardcover fiction bestseller list, Jonathan Franzen, Nicholas Sparks and Stieg Larsson are all on top, but mystery writer Janet Evanovich's urban fantasy Wicked Appetite is at #6, while Guillermo Del Toro and Chuck Hogan's The Fall (book two of their Strain cycle) is at #8. There are, however, quite a few other authors using milder speculative elements in their fiction, Ted Bell's Warlord and Clive Cussler and Grant Blackwood's Lost Empire being at #13 and #14 respectively. If one stretches the definition of speculative fiction that much more, there's also Sara Gruen's story of missing bonobos who turn up on a reality show, Ape House, currently at #15.

The list of speculative-themed works lengthens considerably when one looks at the extended NYT list, where paranormal romance is evident, with Sherrilyn Kenyon's No Mercy at #16 and Christine Feehan's Dark Peril at #32; still more urban fantasy from mystery writers who started out as "mundanes," with Charlaine Harris's Dead in the Family at #23; epic fantasy in Terry Brooks's Bearer of the Black Staff and Brandon Sanderson's The Way of Kings, at #25 and #29; more idiosyncratic, slipstream-ish work like William Gibson's "post-science fiction" novel Zero History at #17 and W. Bruce Cameron's story told from a dog's point of view, A Dog's Purpose, at #28; and finally, S.M. Stirling's latest entry in his "Emberverse" post-apocalyptic military adventure series, The High King of Montival, at #31.

Once again, it's validation for the arguments that fantasy, the paranormal and what might be termed "slipstream" are more popular than science fiction more narrowly defined; that books and authors incorporating just a little of the stuff into their stories (e.g., contemporary urban fantasy) have an easier time reaching big audiences than work which uses more fully speculative contexts (like epic fantasy or space opera); and that the big names, by and large, remain old names (including quite a few 1970s-vintage names), both those which are more (Brooks, Gibson, Stirling) or less (Cussler, Evanovich) closely associated with science fiction and fantasy.

Tuesday, September 14, 2010

Give the Superheroes a Rest?

Alan Stanley Blair recently penned a piece at Airlock Alpha on the boom in movies (and television) based on superhero comics titled "Enough With The Comic Books Already!: Comic Book Adaptations are Ruining Movie Theaters and Network Lineups"--which pretty much says it all about his position on the phenomenon.1

That boom, generally regarded as having begun in 2000, has already been more sustained, prolific, robust and qualitatively impressive than just about any other (like cyber-themed movies, or space-themed movies, or historical epics, or film versions of fantasy novels) I can think of in recent decades, with Bryan Singer's X-Men movies (2000, 2003), Sam Raimi's first two Spiderman movies (2002 and 2004) and Christopher Nolan's Batman movies (2005 and 2008) generally seen as leading the way.2 Even the movies regarded as comparative disappointments, like Mark Steven Johnson's Daredevil (2003) or Ang Lee's Hulk (2003), still frequently did big business (each breaking the $100 million mark at the North American box office), and reflected the newly sophisticated approach to the material. In their eagerness to capitalize on the trend the studios have gone far beyond the list of D.C. and Marvel's most venerable characters, and even newer classics like Alan Moore's V for Vendetta (2006) and Watchmen (2009), to pursue projects based on "B-grade" superheroes like Marvel's The Punisher (2004) and Ghost Rider (2007). And of course, there are original movies using similar elements, mostly parodies like The Incredibles (2004), Sky High (2005), My Super Ex-Girlfriend (2006), Zoom (2006), Super Capers (2009), Kick-Ass (2010), and the new Super (2010), though the blockbuster Hancock (2008) played its material fairly straight.3

After a decade the films are still attracting audiences and making money. The more profitable franchises are still chugging along with new movies in the pipeline, and the franchises that have disappointed are frequently continuing in their own way too, the studios apparently reluctant to give up on any such property. While there was little potential for a Daredevil sequel, Fox nonetheless produced a more modestly budgeted spin-off, Elektra (2005). In other cases the studios quickly proceeded to reboots of their misfires, like the 2003 Hulk and 2004 Punisher films--brand new, unconnected Hulk and Punisher films promptly appearing in 2008. The Incredible Hulk was a comparative success, but Punisher: War Zone flopped even worse than the first film, and may get rebooted a second time. Even more surprisingly, given that Spiderman 3 (2007) was still a huge commercial success, that franchise too is going back to the drawing board--as is the case with the next Superman film after the ambivalent reception of 2006's Superman Returns. Meanwhile, whole new series are being prepped for launch.

As things stand, Captain America, Green Lantern and Thor are already well on their way to the big screen. Other movies, like the long-delayed Wonder Woman film, are being developed. Some of these may never get out of "development hell," but others will no doubt make it to the screen. In all likelihood some of them will make big money, some will at least be fun, and a few may even offer more substantial entertainment than that. However, it seems unlikely they can revolutionize the genre at this point, and I suppose the freshness and the thrill have both faded. I've certainly enjoyed the boom--but like Mr. Blair I think the time has come to back off and give the superheroes a rest for a while.

NOTES
1. There has been far less effort to bring superheroes to the small screen, and much less success in the attempts (no doubt because the spectacle that is a large part of the appeal of the movies cannot be replicated within television's constraints, especially as the vileness that is reality TV threatens to swallow up everything else). Still, there are some noteworthy efforts, in particular Heroes (2006-2010), especially during its highly praised and widely watched first season (since which time interest withered until the show was canceled without a fuss earlier this year), and Smallville (2001-), which never commanded the kind of audience Heroes had or approached the pop cultural impact that show enjoyed at its height, but which is now entering its tenth season. And some interest continues, with the series The Cape and No Ordinary Family premiering this year.
2. It should be remembered that the films of the 2000s could also be seen as a continuation of the wave that Tim Burton's Batman got started, which included such successes as Teenage Mutant Ninja Turtles (1990), Dick Tracy (1990), The Crow (1994), Men in Black (1997) and Blade (1998), and all their associated sequels--one which never stopped, even if it was on the whole less productive or successful than the rush of the 2000s.
3. Movies based on comics and graphic novels not featuring superheroes have been rather rarer, though a couple of notable commercial successes came out of Frank Miller's work, namely Sin City (2005) and 300 (2007). There was also Alan Moore's From Hell (2001), the Alien vs. Predator franchise (2004 and 2007), and even some more "highbrow" fare like Road to Perdition (2002), American Splendor (2003) and A History of Violence (2005).
