Tuesday, July 9, 2019

The Delay of Bond 25

The latest word about "Bond 25" is that it will start hitting theaters in April 2020.

Assuming the franchise keeps to the schedule, this will be the first time in over three decades that a Bond movie has entered the summer box office fray.

More significant, however, is the fact that an April 2020 release date will mark the second longest gap between one Bond film and the next in the series' nearly six-decade history, exceeded only by the nearly six-and-a-half-year gap between Licence to Kill and GoldenEye--a function not just of the production headaches that beset this, like every other long-running, big-money franchise, but of, among other things (not least, the failing health of producer Albert R. Broccoli), a long trend of decline culminating in the particularly weak box office performance of Licence to Kill; and the end of the Cold War, with all it implied for the spy genre.

Nothing really comparable is operative this time around. (Spectre was considered a letdown, but with nearly $900 million banked it was still one of the more successful installments in the franchise's history, even after adjustment for inflation, while comparison with the preceding film Skyfall was unrealistic, given the exceptional interest, and marketing opportunities, the fiftieth anniversary of the series provided, and which did so much to make it the series' highest earner of all time.) And there seems to be no real consideration of wrenching changes in the series' tone, aesthetic, theme--the course of the rebooted series, if anything, reaffirmed by the choice of Cary Fukunaga to helm and Phoebe Waller-Bridge (groan) to cowrite the script.

Rather, the delay has me thinking of how a slower rate of output has been the norm for the series this century. The current plan will mean five Bond films in seventeen-and-a-half years--one every three and a half years, almost twice as long as was the norm pre-reboot. (Twenty movies in the forty years between 1962 and 2002--one every two years--with that rate maintained even in later periods, two Bond films following GoldenEye in a mere four years.)

It all strikes me as underlining what is too little admitted, what is almost immediately beaten down by the boosters when anyone breathes any such word about any series--the franchise's increasing difficulty staying relevant.

When it started out this was simply not a problem. In the '60s the Bond films, without apparent strain, contributed to and rode a wave of popular fascination with spies and jet age glamour, but what really made the series was something more distinctive to it--its invention of the action-adventure blockbuster, both the fast-paced, set piece-centered structure that casts logic to the wind and the battery of cinematographic and editing techniques that derive the most impact from those thrills.

No one else achieved anything really comparable then, despite wide imitation, which tended to settle for copying its more superficial elements. (A suave superspy? Let's do that, they all thought. But making an action movie to compare with the Bonds was generally beyond them, as a glance at the relatively high-profile Derek Flint movies shows.) This near-monopoly on its style of action movie-making remained the case even after the flood of Bond knock-offs turned into a comparative trickle. Even after the series' period of real originality (that first decade and its first half dozen movies) had passed--the novelty reduced to an occasional set piece or gimmick that would stick in the popular mind, the series repetitive of its own successes and derivative of those of others to sustain the glitter of the brand name (unmistakable in the chase-packed, blaxploitation-themed Live and Let Die)--there was not a whole lot of real, head-to-head competition. (The French Connection, Dirty Harry, sure, they were hits--but they were doing something fairly different.)

Still, if slowly, Hollywood did begin to catch up, Star Wars a signal moment in the process, and by the '80s it was adroit enough at this that the latest Bond movie could get lost in the summer crowd (up against Rambo and Mad Max and Commando in '85, against Predator and Robocop and Lethal Weapon and Beverly Hills Cop in '87, against Indiana Jones and Lethal Weapon again and Batman in '89).

It got tougher still in the '90s, and by the twenty-first century it seemed as if there was a new installment in a big-budget action franchise at the multiplex just about every week of the year--while the series' dispensing with much of what had made it distinctive made standing out all the harder. (Already with Licence to Kill the results were looking like a generic '80s action film, the producers' discomfort with the flamboyant plots and the self-indulgence of the character costing them much of what personality they had.)

As a result, instead of an assembly line more or less reliably chugging out Bond movie after Bond movie under the same team (director John Glen, who had been with the Bond franchise since the '60s, helmed five Bond movies in a row in the '80s), the producers have strained to "make it new" (ironically, while following the course of everyone else--"make it dark").

In the '90s changes of director became frequent, with the producers increasingly selecting the sort of "auteurs" who originally had no place in a franchise dominated by a creative producer (Broccoli a recipient of the Academy's Irving Thalberg Award for a reason). First there was the art-house breakout Lee Tamahori, given charge of the 40th anniversary film Die Another Day, even before the reboot saw a turn to such directors as Marc Forster, Sam Mendes, and now the aforementioned Fukunaga. Again, the grosses have been good, but I am unconvinced this particular practice has really been all that helpful. There has, too, been the attempt at telling the story of how Bond became Bond, with a certain amount of soap operatic family drama and mythmaking tossed in. Not everyone was a fan of the approach. (I wasn't.) Still, many felt that it worked for Casino Royale and Skyfall, each of which gave viewers the sense that it was more than just "another" Bond film. The reaction to Spectre, however, made fairly clear the limitations of that approach.

Is there much chance that the new team can come up with something, if only enough to give the franchise a bump like The Spy Who Loved Me, rather than continue a downward movement, the way A View to a Kill did?

I put that question to you, readers.

Saturday, June 29, 2019

The Historiography of Paramilitary Fiction

I have been reading and thinking and writing a great deal about spy fiction and military techno-thrillers.



It is not really possible to do that properly without giving a fair amount of thought to paramilitary action-adventure--a body of work I have had plenty of occasion to survey over the years.

I have found that there are plenty of works on particular series--like William H. Young's A Study of Action-Adventure Fiction: The Executioner and Mack Bolan (the one which started it all)--works on subgenres, like John Newsinger's Dangerous Men: The SAS and Popular Culture; comprehensive round-ups of at least key swaths of the work published to date, like Bradley Mengel's Serial Vigilantes of Paperback Fiction; and even a measure of critical examination of the phenomenon as a whole, like James William Gibson's Warrior Dreams: Paramilitary Culture in Post-Vietnam America.

These works all have their interest and uses for the researcher, but none of them amounts to a robust, "big picture" history of the genre's origins and development. My impression is that no one has attempted to produce such a history.

This may seem surprising. But it really isn't, given the way scholars commonly treat genres of popular fiction. They tend not to take much interest in providing a comprehensive history of a genre, looking instead at just such bits of it as fit within what happens to be fashionable at the moment. And even where a few buck the trend to produce something a little broader in perspective, the pressure to be rigorous, and the notorious tendency to specialize it can encourage, leave them hesitant to go too far in putting the puzzle pieces together. This has certainly been the case with science fiction, for example--as I discuss at some length in Cyberpunk, Steampunk and Wizardry. The result is that there is a plenitude of rigorous but specialized work, and lots of material that affords a wider view but is written in a casual way (a principal reason why, before essaying the post-1980 history of the field that was my initial concern in the book, I had to spend so much time working out what came before).



However, even more fundamental than that is the fact that at the very peak of its popularity the genre never attracted much serious critical attention. And since that peak the level of attention accorded the genre has receded very sharply indeed--so much so that Gold Eagle, which is to paramilitary action-adventure what its owner Harlequin is to romance, never seems to have even got its own Wikipedia page. The site's disambiguation page for "Gold Eagle" simply identifies it as an "imprint of Harlequin Enterprises," and links the reader to Harlequin's page. There one finds in the three thousand word main text of the article only the acknowledgment that the company does own, among its other imprints, Gold Eagle, a publisher of "male action-adventure books." That's it--no reference even made to the fact that the imprint has been shut down, in part, I suppose, because literally no news outlet bothered to report it. (In fact, I only learned about the decision by chancing on a comment about it in a forum on the personal web site of one of its major writers, and afterward independently got confirmation from another of the company's authors--while so far as I can tell, my blog post about it is the most anyone has bothered to write on the subject.) It has likely factored into this that such related genres as the action movie and the techno-thriller have received surprisingly little attention themselves, and that, all too often, what attention they have received has been along the lines discussed above (which left me putting a lot of the pieces together myself when writing about James Bond, or Star Wars, or more recently, techno-thrillers).






One can hardly write a brilliant history, or even a bad one, of a body of work one has not even deigned to notice exists.

Wednesday, June 26, 2019

Science Fiction's Side Stories?

Most of those who dig into the history of science fiction in any systematic way quickly encounter a particular outline of that history, or at any rate of the history of its leading theories and tendencies with regard to content and form, the associated movements and debates, and the figures and publications with which they are most identified. This has the genre—definable as fiction in some important way founded on speculative science—cropping up more and more in the increasingly scientifically-minded nineteenth century, with Mary Shelley, with Jules Verne, with H.G. Wells, and on into the early twentieth, when, largely through the efforts of Hugo Gernsback in the '20s, this went from being an increasingly common story trait to a genre. Gernsback did not long remain its leading light, however, the years that followed seeing the ascent of "harder," more extrapolative science fiction through Campbell's tenure at Astounding in the '30s and '40s, and the successors who built on his work in their turn in subsequent years, like the more social science-minded Horace Gold at Galaxy—while still others rebelled against it, often by favoring the standards of mainstream and even "highbrow" Modernist and postmodernist literature, as was already increasingly apparent in the '50s, the influential editors Anthony Boucher and Francis J. McComas declaring themselves interested in literature first and the speculative second, and the New Wave led by figures like Michael Moorcock, J.G. Ballard and Harlan Ellison in the '60s and '70s edging still further in that direction. What came afterward has been a fuzzier thing, less defined by dominant figures and audacious movements, and more by the synthesis of prior streams, by nostalgic evocations of earlier works and themes, and above all by postmodernist experiment at the highbrow end of the field and ruthless commercialization everywhere.

Those reading this history, if examining it closely, find that it is usually written out in a rather casual, even vague way, especially where the "big picture" is concerned. It tends to take the form of "folk history," based on the casual and casually expressed recollections of the story's heroes--an Asimov or an Aldiss in their writings about the genre, for example. Not least because they lived through a large portion of the field's history, their works and other acts the stuff of a fair chunk of it, they frequently show great insight. Still, that is a far different thing from systematically amassing the documentary and other evidence for similarly systematic examination, rigorously deriving conclusions from it, and then presenting it all in such a way that the reader can judge the claims and the evidence and decide for themselves—the sort of treatment, alas, to be found only in explicitly academic studies of much smaller portions of the history (like the work of Mike Ashley on genre science fiction's earlier days, or Colin Greenland's work on the New Wave).

In writing Cyberpunk, Steampunk and Wizardry, what I aspired to was a work providing a "big picture" of the genre's history on a more secure basis--an effort in which the more academic studies were of considerable help, but to which end I found myself having to do a good deal of primary-source investigation myself. What, I wanted to know, did Gernsback do that mattered so much? What did Campbell have to say for himself, and why was it revolutionary? And so on and so forth. Would the facts really support the claims I had seen, or would they suggest something altogether different?

Indeed, what initially began as a history of science fiction since 1980 required so much of this that much of the book wound up recapitulating what happened in earlier decades, just to provide a proper basis for analyzing more recent developments. Still, when all was said and done I found that the facts did indeed support the familiar outline. And so what I ended up with, rather than something radically different, just tweaked the outline here and there (I was surprised to see Gernsback incluing masterfully, surprised to see that most of those writers we think of as "New Wave" didn't really get what Ballard was talking about), fleshed parts of it out more than I'd seen others do, and sourced it with, I think, more than the usual caution--but the essentials were the same.



In the end, it seems just to say that the outline is reasonably faithful to what actually happened. However, at the very least the experience reminded me that the outline is also far from complete. Certainly the key figures, the key movements, in the history, were all real enough, and sincere enough in promoting their theories, tastes, standards, which did exercise their influence in the generally acknowledged ways. And in covering all that I had occasion to discuss a good deal else, not least the genre's treatment of those themes that it handles as no other form of literature can (utopia and dystopia, catastrophe and transcendence); the broader development of media, popular culture and politics, with which science fiction constantly interacted; and the standing of the genre in relation to the cultural mainstream, mass audience, upmarket reviewers and academic critics alike.

Still, much else was going on around, alongside and even underneath all this.

I think, for instance, of how the old Gernsbackian science fiction, flowing into the stream of old-time pulp fiction alongside more grounded work like Doc Savage, laid the foundations for the rise of the comic strip and comic book superhero--and the fantasies to which they all spoke--or the odd kinship between science fiction and the hard-boiled crime tradition that eventually gave us noir. I think of how, at the same time that he was refining and promoting his vision of rigorously extrapolative, thought experiment science fiction, John Campbell was preoccupied by elitist fantasies of ubermenschen rising above, transcending, the common herd, and ideas which seemed to speak to that--Scientology, General Semantics, psi. I think of how science fiction has so often been a kind of scientists' equivalent of chivalric fantasy, and how alternate history has been bound up with wish-fulfillment--perhaps, the fulfillment of very dark and disturbing wishes indeed when we look at the genre's serving up one scenario of Nazi victory in World War II after another.

And I think it is this side of the genre's history--side stories, perhaps, from the standpoint of the "main line," but interesting and significant for all that--that I will increasingly write about when I turn to the subject again.

Science Fiction as Chivalric Literature?

The chivalric tale of the Middle Ages and after glorified and flattered the feudal warrior-aristocracy. Likewise science fiction has glorified and flattered the scientist, and in much the same way—by presenting an utterly false picture of what they do. The knight-errant doing great deeds in the pursuit of fortune and glory never actually existed; David Graeber suggests in his brilliant anthropological-historical-political economic study Debt that the image may be an adaptation, for an aristocratic culture, of the merchant's pursuit of his trade. There is, too, the disinterest of such fiction in how wars are actually fought and won (or lost). Such mundane matters as numbers and logistics are given short shrift, while personal heroics are all.

Similarly, instead of endeavoring to depict the reality of science as a cumulative, collective, increasingly institutionalized enterprise, science fiction presents it as a ruggedly individualistic endeavor. The image goes hand in hand with the gross exaggeration of its figures' abilities, presenting heroes and villains alike who, for instance, are unable to go a day without idly throwing together some mind-boggling machine that will change the world, often in the course of action-adventure befitting the knight-errant. (Thus in Skylark Three Richard Seaton, while journeying in a starship of his own construction, builds a machine capable of near-instantly imparting perfect command of a foreign language to its user just to have something to do—and of course, conveniently get the heroes through the obstacles that crop up soon enough.)

Along with the power of the scientist in the lab, science fiction exaggerates the power of the scientist in society, perhaps even more than the old chivalric tale did the power and influence of its aristocrats. The knight, as an aristocrat, was a member of the ruling elite of his day, with even an impoverished nobleman legally set apart from and above the commoners, enjoying as a matter of law all manner of privileges they did not--tax exemption, priority in appointment to government posts, disproportionate voting rights in the legislature--some of which endured fairly late into modern times.

No such claim can be made for the scientist. They have no special legal status, and indeed not much privilege of any kind at all, as anyone familiar with the abundance of underemployed and underpaid adjunct professors and postdoctoral fellows, or the hell the lucky few who land steady work go through to get some grant money, can attest. Indeed, for all the pious esteem for science and its practitioners it is not the genius in the lab but the possessor of great wealth, the head of the vast bureaucratic machine, who wields power and enjoys supreme status. Moreover, to the extent that education is a qualification for the positions such figures attain, rather than a badge of the privilege that positioned them to achieve such heights, they are far more likely to be trained in law or management than in science, of which they frequently know little or nothing.

Indeed, the fantasy epitomized by the Edisonade was arguably that a talented scientist could through their skills win their way to the possession of such wealth, and the headship of such organizations (never mind the fact that, at that point, they would likely cease to do any real scientific work). Or that, even if they do not become a "tai-kun," their knowledge will make them the man or woman of the hour when it turns out that, contrary to what everyone else may have believed, what they were working on all this time was not useless. (Think of every disaster movie, every alien contact movie, every grandiose B-science fiction movie plot, period. Crisis strikes! And then a Man in Black shows up on the doorstep of some scientist who had previously been toiling quietly in obscurity to say "You'll have to come with us" and whisks them off to the White House to tell the President of the United States what to do. Not exactly how things went with climate change, was it?)

Of course, all that being the case, one might wonder: why should anyone bother to flatter scientists in this way? Certainly one reason is that, to a greater degree than later, people who were scientists, or at least had scientific training, played critical roles in establishing science fiction as a genre, while also regarding scientists as an audience worth courting. The individual who might be credited with a greater role than anyone else in bringing that about, Hugo Gernsback, was an electronics industry entrepreneur promoting amateur radio, who began his career as a publisher with what was initially a glorified catalog for his wares, Modern Electrics--in the pages of which he first presented Ralph 124C 41+ to the world. Things worked the other way as well, Gernsback not just using his stories to promote science, but apparently looking to scientists as an audience for his stories, certainly by the time of Amazing Stories. (In his editorials in that magazine he was emphatic about the potential of a writer's speculations, even quite ill-informed and fantastic ones, to inspire a scientist or technologist to the achievement of genuine breakthroughs, a not-so-subtle call on them to pay attention.) John Campbell, generally regarded as Gernsback's successor, was likewise scientifically trained (in physics at MIT), and during his tenure at Astounding saw science fiction as a quasi-scientific enterprise, with his scientist-writers (biochemist Isaac Asimov, aeronautical engineer Robert Heinlein, inventor Murray Leinster) performing thought-experiments and writing up the reports as stories in the pages of his magazine--Campbell making much of the fact that scientists did read that magazine.

With scientists publishing and writing stories promoted on such grounds, and looking for a readership among scientists and engineers, it was natural enough that there should be a certain wish-fulfillment there, in line with a sense that if the world was not like that, then it ought to be and, perhaps, also that it would be--manifest in the much fawned-over ultra-Competent Men who filled the pages of their magazines. (The plus at the end of Ralph's name was, in his future, the equivalent of an aristocrat's title, an honor bestowed upon the most accomplished of the scientific workers who were society's most esteemed members.) And of course, even if the genre changed much, these editors and their writers had such a wide and enduring influence on the field that it would have been surprising had others not followed in their footsteps in this respect (the more so as, again, so many later science fiction writers were scientists as well).

However, where the long run is concerned a more persuasive explanation seems to me to be that even if the scientist's position in society does not rate it, such flattery has been immensely useful to people who do have real power. The powerful today, at the heads of those corporate and government bureaucracies, do need scientific laborers, and they have long looked to tales like these to attract young people to scientific careers in spite of the long and rigorous education required, the scarce economic support for those going this route, and the lack of glamour and modest remuneration generally awaiting them at the end of that road. And certainly the idea that the economic elite have got where they are through scientific-technological genius has been handy in legitimizing the ever-more inordinate wealth and status they enjoy. The boosters of great wealth prefer that Americans picture Silicon Valley rather than Wall Street—and that they think Silicon Valley's billionaires are that because of super-human technical skills rather than the blend of privileged social backgrounds and "street smarts" that let them seize the main chance provided by long government subsidy of the computer field and the unleashed demons of high finance. Indeed, one could say that in the end the glorification and flattery of the scientist has, even on that level, really been all about them.

Counterfactuals as Wish-Fulfillment

In considering the historical counterfactual E.H. Carr argued in his classic What is History? that when facing a major turning point in the course of events historians do
discuss alternative courses available to the actors in the story on the assumption that the option was open, [but] go on quite correctly to explain why one course was eventually chosen rather than the other.
One could always suppose that another course was taken, which may in many cases be "theoretically conceivable," but imagining that history actually took it is "a parlour game with the might-have-beens of history" which does not "have . . . anything to do with history" as such, or with the historian's job.

If the devotion to counterfactuals was a parlor game rather than serious historiography, then what did it mean that so many devoted so much time to it? Reading Carr's book one can see a political game being played on two levels. One was the broadly philosophical—namely, right-wing hostility to those who see rationally discernible tendencies within history, or even simply argue for cause-and-effect, above all the Marxists toward whom the right was so bitterly hostile; conservative historians have championed counterfactuals precisely for that reason. (Indeed, it seems worth noting that their most prominent advocate of recent years has been the neoconservative Niall Ferguson, who explicitly stated in a self-declared "manifesto"--to be found in his introduction to the Virtual History anthology devoted to the theme--that he regards counterfactuals as valuable because they are an "antidote" to "determinist" views of history and the associated political ideologies of which he disapproves, in his list of which he pointedly includes "the class struggle" and "socialism.")

The other is the shallower level of specific events, with the counterfactual tending to be a wish-fulfillment--a tendency that, again, was heavily political, and reflected in the propensity to choose certain subjects over and over again, rather than others that, from the standpoint of thought-experiment, seem equally worthy. Carr in particular noted the tendency of such writers to choose the Russian Revolution rather than, for example, the Norman conquest of England or the American Revolution, no one wishing to "reverse the results" or "express a passionate protest against" the latter, while the anti-Communist certainly wished to do so against the former. As a result, "when they read history" they let "their imagination run riot on all the more agreeable things that might have happened," and become "indignant with the historian who goes on quietly with his job of explaining what did happen and why their agreeable wish-dreams remain unfulfilled," driving them to come up with their preferred version.

It is equally hard to argue with Carr on this point, with the tendency to treat the Russian Revolution in this way hardly less strong a half century on, historians who otherwise show more restraint unable to help themselves when writing of that subject, lamenting the "might-have-beens." (To cite one telling, recent example, Adam Tooze does so repeatedly in The Deluge where the Russian Revolution is concerned, but never anything else.)

It seems to me that what can be said of the historian's counterfactual can still more be said of the science fiction writer's alternate history, less bound by the demands of historiography, more susceptible to being mere wish-fulfillment. Considering what Carr said of the counterfactual's intellectual roots, it does not seem that the genre's development and flourishing in a context of anti-Communism, or its coalescing into a genre in a period when such sentiment was at its height, were mere coincidences.

Still more than that, Carr's remarks have me wondering about that most popular topic of that genre, an Axis victory in World War II. Previously considering the matter I thought there was a mix of reasons, namely the combination of the conflict's familiarity, the way its scale and complexity afford numerous possibilities for it playing out differently in ways of significant consequence, and the fact that despite the diversity of understanding and interpretation of the conflict, a broad gamut of opinions accepts to a certain, crucial extent, the Nazis as villains, and the fight against them a "good war." However, given the strong case for alternate history as wish-fulfillment it seems that one must ask—has this genre been catering to closet, or perhaps not so closet, Nazis all this time?

This may seem an appalling charge to make. By and large the genre's major works have, naturally, been deeply anti-Nazi, from C.M. Kornbluth's "Two Dooms" and Philip K. Dick's The Man in the High Castle forward. Still, if the Nazi victory is presented by the authors with distaste, seeing it presented at all may still afford a certain satisfaction to those who would be sympathetic to the outcome (especially when such presentation can scarcely take other forms); while the distaste of such "liberals" and "lefties" for the scenario may add to their enjoyment. It seems all the more worth considering given that, as Ronald Smelser and Edward J. Davies have argued in The Myth of the Eastern Front, a considerable subculture exists within the U.S. which romanticizes if not the Nazi cause, then the Wehrmacht which was its military instrument—partly as a result of the extent to which senior German generals were allowed to write the English-language record of the conflict, and the post-war desire to rehabilitate a conservative West German Establishment for the sake of the anti-Communist crusade.

Considering this issue one might also consider the second most common subject in alternate history (or at least, in American writing about it), namely a Confederate victory in the U.S. Civil War. There would seem to be more, not less, sympathy for such an outcome in the U.S., more openly expressed; and one might guess, a larger number of people who would similarly take pleasure in its presentation.

Both these tastes would seem to be reflected in the way a good many readers have taken S.M. Stirling's Draka novels. The books, which picture the emergence of an extremely militarist, racist culture (one with considerable Southern input, and which out-Nazis the Nazis) in southern Africa that goes on to conquer the world, appeal to both pro-Nazi and pro-Confederate fantasy. Stirling insists that his books are a dystopia, but even if one grants that he means what he says, Stirling himself has remarked in interviews that many a fan has expressed the wish to "move there."

It seems quite possible that the "Dominion of the Draka" is not the only dystopia of which that can be said.

Tuesday, June 25, 2019

The Military Techno-Thriller: A History

One little secret of writing--writers often have occasion to wonder if anyone is reading what they produce at all. (Indeed, the subject of how much scholarly and scientific research goes completely unread by anyone has turned into a genre of scholarly and scientific writing itself.)

Where my two pieces on the techno-thriller from 2009 are concerned (one for Strange Horizons, the other for the gone-but-fondly-remembered Internet Review of Science Fiction), I have found that not only did people look at them, but that they have gone on doing so for a decade after their first appearance--David Axe back in December closing a piece on the latest crop of China-themed techno-thrillers by quoting my SH article on the subject.

A decade is a very long time in Internet years, and without false modesty I think it reflects the fact that only so much is written about this genre--one reason why I have since developed what I had to offer in those two pieces into a book offering a fuller history of the military techno-thriller genre, from the first flickerings in the seventeenth and eighteenth centuries, to the emergence of the invasion story in the late Victorian era, and the broader development of the "future war" story in the hundred and fifty years since.



Now The Military Techno-thriller: A History is available in print and e-book formats from Amazon and other retailers.

Get your copy today.

Monday, June 24, 2019

The Evolution of the Thriller

Last year I published a pair of papers over at SSRN about the declining presence on the New York Times and Publishers Weekly bestseller lists of both spy fiction and military techno-thrillers after the 1980s--and along with this, the shift of the thriller genre away from high politics and international affairs, to domestic settings and "ordinary" crime (in particular, legal thrillers, serial killer profiler thrillers, forensic thrillers).

The articles were concerned more with empirically checking the (widely held) impression that this was going on against the available data than with explaining it or discussing its implications.

However, it seems to me that there are a number of such explanations.

One may be that the genres themselves were a bit played out. As I have argued for a number of years, by the 1970s the spy genre had been around for three generations, about the normal lifetime of a literary genre, and was showing it--not least in its increasing tendency to look backward, and in its penchant for self-parody and for borrowing from other genres to make new hybrids (because what it had to offer itself was looking less fresh).



The military techno-thriller seemed fresher and newer in the '80s, but it, too, was part of that very old tradition, and it seems to me that its essential requirements (politico-military scenarios, novel weaponry) were limiting, enough that it, too, was getting a bit worn-out by the '90s.



Another is that this was a matter of the Cold War's end. The plain and simple truth is that international politics only became a major theme of American popular fiction in the '60s because of that conflict, and it was only natural that its passing would mean a decline in such interest.

However, alongside these issues of exhaustion or diminished topicality there is the matter of the sort of reading experience these books offered. Thrillers about spies, techno-military crises and the like are "big picture"-oriented and information-heavy, particularly regarding the "machinery of civilization." In this they tended toward what Goethe and Schiller called the "epic." This is, in fact, part of their appeal for many. (It was certainly part of their appeal for me.)

But I suspect most readers are looking less for the epic than the "dramatic" (Goethe and Schiller's term again)--fiction offering the reader close identification with a protagonist whose struggles they follow with bated breath, and never mind all the big picture, informational stuff that tends to get in the way of that kind of experience. It seems to me that domestic settings and ordinary legal and criminal stuff fit in with that more easily than flying missiles and Presidents on the hotline to the Kremlin, in reading about which the quotidian, character-type stuff only gets in the way. (I admit it: reading Jack Ryan novels I was never really interested in Jack Ryan's home life, just the crisis stuff and high-tech stuff and action stuff.)

The tilt toward the dramatic rather than the epic apart, it may even be that people were inclining more toward easy-to-read books--which the techno-thrillers, with their multiple plot threads and their density of technological and other material detail, often were not, the reader having to keep track of too much, process too much. (Jack Ryan is only on the page for a third of The Hunt for Red October, and in the paperback edition does not show up for a hundred pages amid all the diplomatic and military maneuvering of a full-blown superpower chess game where A-10s buzz the Kirov battle-cruiser, and American and Soviet naval aviators shoot it out in the sky.)

I suspect this all the more looking at the work of perhaps the most dramatically inclined of the era's bestselling spy novelists, John le Carre. His fiction was indeed character-oriented, but his oblique, show-don't-tell prose style made a very heavy demand on the reader, so much so that in high school I found him nearly unreadable, and even after having become a fan, I am astonished that he managed to sell so many books. (As Raymond Benson declared some time ago, by the '90s Ian Fleming would have been deemed "too hard" for the broad market--and his prose was a walk in the park compared to le Carre's.)

I suspect all this, too, looking at the other sorts of popular fiction that have flourished in recent years--like that penchant for young adult-oriented dystopias. As I argued in Cyberpunk, Steampunk and Wizardry, what we tended to get in works like Suzanne Collins' Hunger Games and Veronica Roth's Divergent were dystopias with little-to-no politics (!) or world-building, information-light stories that were narrowly focused and intimate in their presentation of protagonists with whom our identification was everything; and as young adult stories, all the easier to read.

Why Nothing Ever Seems to Go Viral

As Hit Makers author Derek Thompson explained in an interview with Forbes, for something to go viral means the product or idea in question "spread through many generations of intimate sharing." People who liked it shared it with other people, who in their turn shared it with other people, who in turn shared it with other people, again and again in short sequence, rapidly multiplying the number who saw it (and shared it) in a manner analogous to the spread of an epidemic from Patient Zero.

Of course, for anyone looking for an audience--and who isn't?--the prospect of such success is enormously attractive. But it also turns out that, as Thompson himself remarks (and as many a frustrated marketer has doubtless come to suspect), it almost never happens.

When I consider how social media works this seems to me all the more obvious. Take Twitter, for example, which makes certain data conveniently available to everyone with an account--among it, the number of "impressions" their Tweets made in a given period, which is to say, the number of times the item appeared on screen in front of a follower, whether or not they looked at it.

Based on the numbers I have seen (not just my own) someone who Tweets something can expect impressions from it equivalent to 1-3 percent of their followers. Someone with a thousand followers, for example, might expect to score ten to thirty impressions per Tweet.

Not an impressive figure, is it?

And remember, impressions are just that, a thing far short of the deeper concept of "engagement," about which Twitter also provides figures to its users. Engagement means the Twitter user interacting with the Tweet in some way--if only by clicking on it.

As it happens, the rates of engagement-per-impression are not very high. An engagement rate of 0.9 percent--in other words, a less than one in a hundred chance of an impression leading to engagement--is considered a good, solid rate.

As this includes such things as accidental clicks, or multiple engagements by the same person (who might click on it, and like it, and Retweet it), one may safely assume that the odds of any one person really taking an interest in any single Tweet presented to them are lower still.

So, 0.9 percent of 1-3 percent: you might get one engagement for every ten thousand or so followers you have, with the more substantive engagements, like likes and Retweets, apt to be a fraction of that--one in several tens of thousands.
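To put the arithmetic in one place, here is a minimal Python sketch of the chain of rates just described. The figures (an impression rate of 1-3 percent of followers, taken here at 2 percent, and an engagement rate of 0.9 percent per impression) are just the rough averages cited in this post, not official Twitter statistics, and the function is my own illustration.

# A back-of-the-envelope model of a single Tweet's expected reach,
# using the approximate rates cited above (assumed figures, not official ones).

def expected_reach(followers, impression_rate=0.02, engagement_rate=0.009):
    """Return (impressions, engagements) expected for one Tweet.

    followers       -- size of the account's following
    impression_rate -- share of followers who see the Tweet (roughly 1-3%)
    engagement_rate -- share of impressions producing any engagement (roughly 0.9%)
    """
    impressions = followers * impression_rate
    engagements = impressions * engagement_rate
    return impressions, engagements

for followers in (1_000, 3_000, 25_000, 1_000_000):
    impressions, engagements = expected_reach(followers)
    print(f"{followers:>9,} followers -> ~{impressions:,.0f} impressions, "
          f"~{engagements:.2f} engagements per Tweet")

Run as written, it works out to roughly 0.2 expected engagements per Tweet for an account with 1,000 followers, and still only about half an engagement at 3,000--which is the point about the vast majority of users made below.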

Ninety-nine percent of Twitter users have 3,000 followers or less--far too few for anyone to expect that their own followers will engage any one Tweet in a meaningful way, let alone Retweet it. Apparently 0.1 percent have 25,000 followers or more, enough to have a reasonable chance of a Tweet being Retweeted. Still, given the aforementioned figures one can expect that only 0.1 percent of their followers (twenty-five people, in the case of a 25,000-follower account) have a similarly sized following of their own. And given that only one in tens of thousands of those followers of followers might Retweet, the odds that one of their followers with a large following of their own will Retweet any given Tweet of theirs are very slim indeed.

The result is that the chances of any one Tweet getting a Retweet, and still more a second Retweet, are minuscule. The chances of one Retweet leading to another leading to another Retweet, again and again, in line with the "going viral" model, are a minuscule fraction of that.

The figures of which I write here are averages. Of course, averages contain wide divergences. Some will do much better--but others will do much worse. And it is worth remembering that the bigger one's following, the lower the engagement rate they are likely to have (if someone is famous enough to have millions of followers, a lot of them are likely to be pretty casual), offsetting at least somewhat the benefit of their bigger following from the standpoint of the Tweet's wider propagation.

On top of that some argue that the continuing fragmentation of the media--social media most certainly included--is making anything's going viral even less likely than before. But it was never all that plausible a thing to hope for in the first place.

Tuesday, June 18, 2019

Reflections on Erich Maria Remarque's All Quiet on the Western Front

I recently revisited Erich Maria Remarque's All Quiet on the Western Front. The book is not a sprawling epic in the manner of War and Peace but a compact (175 pages in my paperback edition) chronicle of the experiences of its narrator, a young enlistee in the German army who by the start of the story has already been in the field a good while, through the remainder of the conflict.

As might be expected given the book's reputation, to say nothing of the stuff of the more serious war literature generally, it depicts the gruesome destruction of bodies and minds, the narrowing of horizons amid it all, the sharpening of the little pleasures the soldiers steal (often literally) in the quieter moments--a fine meal, a few hours in the company of a woman. The alienation of the foot soldier trying to hold onto life and limb and sanity from talk of the "big picture," and from the civilians they are (often justly) sure cannot understand what they have been through. The baggage and the scars they bear forever after. The martinet sergeants and medical corps quacks and corrupt officers and, just off-stage, businessmen thriving financially far behind the lines. Still, familiar as it all is, Remarque's crisp treatment expresses it with great clarity and force.

Naturally the Nazis hated it, and not solely because it tells the truth about war rather than glorifying it, but because it gave the lie to their "stab in the back" propaganda. (As the novel proceeds one sees the German army ground down, and in the concluding chapters, finally overwhelmed by superior Allied resources and manpower.)

The Nazis' hatred of Remarque's book, in fact, started a long-running conflict between the German government and Universal over it in which Hollywood, frankly, disgraced itself, subordinating even what artistic freedom the Hays Code left to foreign profits. (Ben Urwand tells that story in some detail in his very worthwhile The Collaboration: Hollywood's Pact With Hitler.)

Today one might imagine the book to be no more popular with the Nazis' heirs, of whom we now see and hear so much more than we did just a short while ago, in Germany and everywhere else. Any reader of recent historiography (for instance, neocon court historian Niall Ferguson's The Pity of War) quickly sees the evidence that the right-wingers seeking to rehabilitate that war, and war more generally, have been getting the upper hand.

That seems to me all the more reason to take a look at this deserved classic if you haven't read it, and reason for a second look if you already have.

Monday, June 17, 2019

Rewatching the '90s?

I recently had occasion to think about how we never left the '80s in the really important things--in our economics and politics, the right-wing backlash, the financialization and inequality and the rest going on and on and on.

To my surprise, this is even the case with television, as I am reminded when I run across a rerun of a TV show from the '90s. The Pretender, for instance.

This may sound odd. The long opening credits sequences have been discarded--for the sake of cramming in more commercials, I'm sure, no matter what anyone says. There is less inclination to standalone episodes and more toward arcs (albeit shoddily assembled ones). There is much pompous display of self-important edginess and middlebrow pretentiousness (the indie movie sensibility come to the small-screen).

But simply happening on an episode in the middle, especially if no one is holding a cell phone or making an ostentatiously current pop culture reference, none of this is necessarily apparent. What is apparent is the way people look and talk, the hair and the clothes, the ring of the dialogue, even the texture of the images--and they don't scream '90s the way a show from the '80s or '70s or earlier does. It might have been made yesterday. Or so it seems to me.

But then I actually am old enough to remember when the '90s was new.

I wonder, does that factor into all this?

Thursday, June 6, 2019

The End of High Concept Comedy?

Recently running into a commercial for the Will Ferrell-John C. Reilly vehicle Holmes & Watson on On Demand I was surprised that I didn't recall hearing a word about it, and eventually got around to looking it up. I quickly ran across Jesse Hassenger's article on the matter at The Verge that made it clear that not only did the film drop into theaters last December with little fanfare, before flopping, but that it appeared symptomatic of a significant trend--the decline of the A-list, star-centered comedy (the top slots at the box office, more than ever, are monopolized by sci-fi/fantasy action and splashy animated movies), which has long been noticeable but which hit a new low in 2018.

While Hassenger is right to notice the trend, his explanation--that movie-goers are getting their comedy from action movies and animation--does not really convince me. I don't deny that there may be some truth in it, but I suspect a far bigger competitor to these films, like all films, is the small screen--to which not only are movies zipping from the theater in record time, but which is replete with original comedy of its own through the burgeoning array of channels and streaming services.

To get people out of the house and make them pay $20 a ticket, parking fees, and notoriously obscene concession stand prices for mediocre food, the big screen really has to deliver something they can't get elsewhere.

There are only two things it can offer now.

One is to serve up something that plays differently on a big screen than a small.

The other is to make the film's release feel like an event--something they want to participate in right now with everybody else rather than wait two months to see it much more cheaply in the comfort of their own home.

The Marvel Cinematic Universe flourished by accomplishing both. However, neither works nearly so well for comedy. While action can seem more thrilling on a giant screen, I'm not sure it does nearly so much for slapstick, let alone one-liners. And where the creation of a sense of an event is concerned--the latest entry in an action-adventure epic seems more plausible material for such an effort than a repeat of goofy antics.

It is the case, too, that Hollywood has less incentive to try--and perhaps less ability than it used to. Where incentive is concerned, consider the things that make movies lucrative commercially--foreign earnings, sequels, and merchandising. High-concept A-list comedy is less natural material for these than action movies and family animation: comedy travels less well than action, lends itself less easily to "Part 6," and so on.

Where ability is concerned, it should be acknowledged that comedy is simply harder to do passably than action (what is funny is a lot harder to pin down than the stuff of tentpole thrills)--and the audience more unforgiving when filmmakers make a botch of it. One can botch a big action blockbuster badly, and people will still walk away feeling reasonably satisfied if they saw that quickly cut, glossy, flashy-looking $200 million parade of CGI up on the screen. (Remember Transformers 2? Despite the criticism and the outsized budgets the series made it to Transformers 5, and Bumblebee, which might be getting its own sequels even as the endurance of the main series has become uncertain.) But screw up a comedy, and the people who are not laughing, entirely aware of the fact that they are not laughing, will be quicker to punish those who have wasted their time and money.

It might be added, too, that this is not a promising political climate where much of the traditional material of comedy--the sex comedy, the romantic comedy, whose decline has already been the subject of any number of articles--is concerned. (Is the Hays Code-era Ernst Lubitsch too edgy for today's puritanical Hollywood? Quite likely.) And of course, those inside the Hollywood bubble, utterly insulated from the lives and concerns of ninety-nine percent-plus of the planet, are likely neither inclined nor equipped to tackle any real social satire (and the right-wing press ready to tear them apart if they even try).*

Naturally, I do not see the trend running its course any time soon.

* Again I cite David Graeber: "Look at a list of the lead actors of a major motion picture nowadays and you are likely to find barely a single one that can't boast at least two generations of Hollywood actors, writers, producers, and directors in their family tree. The film industry has come to be dominated by an in-marrying caste."

The Action Film's Transitional Years: Recalling the 1990s

Once upon a time the cop and, especially in the '80s, the commando were the characteristic figures of the action movie, a relatively grounded, often quite gritty, mid-budget and typically R-rated affair stressing gunplay and car chases. Today it seems that the genre is nearly synonymous with super-powered superheroes at the center of megabudgeted, Computer Generated Imagery-fueled science fiction extravaganzas, more family-friendly than not, but so big and so fast that looking right at them one can scarcely tell what is happening on the screen for much of the time. I would argue that where this was concerned, the 1990s was the key transitional period.

A Generation of Action Films: From the 1960s to the 1980s
In the 1960s the first half dozen or so Bond films largely established the essentials of the genre, from their filmic structure (specifically, the packing in of more thrills, substantially through the use of multiple set pieces, if necessary at the expense of narrative logic), to the range of essential material (the variety of fight and chase scenes, from frogmen-with-knives to ninjas to pursuits on skis), to the associated cinematographic and editing techniques (the use of close shots, short takes, jump cuts and exaggerated sound effects in portraying the fight scenes), as well as developing much of the required technology (techniques for underwater and aerial photography), and probing the limits to the intensification or scaling up of the pattern. In pace and spectacle You Only Live Twice represented a maximum that later Bond films only occasionally and marginally exceeded for decades afterward, while only once was a Bond film to cost more to make before the 1990s.



Still, it must be admitted that Hollywood was slow to replicate their success. It certainly cashed in on the Bond films' success with a rush of spy-themed movies, but generally went the cheaper, easier route of parody, while showing little alertness to what might be called the "poetics of the action movie"—saving its resources for splashy musicals, while the younger, fresher talents capable of walking a new path generally seem to have been more interested in edgier material. For all that, Hollywood did produce contemporary-set films with action in them, but a good look at the crime films and disaster films of the '60s and '70s shows that they were just regular crime dramas that might have a big car chase in the middle (Bullitt, The French Connection, Dirty Harry), or variations on Grand Hotel in which things periodically blew up or crashed or burned down (Airport, The Poseidon Adventure, The Towering Inferno).

As the years progressed Hollywood continued its movement in this direction (Magnum Force and still more The Enforcer are rather more action-packed than the original Dirty Harry), but American filmmaking only really assimilated the pattern, let alone added anything to it, with Star Wars (1977). George Lucas' blend of action movie mechanics with space operatic imagery and revolutionary special effects (principally, the use of a computer-controlled camera that permitted a greater precision and replicability in effects shots, making the process cheaper, and allowing the production of more complex, denser, more elaborate shots of the type) was, where this kind of filmmaking was concerned, a revolution for Hollywood, and the world.



Once again Hollywood was not particularly quick to learn the lesson. Just as in the heyday of the Bond films Hollywood imitated the spy theme rather than the action movie mechanics, it imitated the space theme of Star Wars rather than what the movie did with it—as a viewing of those first two great follow-ups to Star Wars' success, Alien and Star Trek: The Motion Picture, makes clear. Still, more action movies did follow, not least by way of Lucas, who scripted and produced the Steven Spielberg-helmed Raiders of the Lost Ark (1981), initiating a wave of H. Rider Haggardesque action-adventure, while encouraging the trend toward faster pacing, more gunplay and the rest—the second big-screen Star Trek film reflecting the logic in opting for a battle between Kirk and his most storied enemy, Khan (1982).

The same year also saw John Rambo make his way to the screen in a battle with the authorities that escalated into a one-man war on the National Guard, First Blood. The finale had him cutting loose on the town that mistreated him with a machine gun, but it was really James Cameron's mix of science fiction and paramilitary action, The Terminator (1984), that established the pattern for films in which the central figure, rather than shooting down opponents with a sidearm, takes them out in greater numbers with a machine gun—a pattern prevailing in the subsequent '80s action classics Commando (1985); that same year's Rambo: First Blood, Part II (1985); Cameron's sequel to what had previously been a haunted house horror movie in space, Alien, the ingeniously titled Aliens (1986); and Predator (1987), which actually begins with Arnold Schwarzenegger participating in a conventional enough commando raid in Latin America, such that one could easily mistake it for a sequel to Commando. Heavier firepower of the kind also became increasingly characteristic of the similarly prolific cop movies, like that same year's Lethal Weapon (1987), Robocop (1987) and Beverly Hills Cop II (1987).

Of course, the tendency had already hit something of a natural limit, and even self-parody, in Commando, where at the climax the protagonist mows down line after line of oncoming attackers just by facing in their direction as the machine gun he has on his hip blazes away. So did it also go for the genre more generally, a fact harder to ignore as the decade continued.* Over-the-top as the second Rambo film was, the gargantuan Rambo III (1988), at the time the most expensive production in movie history, tried to top it and simply looked foolish, while the finale of Stallone's next action film, 1989's gadget, assault vehicle and explosion-filled Tango & Cash, reinforced the impression. It was the case, too, that the angst to which the paramilitary action genre spoke increasingly felt like yesterday's issue. (Rambo was a spectacle of post-Vietnam anguish—fifteen years after the troops came home.) The box office grosses made this harder to ignore as the genre's two most iconic paramilitary franchises each flopped, the fifth Dirty Harry film, The Dead Pool, virtually ignored by moviegoers, while Rambo III made a mere third of what the second Rambo movie had in North America.

The Action Movie is Dead. Long Live the Action Movie!
One flop, or two, has never been enough to make Hollywood shift gears all by itself, the more so as at any given moment the movies to hit theaters two years on have probably already been greenlit—and while entertainment journalists, displaying their usual lack of historical memory for the business they write about and their tendency to read into any little up or down an epoch-defining trend, made noises about the end of the action film soon enough, this was not to be so. Indeed, Hollywood continued to make '80s-style Stallone and Schwarzenegger films, with Schwarzenegger in particular scoring a number of successes (Total Recall, Terminator 2, True Lies, Eraser) and Stallone managing the odd hit of his own (Cliffhanger), while milking those franchises that still seemed viable—notably Die Hard (the original having done reasonably well during the same summer in which Rambo III flopped) and Lethal Weapon (Lethal Weapon 2 having done very well in 1989, and leading to two more successful sequels in 1992 and 1998).

However, the fact that the action movie actually became more rather than less prominent at the box office during the 1990s was due to Hollywood's willingness to try other ideas. Not by any means did they all pan out. The decade saw quite a few video game-based action films, for example, like Super Mario Bros. (1993), Double Dragon (1994) and Street Fighter (1994). (In these years only the relatively low-budget, August dump month release Mortal Kombat (1995) was a big enough hit to warrant a sequel, and only one (1997).) There was, too, quite an effort to produce female-centered action films, with an American remake of Luc Besson's La Femme Nikita, Point of No Return (1993); and Geena Davis' Cutthroat Island (1995) and The Long Kiss Goodnight (1996). Things turned up a bit for both endeavors with Charlie's Angels (2000) and Lara Croft: Tomb Raider (2001), but only to a slight degree. Instead the real successes derived from other approaches.

Reuse the Die Hard Formula . . . Endlessly
In the course of squeezing those extant franchises, there was the repetition of their ideas. After the second Die Hard film the actual Die Hard franchise generally moved away from the pattern of the original—a situation in which "terrorists" or somesuch seize a building or other structure and take everyone in it hostage for the sake of executing some scheme, but miss an individual who remains loose and, despite their isolation, perhaps because they have loved ones at stake, undertakes a one-man guerrilla war against the bad guys, with the clock ticking, the authorities outside trying to help and often making a botch of it, and likely some final twist awaiting us which will show the villains' plan was not quite what we thought it was at the beginning. However, no one else moved away from it, the imitators numerous and often shameless, as the innumerable lists of them on the web show.

Steven Seagal's career high film, Under Siege (1992), was "Die Hard on a battleship." Speed (1994) was "Die Hard on a bus." The Rock (1996) was "Die Hard on Alcatraz." Air Force One was . . . "Die Hard on Air Force One," the same summer that The Rock star Nicolas Cage wound up in a similar situation on a plane full of convicts, Con Air (1997). We even got Die Hard out on the "final frontier" in the eighth Star Trek film, Star Trek: First Contact (1996), where the Borg took over the Enterprise, and Jean-Luc Picard, initially single-handed, had to recover the ship and save the day and the universe from their control.

And of course, alongside these big-budgeted hits there were a good many less celebrated efforts, like those other "Die Hard on a plane" films, Passenger 57 (1992) and Executive Decision (1996); the "Die Hard in a school" movies Toy Soldiers (1991) and Masterminds (1997) (where Patrick Stewart was the villain rather than the day-saving hero this time); "Die Hard in a stadium," Sudden Death (1995); the less successful sequels to Under Siege and Speed, Under Siege 2: Dark Territory (1995) (train), and Speed 2: Cruise Control (1997); and an abundance of straight-to-video stuff, like the Anna Nicole Smith vehicle Skyscraper (1996), which stuck with the skyscraper idea of the original because, why not, and anyway, no one was really watching that for the plot, were they? Meanwhile, one can see more modest influences of the '88 classic in more mobile action films like Cliffhanger (1993) and Broken Arrow (1996), or even the family comedy Home Alone (1990).

When American Movies Aren't Enough, Look Abroad
Besides imitating Die Hard, there was a measure of borrowing from abroad, some more successful than others, with France an obvious case. Hollywood's remake of Besson's Nikita was a critical and commercial disappointment, and the reception to Besson's own English-language film The Professional (1994) was cold, but the biggest action hit of 1994, True Lies, was a remake of Claude Zidi's La Totale! (1991).

Moreover, the borrowing from France was minor next to that from Hong Kong. Hardly a new idea for action filmmakers (Tango & Cash "borrowed" its opening from Jackie Chan's Police Story), the practice intensified drastically, with recent Hong Kong hits getting the kind of highly publicized wide releases virtually unheard of for foreign productions, starting with the Jackie Chan vehicle Rumble in the Bronx in early 1996. More significantly, Hollywood also copied the films' fight choreography, particularly the tight, choppy martial arts style made famous by Jackie Chan, the "heroic bloodshed" of director John Woo's films, and the "wire fu" of directors like Yuen Woo-ping, to such a degree that these would soon become standard. Hollywood also brought many of the figures themselves into its own movies, with Woo making American films like Hard Target (1993), Broken Arrow and Face/Off (1997), and Jet Li, Michelle Yeoh and Tsui Hark not long in following him. (Sammo Hung even got a prime time Big Three network show for two seasons, Martial Law (1998-2000) on CBS.)

Often the tropes, and the figures, were used to inject new life into tiring or idea-starved franchises. The Bond series, serving up yet another, and particularly nonsensical, do-over of The Spy Who Loved Me (itself a do-over of You Only Live Twice) in Tomorrow Never Dies (1997), sought a measure of novelty in having Bond work with a Chinese agent this time, in an adventure partly set in Vietnam, with Hong Kong star Yeoh cast as said agent and getting a martial arts sequence in which to do her thing. The next year the fourth installment of the Lethal Weapon series (1998) featured Jet Li in a supporting role that was hyped far out of proportion to his actual screen time, while the sequel to the 1996 film adaptation of the old Mission: Impossible TV series (2000) had director John Woo bringing his trademark "heroic bloodshed" into the franchise, epitomized by the climactic fight scene in which protagonist Ethan Hunt and his current adversary raced at each other on their motorcycles, then leaped off them to fight in mid-air.

More subtly, the pairing of Jackie Chan with Chris Tucker permitted Hollywood a significant success with the long-flagging buddy-cop action-comedy genre in Rush Hour (1998), while pairing Chan with Owen Wilson yielded a more minor one with the even less vibrant Western, Shanghai Noon (2000). More significant as a pop cultural moment than any one of these films, however, was The Matrix (1999), whose more than usually entertaining use of what Hong Kong martial arts films had to offer was part of the package that made it one of the decade's biggest pop cultural phenomena.

When You Run Out of Movies to Copy, Turn to TV
The 1990s, in general, was a decade of nostalgia for earlier periods, with Hollywood in particular seizing on a good many old TV shows as inspiration for big screen films—like The Addams Family, The Flintstones and The Brady Bunch, each of which was followed up by a sequel. Where the action genre was concerned the possibilities were more limited than with comedy, television—especially in its earlier days—having been able to do relatively little with action. Still, there were some efforts, the two great successes among which were, of course, The Fugitive (1993) and Mission: Impossible (1996). Notable, too, were the space opera Lost in Space (1998), and the adaptation of the James Bond-meets-Western series, Wild Wild West (1999).

By and large the "adaptations" were superficial, relying more on nostalgic evocation of brand names than anything else where the source material was concerned, and deriving much of their interest from elsewhere, even when they were being thoroughly derivative. Mission: Impossible director Brian De Palma, whose previous transformation of an old TV show into a feature film had borrowed its most famous scene from a classic film (The Untouchables, which worked in the baby carriage rolling down the steps from Sergei Eisenstein's The Battleship Potemkin), likewise turned to another classic movie for MI's most memorable sequence, the hanging-by-a-rope break-in scene from Topkapi (1964). (Much imitated since, the scene is widely assumed to have originated with Mission: Impossible, testifying to the brevity of most cinemagoers' memories.)

Unsurprisingly, there was not much of a basis for a major franchise in these. The Fugitive, whose plot about the innocent Dr. Richard Kimble's attempt to survive and clear his name was scarcely repeatable, was nonetheless a big enough hit that the producers attempted a follow-up built around Tommy Lee Jones' well-received turn as Sam Gerard, U.S. Marshals (1998), without success, the series ending there, while there was insufficient enthusiasm to attempt even that with Lost in Space or Wild Wild West. However, the Mission: Impossible series has, with the aid of new borrowings, soldiered on into the present (numbers seven and eight being shot back-to-back at last report).

More Military Hardware
As the cop-and-commando films of the '80s showed, there was a tendency to incorporate more military hardware into the adventure to up the level of action, while this heyday of the military techno-thriller saw a still broader fascination with high-tech weaponry. Where the military bildungsroman had been a reliable basis for box office success with films like Private Benjamin (1980), Stripes (1981), and An Officer and a Gentleman (1982), mixing F-14 Tomcats in aerial showdowns with foreign enemies into such a story made Top Gun the biggest hit of the year in 1986, while that same year's Iron Eagle (where Louis Gossett Jr. again played the mentor) enjoyed a more modest success.

This did not immediately lead to a slew of military-themed action movies. However, the successful adaptation of Tom Clancy's submarine thriller The Hunt for Red October (1990) for the big screen changed that, not only making a multi-movie franchise out of the adventures of its protagonist Jack Ryan (followed up in 1992 with Patriot Games and in 1994 with Clear and Present Danger), but also leading to such hits as Under Siege, Crimson Tide, Broken Arrow, Executive Decision and Air Force One. (As noted earlier, many of them were variants on the Die Hard formula, the confined spaces and destructive potential of warships and military aircraft making convenient settings for such plots, while the smaller scale of such scenarios may have made them more approachable for audiences.) At the same time there was more hardware on display in movies not built around this theme, with True Lies, the biggest action hit of its year, incorporating a Harrier jump jet into the climax, while the Bond films of the period, particularly Goldeneye (1995) and Tomorrow Never Dies, made especially lavish use of combat aircraft, warships and the like in their action scenes.

Intensified Editing
Besides recycling classic movie stunts so old that to many they seemed new, adding in Hong Kong-style martial arts and gunplay, and continuing to up the military-grade firepower, Hollywood action films worked to squeeze more effect out of such spectacle as they served up, not least through a more intensive use of editing. That approach was not wholly without precedent, in some cases precedent going far back into film history. Montage pioneer Sergei Eisenstein's penchant for short shot lengths is strikingly contemporary, while more action-oriented film has always tended to be more tightly edited than other kinds (John Ford or Howard Hawks, for instance, tending to shorter takes than Billy Wilder). Later the advent of television and the television commercial provided an arena for ultra-tight editing, facilitated by the shortness of the product, which in the 1960s became a major influence on cinema, by way of those formative James Bond films (so much so that they may be said to have helped create the action movie), and more widely evident in the work of Richard Lester on films like A Hard Day's Night (1964). From the late 1970s on, Don Simpson's famous and infamous promotion of "high concept"-style filmmaking, which turned a movie into a two-hour commercial or music video (an approach encouraged by his enlistment of makers of commercials and music videos, like Ridley and Tony Scott and Adrian Lyne, as feature film directors), and which became routine across the industry, intensified this yet again.

Still, technological change made this easier and more commonplace, notably the commercial availability of digital editing technology from 1989 on, which contributed to a sharp drop in Average Shot Length (ASL). Indeed, where feature film was concerned the possibilities were rather rapidly exploited, most notoriously by Michael Bay, yet another director of commercials and music videos who made a major splash in feature film with Bad Boys (1995), The Rock (1996) and Armageddon (1998), though others were more aggressive still, not least Paul W.S. Anderson, whose Resident Evil series in short order took the trend to what appears a natural limit in Resident Evil: Apocalypse (2004) (with an ASL of a mere 1.7 seconds), virtually no movie pushing the envelope further in the fifteen years since.
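(To put that figure in perspective: Average Shot Length is simply a film's running time divided by its number of shots, so a hypothetical 90-minute feature cut to a 1.7-second ASL would comprise roughly 90 × 60 ÷ 1.7, or about 3,200 shots, a cut every second and three-quarters for the entire length of the picture. The 90-minute figure is an illustrative assumption rather than the actual running time of any particular film.)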

More Science Fiction. Much, Much More Science Fiction—and Much, Much Less of Some Other Things . . .
As the late '80s already demonstrated, the desire of filmmakers and audiences alike for bigger and better action movie spectacle was exceeding the limits of the frameworks provided by the old cop-and-commando stories, even with the additions discussed above. And so there was a turn to the fantastic—to displays of more-than-human ability and outright superheroics (Batman), to exotic creatures (Jurassic Park), to the depiction of disaster (Twister), the cast-of-thousands epic (Braveheart), and the space opera that brought all these together (Independence Day, and above all Star Wars: Episode I—The Phantom Menace).

None of these ideas was new, of course. Indeed, there was a certain amount of nostalgia evident here as well, whether for old-fashioned superheroes, B-movie monsters, Irwin Allen disaster films, historical epics or Star Wars and its imitators. Nonetheless, new technology—above all, Computer Generated Imagery (CGI)—permitted such visions to be realized with a new ease, polish and flair. It was this that made the dinosaurs in Jurassic Park what they were, this that created those casts of thousands in a movie like Braveheart—while at the end of the decade, in The Phantom Menace, it permitted George Lucas to conjure all-CGI worlds of a kind hardly dreamed of back when he made the original Star Wars. Detractors of that film and the other prequels were harshly critical of the approach, but there is no denying that it was something new, that at least some found pleasure in the results, and that it did change the way movies were made.

Going along with the shift toward science fiction—and science fiction films of these particular kinds—there were other, subtler shifts. One, reflecting the fact that science fiction's attraction was its affording room for bigger and more exotic action, was a rise in movie budgets. As late as the mid-1980s a summer blockbuster could be made for between $10 and $20 million, as was the case with Commando, or the science fiction-tinged-but-still-thoroughly-paramilitary Aliens and Robocop. By the late 1990s budgets on the order of $100 million (and often much more) were standard.

Another was the way the movies served up their thrills. Where paramilitary adventures emphasized bloody mayhem, these larger-budgeted, more briskly edited films were broadly staged effects extravaganzas, less apt to stress gore—or grit of any kind. Along with popular film's generally declining tendency toward "gratuitous" nudity (in contrast with the bits Commando or Die Hard served up), this meant less reason for an R rating, which in any case became a riskier bet when such large budgets were at stake. The result was that where any list of '80s action movies mostly consisted of films advertised with the caution that "People under 17 years may only be admitted if accompanied by a parent or guardian," by the late '90s the R-rated action movie had generally slipped from the genre's top ranks.

To be sure, there was still a very great deal of the '80s action film in those machine gun-packing Die Hard retreads and last Schwarzenegger films and the rest. Yet it seems worth noting that the science fiction films outperformed them with increasing consistency and by widening margins, even without a Lucas or a Spielberg at the helm. In 1992 Batman Returns was the biggest action hit of the year, in 1993 Jurassic Park the highest-grossing film of the year outright, and in 1995 the next Batman sequel, Batman Forever, the biggest action success. The trend became still more pronounced in 1996 with The Rock and Eraser doing well, but nowhere near so well as Twister and Independence Day. The following year Air Force One was a hit, but well behind Men in Black and the Jurassic Park sequel The Lost World. In 1998 Lethal Weapon 4 was a moneymaker, but one well short of Godzilla, Deep Impact and Armageddon, while in 1999 The Phantom Menace, The Matrix and The Mummy were the year's action hits, all safely placing among the top ten earners of the year. By contrast, the highest-ranked, non-science fiction action film of the year was the latest entry in the perennial Bond series at #13, while Mel Gibson's Payback was down in the #26 position, and Schwarzenegger's End of Days (in spite of its supernatural and end-of-the-millennium theme) was all the way down at #33.

Unsurprisingly, the Die Hard retreads puttered to a halt sometime around 1997, and so did the techno-military action (Air Force One appears a last hurrah for both those genres as summer box office staples), while Stallone and then Schwarzenegger became steadily more marginal performers—all of this coming to look like rather B-movie stuff. Hong Kong flavoring remained very much part of any old-fashioned hand-to-hand combat that still took place, but was an element in the mix rather than the star of the show.

Meanwhile the colossally-budgeted newer-style science fiction epics were the titans fighting for the box office's commanding heights. In all that, The Matrix and its sequels seem to merit special attention. An R-rated, bullet-riddled mix of sci-fi and machine gun-packing paramilitary action that starts off in--where else?--L.A., it looked backward in that respect, while in its accent on superpowers it looked forward, and in being so Janus-faced, appears to have been very much of its moment.

* The parody of this kind of action film in Jim Abrahams' Hot Shots! Part Deux did not really have to do much more than imitate what Commando already did.

Looking Past the Hardcovers: Techno-Thrillers in Other Media

Writing about the military techno-thriller, I have generally focused on print works.

This was for several reasons--because this is where the more than century-old tradition has been longest and strongest; because my training and experience were overwhelmingly in analyzing literary texts; and because I had spent a lot of time reading techno-thriller novels back in the 1990s and wanted to write about that, the more so as very little has been written about them.

Still, recently reading David Sirota's cultural history of the '80s, Back to Our Future, the strongest part of which was its discussion of the marketing of pop militarism in the '80s in response to the "Vietnam Syndrome," I was struck by how little the print techno-thriller came up. Tom Clancy was, for two years running, the author of the year's top-selling novel (1988 and 1989), and the decade's top-selling novelist, on the strength of The Cardinal of the Kremlin and Clear and Present Danger (and The Hunt for Red October, Red Storm Rising and Patriot Games too), but save for an explanatory note to a single, brief, rather offhand reference to Jack Ryan, the name simply does not come up. (Still less is there any reference to, for instance, Stephen Coonts or Larry Bond or the rest.)

Instead the discussion focuses on a number of films and TV shows--in particular, Red Dawn and Top Gun and G.I. Joe--and video games. This does, to some extent, reflect the fact that whereas his recounting of the relevant political history relies on solid scholarship to produce a robustly comprehensive picture of that side of the period, there is an element of personal reminiscence in his discussion of the pop culture. And Sirota was, as viewers of The Goldbergs may remember, a young person at the time, much more likely to rewatch Red Dawn on VHS or play Contra on Nintendo than plow through one of Clancy's doorstops. However, that is exactly the point. A lot more people, of all ages, watched movies and TV, and played video games, than read books, even the bestselling books of the day--all the more relevant given that this portion of his study focuses on the molding of the outlook of the young.

This does not seem to me to make the print techno-thriller less important or interesting. But, being content for the time being with what I have written on that score, the book did suggest where my research might go from here: in the direction of all those works with which I was not unfamiliar (his generation is more or less my generation, and I sure did play Contra and the rest well before I picked up a book by Clancy), but to which I had been less attentive academically.

In fact, I have found myself revisiting the body of writing relevant to this side of film, television, and especially gaming history (because a "big picture" seems most elusive here). So far as I can tell it is much like those bodies of work on anything else to do with pop culture.

On the one hand there is journalism, more interested (often too much so) in retelling the personal stories of "industry players" in that manner so thrilling to devotees of the cult of the tech entrepreneur than in offering comprehensive understandings of the subject matter, or much cultural or political insight; often downright sloppily researched and written; and frequently passing off material suitable for a decent article as a doorstop of a book (the easier for them to do because they have set their sights on telling a story, and telling it sloppily).

On the other hand there is scholarship, tending in the other direction, toward the overly narrow rather than the overly broad, normally in line with the fashionable academic obsessions (with regard to subject matter and methodology) I find . . . well, less than illuminating, especially when undertaking a project like this.

Still, that suggests to me that there is more to be said about the matter--and I hope to say something of it when I get around to revisiting the subject.

Crazed Fruit: Some Thoughts

The screen adaptation of Shintaro Ishihara's second novel, Crazed Fruit, ran on TCM some time ago.

I understand the movie has been recognized with a place in the Criterion Collection.

I have not read the book. But if the film is at all representative of it, it seems that Ishihara, who has screenwriting credit, simply reshuffled the elements of his first book, Season of the Sun--snotty rich college kids who spend their time playing on the beach and with boats, going to nightclubs, trying to get laid and getting into fights; some "tough" talk about generation gaps and the worthlessness of ideas next to action that comes off as the pseudo-macho, pseudo-intellectual drivel of posers; a conflict between brothers over a girl with more than a little history, and a proneness to manipulate and use men, with the younger brother the more susceptible and the older the more tough-minded; and, just so we know the author is serious, the culmination of all that in sudden, violent death. We also see plenty of certain unattractive elements more pronounced in his other writing, and out of which he made a political career--like the racism and xenophobia with which a certain amount of sexual baggage is nakedly mixed. (Yes, there is a "The foreigners are here to take away our women!" element in this particular mix.)

The result is a piece of sh--excuse me, melodrama, inadequate to support the movie's 86 minute running time.

An interesting note: in addition to furnishing the source material and the screenplay, Ishihara also appears in the movie. And not just appears. With the camera focused on his face he bullyingly declares "I'm Ishihara!"--just in case we didn't know. And this alter ego of his, also named Ishihara, who happens to be part of a pack of snotty rich college kids having a run-in with the aforementioned pack of snotty rich college kids (in this case, also members of the Tokyo U wrestling team), participates in beating up the film's star, Shintaro Ishihara's real-life younger brother, the "James Dean of Japan," Yujiro Ishihara.

So basically Ishihara, going far beyond a subtle Alfred Hitchcock-like cameo, announces on camera that he is indeed Ishihara, then physically attacks his younger but more celebrated sibling on camera.

Make of that what you will.

In light of, well, everything the man has done since.

Interestingly, it seems Francois Truffaut was impressed, so much so that his seeing this film led to a collaboration between the French New Waver and the ultra-rightist who would later call French a "failed language"--the 1962 omnibus film Love at Twenty, to which both contributed segments.

Make of that what you will, too.

I know I do.

Tuesday, May 28, 2019

George Orwell's The Road to Wigan Pier and Obesity

Considering the unending debate on obesity, my thoughts go back to what George Orwell had to say about the dietary habits of the poor in his journalistic classic, The Road to Wigan Pier. As he remarked, rather than fruits and vegetables, whole grains and lean meat, the "basis of their diet . . . is white bread and margarine, corned beef, sugared tea, and potatoes," which he admits to finding "appalling." However,
the peculiar evil is this, that the less money you have, the less inclined you feel to spend it on wholesome food. A millionaire may enjoy breakfasting off orange juice and Ryvita biscuits; an unemployed man doesn't.
Why is that? Simply put,
When you are unemployed, which is to say when you are underfed, harassed, bored, and miserable, you don't want to eat dull wholesome food. You want something a little bit "tasty."
And then as now there was plenty of the cheap but tempting available--white bread, tinned beef, sugared tea, and the rest--because, after all, "[u]nemployment is an endless misery that has got to be constantly palliated," and such food does a rather better job of that than what the conventional wisdom recommends as nutrition.

That the observation is so little known, or appreciated, attests to how much more cited than really read Orwell, like all great writers, happens to be--and to the utter alienness of what he really thought from the mainstream today, whose perspective is, as much as ever, that of complacent "millionaires." Considering the problem of obesity, that mainstream reduces the whole matter to personal, consumption-level "lifestyle" choices, without the slightest respect for context. Its commentators have no interest in the class angle, no knowledge of, curiosity about or sympathy for the hardships faced by those less well-off. The factors that incline the poor to cheap temptation do not enter into their calculations; the only thought is of taking away the palliatives. Thus they crack down on foods with sodium and trans fat, and on large sodas, but poverty, economic insecurity and the rest are not even seen as relevant.

I will not go so far as to say that matters like advertising or portion sizes are irrelevant, or that action on them has been completely unhelpful. Business does, after all, take advantage of consumers, in this way as in others, through advertising and by foisting larger portion sizes on them. (When you buy a 20 ounce soda bottle because none of the machines in your vicinity offers 12 ounce cans, that has consequences that add up.) Equally there may be factors that are even less a matter of personal choice than the response of the deprived and stressed to temptation--not least, the effect of industrial chemicals or viruses on our bodies (for which a strong case has been made, sadly to little effect thus far). Yet the failure is significant in itself, telling of the larger problem, and a significant reason why obesity has been yet another entry in that long, long list of problems--like the wreck of the natural environment, the health care system, the financialization of the economy--that are eternally talked about, and acted on ineffectively or not at all, such that they have just got worse and worse, and can be expected to go on doing so barring a change in the way society thinks and acts about these matters.
