Tuesday, June 25, 2019

The Military Techno-Thriller: A History

One little secret of writing--writers often have occasion to wonder if anyone is reading what they produce at all. (Indeed, the subject of how much scholarly and scientific research goes completely unread by anyone has turned into a genre of scholarly and scientific writing itself.)

Where my two pieces on the techno-thriller from 2009 are concerned (one for Strange Horizons, the other for the gone-but-fondly-remembered Internet Review of Science Fiction), I have found that not only did people look at them, but that they have gone on doing so for a decade after their first appearance--David Axe back in December closing a piece on the latest crop of China-themed techno-thrillers by quoting my SH article on the subject.

A decade is a very long time in Internet years, and without false modesty I think it reflects the fact that only so much is written about this genre--one reason why I have since developed what I had to offer in those two pieces into a book offering a fuller history of the military techno-thriller genre, from the first flickerings in the seventeenth and eighteenth centuries, to the emergence of the invasion story in the late Victorian era, and the broader development of the "future war" story in the hundred and fifty years since.



Now The Military Techno-Thriller: A History is available in print and e-book formats from Amazon and other retailers.

Get your copy today.

Monday, June 24, 2019

The Evolution of the Thriller

Last year I published a pair of papers over at SSRN about the declining presence on the New York Times and Publishers Weekly bestseller lists of both spy fiction and military techno-thrillers after the 1980s--and along with this, the shift of the thriller genre away from high politics and international affairs toward domestic settings and "ordinary" crime (in particular, legal thrillers, serial killer profiler thrillers, and forensic thrillers).

The articles were more concerned with empirically checking the (widely held) impression that this was going on against the available data than with explaining it or discussing its implications.

However, it seems to me that there are a number of such explanations.

One may be that the genres themselves were a bit played out. As I have argued for a number of years, by the 1970s the spy genre had been around for three generations, about the normal lifetime of a literary genre, and was showing it--not least in its increasing tendency to look backward, and in its penchant for self-parody and for borrowing from other genres to make new hybrid genres (because what it had to offer itself was looking less fresh).



The military techno-thriller seemed fresher and newer in the '80s, but it, too, was part of that very old tradition, and it seems to me that its essential requirements (politico-military scenarios, novel weaponry) were limiting, enough that it, too, was getting a bit worn-out by the '90s.



Another is that this was a matter of the Cold War's end. The plain and simple truth is that international politics only became a major theme of American popular fiction in the '60s because of that conflict, and it was only natural that its passing would mean a decline in such interest.

However, alongside these issues of exhaustion or diminished topicality there is the matter of the sort of reading experience these books offered. Thrillers about spies, techno-military crises and the like are "big picture"-oriented and information-heavy, particularly regarding the "machinery of civilization." In this they tended toward what Goethe and Schiller called the "epic." This is, in fact, part of their appeal for many. (It was certainly part of their appeal for me.)

But I suspect most readers are looking less for the epic than the "dramatic" (Goethe and Schiller's term again)--fiction offering the reader close identification with a protagonist whose struggles they follow with bated breath, and never mind all the big picture, informational stuff that tends to get in the way of that kind of experience. It seems to me that domestic settings and ordinary legal and criminal stuff fit in with that more easily than flying missiles and Presidents on the hotline to the Kremlin, alongside which the quotidian, character-type stuff only gets in the way. (I admit it: reading Jack Ryan novels I was never really interested in Jack Ryan's home life, just the crisis stuff and high-tech stuff and action stuff.)

The tilt toward dramatic rather than epic apart, it may even be that people were inclining more toward easy-to-read books--which the techno-thrillers, with their multiple plot threads and density of technological and other material detail, often were not, the reader having to keep track of too much, process too much. (Jack Ryan is only on the page for a third of The Hunt for Red October, and in the paperback edition does not show up for a hundred pages amid all the diplomatic and military maneuvering of a full-blown superpower chess game where A-10s buzz the Kirov battle-cruiser, and American and Soviet naval aviators shoot it out in the sky.)

I suspect this all the more looking at the work of perhaps the most dramatically inclined of the era's bestselling spy novelists, John le Carre. His fiction was indeed character-oriented, but his oblique, show-don't-tell prose style made a very heavy demand on the reader, so much so that in high school I found him nearly unreadable, and even after having become a fan, I am astonished that he managed to sell so many books. (As Raymond Benson declared some time ago, by the '90s Ian Fleming would have been deemed "too hard" for the broad market--and his prose was a walk in the park compared to le Carre's.)

I suspect all this, too, looking at the other sorts of popular fiction that have flourished in recent years--like the penchant for young adult-oriented dystopias. As I argued in Cyberpunk, Steampunk and Wizardry, what we tended to get in works like Suzanne Collins' Hunger Games and Veronica Roth's Divergent were dystopias with little-to-no politics (!) or world-building--information-light stories that were narrowly focused and intimate in their presentation of protagonists with whom our identification was everything, and, as young adult stories, all the easier to read.

Why Nothing Ever Seems to Go Viral

As Hit Makers author Derek Thompson explained in an interview with Forbes, for something to go viral means the product or idea in question "spread through many generations of intimate sharing." People who liked it shared it with other people, who in their turn shared it with other people, who in turn shared it with other people, again and again in short sequence, rapidly multiplying the number who saw it (and shared it) in a manner analogous to the spread of an epidemic from Patient Zero.

Of course, for anyone looking for an audience--and who isn't?--the prospect of such success is enormously attractive. But it also turns out that, as Thompson himself remarks (and as many a frustrated marketer has doubtless come to suspect), it almost never happens.

When I consider how social media works this seems to me all the more obvious. Take Twitter, for example, which makes certain data conveniently available to everyone with an account--among that data, the number of "impressions" they made in a given period, which is to say, the number of times the item appears on screen in front of a follower, whether or not they look at it.

Based on the numbers I have seen (not just my own) someone who Tweets something can expect impressions from it equivalent to 1-3 percent of their followers. Someone with a thousand followers, for example, might expect to score ten to thirty impressions per Tweet.

Not an impressive figure, is it?

And remember, impressions are just that, a thing far short of the deeper concept of "engagement," about which Twitter also provides figures to its users. This is the Twitter user interacting with the Tweet in some way--if only by clicking on it.

As it happens, the rates of engagement-per-impression are not very high. An engagement rate of 0.9 percent--in other words, a less than one in a hundred chance of an impression leading to engagement--is considered a good, solid rate.

As this includes such things as accidental clicks, or multiple engagements by the same person (who might click on it, and like it, and Retweet it), one may safely assume that the odds of any one person really taking an interest in any single Tweet presented them is lower.

So, 0.9 percent of 1-3 percent: you might get one engagement for every ten thousand followers you have, with the more substantive engagements, like likes and Retweets, apt to be a fraction of that. One in several tens of thousands.
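To make the arithmetic concrete, here is a minimal sketch in Python of the expected-engagement calculation, assuming the rough averages quoted above (impressions at 1-3 percent of followers, engagements at 0.9 percent of impressions); the rates are illustrative, not anything Twitter publishes as gospel.

    # Expected engagements per Tweet under the rough averages cited above.
    # Both rates are assumptions for illustration, not official figures.
    def expected_engagements(followers, impression_rate, engagement_rate=0.009):
        impressions = followers * impression_rate   # 1-3% of followers see it
        return impressions * engagement_rate        # ~0.9% of those interact

    for followers in (1_000, 25_000, 1_000_000):
        low = expected_engagements(followers, impression_rate=0.01)
        high = expected_engagements(followers, impression_rate=0.03)
        print(f"{followers:>9,} followers: {low:.2f}-{high:.2f} expected engagements per Tweet")

Run it and the thousand-follower account comes out at a fraction of a single engagement per Tweet--roughly the one-in-ten-thousand-followers figure just given.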

Ninety-nine percent of Twitter users have 3,000 followers or fewer--far too few for anyone to expect that their own followers will engage any one Tweet in a meaningful way, let alone Retweet it. Apparently 0.1 percent have 25,000 followers or more, enough to have a reasonable chance of their Tweet being Retweeted. Still, given the aforementioned figure one can expect that only 0.1 percent of their followers (25 people) have a similarly sized following. And given that only one in tens of thousands of those followers of their followers might Retweet, the odds that one of their followers with a large following of their own will Retweet any of their Tweets are very slim indeed.

The result is that the chances of any one Tweet getting a Retweet, and still more a second Retweet, are minuscule. The chances of one Retweet leading to another leading to another Retweet, again and again, in line with the "going viral" model, are a minuscule fraction of that.
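One way to see why, in the epidemic terms of the "going viral" model, is to treat Retweeting as a branching process, where each share spawns some expected number of further shares (call it R). If R is below 1, the cascade almost always dies within a generation or two. A minimal sketch, with the follower count and per-follower share rate being my own illustrative assumptions drawn from the figures above:

    # Branching-process view of "going viral": if each share spawns fewer
    # than one further share on average (R < 1), cascades fizzle out.
    # The inputs are illustrative assumptions, not measured values.
    def expected_total_shares(r, generations=10):
        total, current = 1.0, 1.0
        for _ in range(generations):
            current *= r          # expected shares in the next generation
            total += current
        return total

    followers = 3_000             # a 99th-percentile account or below
    share_rate = 1 / 30_000       # "one in several tens of thousands"
    r = followers * share_rate    # expected Retweets spawned per Retweet: 0.1
    print(f"R = {r:.2f}; expected total shares: {expected_total_shares(r):.2f}")

With R at 0.1, the expected cascade converges on barely more than the original share--the opposite of an epidemic, where R has to exceed 1.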

The figures of which I write here are averages. Of course, averages contain wide divergences. Some will do much better--but others will do much worse. And it is worth remembering that the bigger one's following, the lower the engagement rate they are likely to have (if someone is famous enough to have millions of followers, a lot of them are likely to be pretty casual), offsetting at least somewhat the benefit of their bigger following from the standpoint of the Tweet's wider propagation.

On top of that some argue that the continuing fragmentation of the media--social media most certainly included--is making anything's going viral even less likely than before. But it was never all that plausible a thing to hope for in the first place.

Tuesday, June 18, 2019

Reflections on Erich Maria Remarque's All Quiet on the Western Front

I recently revisited Erich Maria Remarque's All Quiet on the Western Front. The book is not a sprawling epic in the manner of War and Peace but a compact (175 pages in my paperback edition) chronicle of the experiences of its narrator, a young enlistee in the German army who by the start of the story has already been in the field a good while, through the remainder of the conflict.

As might be expected given the book's reputation, to say nothing of the stuff of the more serious war literature generally, it depicts the gruesome destruction of bodies and minds, the narrowing of horizons amid it all, the sharpening of the little pleasures the soldiers steal (often literally) in the quieter moments--a fine meal, a few hours in the company of a woman. The alienation of the foot soldier trying to hold onto life and limb and sanity from talk of the "big picture," and from the civilians they are (often justly) sure cannot understand what they have been through. The baggage and the scars they bear forever after. The martinet sergeants and medical corps quacks and corrupt officers and, just off-stage, businessmen thriving financially far behind the lines. Still, familiar as it all is, Remarque's crisp treatment expresses it all with great clarity and force.

Naturally the Nazis hated it, and not solely because it tells the truth about war rather than glorifying it, but because it gave the lie to their "stab in the back" propaganda. (As the novel proceeds one sees the German army ground down, and in the concluding chapters, finally overwhelmed by superior Allied resources and manpower.)

The Nazis' hatred of Remarque's book, in fact, started a long-running conflict between the German government and Universal over it in which Hollywood, frankly, disgraced itself, subordinating even what artistic freedom the Hays Code left to foreign profits. (Ben Urwand tells that story in some detail in his very worthwhile The Collaboration: Hollywood's Pact With Hitler.)

Today one might imagine the book to be no more popular with the Nazis' heirs--of whom we now see and hear so much more than we did just a short while ago, in Germany and everywhere else. Any reader of recent historiography (for instance, neocon court historian Niall Ferguson's The Pity of War) quickly sees the evidence that the right-wingers seeking to rehabilitate that war, and war more generally, have been getting the upper hand.

That seems to me all the more reason to take a look at this deserved classic if you haven't read it--and reason for a second look if you already have.

Monday, June 17, 2019

Rewatching the '90s?

I recently had occasion to think about how we never left the '80s in the really important things--in our economics and politics, the right-wing backlash, the financialization and inequality and the rest going on and on and on.

To my surprise, this is even the case with television, as I am reminded when I run across a rerun of a TV show from the '90s. The Pretender, for instance.

This may sound odd. The long opening credits sequences have been discarded--for the sake of cramming in more commercials, I'm sure, no matter what anyone says. There is less inclination to standalone episodes and more toward arcs (albeit shoddily assembled ones). There is much pompous display of self-important edginess and middlebrow pretentiousness (the indie movie sensibility come to the small screen).

But simply happening on an episode in the middle, especially if no one is holding a cell phone or making an ostentatiously current pop culture reference, none of this is necessarily apparent. What is apparent is the way people look and talk, the hair and the clothes, the ring of the dialogue--even the texture of the images--and they don't scream '90s the way a show from the '80s or '70s or earlier does. It might have been made yesterday. Or so it seems to me.

But then I actually am old enough to remember when the '90s was new.

I wonder, does that factor into all this?

Thursday, June 6, 2019

The End of High Concept Comedy?

Recently running into a commercial for the Will Ferrell-John C. Reilly vehicle Holmes & Watson on On Demand I was surprised that I didn't recall hearing a word about it, and eventually got around to looking it up. I quickly ran across Jesse Hassenger's article on the matter at The Verge that made it clear that not only did the film drop into theaters last December with little fanfare, before flopping, but that it appeared symptomatic of a significant trend--the decline of the A-list, star-centered comedy (the top slots at the box office, more than ever, are monopolized by sci-fi/fantasy action and splashy animated movies), which has long been noticeable but which hit a new low in 2018.

While Hassenger is right to notice the trend, his explanation--that movie-goers are getting their comedy from action movies and animation--does not really convince me. I don't deny that there may be some truth in it, but I suspect a far bigger competitor to these films, like all films, is the small screen--to which not only are movies zipping from the theater in record time, but which is replete with original comedy of its own through the burgeoning array of channels and streaming services.

To get people out of the house and make them pay $20 a ticket, parking fees, and notoriously obscene concession stand prices for mediocre food, the big screen really has to deliver something they can't get elsewhere.

There are only two things it can offer now.

One is to serve up something that plays differently on a big screen than a small.

The other is to make the film's release feel like an event--something they want to participate in right now with everybody else rather than wait two months to see it much more cheaply in the comfort of their own home.

The Marvel Cinematic Universe flourished by accomplishing both. However, neither works nearly so well for comedy. While action can seem more thrilling on a giant screen, I'm not sure it does nearly so much for slapstick, let alone one-liners. And where the creation of a sense of an event is concerned--the latest entry in an action-adventure epic seems more plausible material for such an effort than a repeat of goofy antics.

It is the case, too, that Hollywood has less incentive to try--and perhaps less ability than it used to have. Where incentive is concerned, consider the things that make movies lucrative commercially--foreign earnings, sequels, and merchandising. High-concept A-list comedy is less natural material for these than action movies and family animation. Comedy travels less well than action, and it lends itself less easily to "Part 6" and the rest.

Where ability is concerned, it should be acknowledged that comedy is simply harder to do passably than action (what is funny is a lot harder to pin down than the stuff of tentpole thrills)--and the audience more unforgiving when filmmakers make a botch of it. One can botch a big action blockbuster badly, and people will still walk away feeling reasonably satisfied if they saw that quickly cut, glossy, flashy-looking $200 million parade of CGI up on the screen. (Remember Transformers 2? Despite the criticism and the outsized budgets the series made it to Transformers 5, and Bumblebee, which might be getting its own sequels even as the endurance of the main series has become uncertain.) But screw up a comedy, and the people who are not laughing, entirely aware of the fact that they are not laughing, will be quicker to punish those who have wasted their time and money.

It might be added, too, that this is not a promising political climate where much of the traditional material of comedy--the sex comedy, and the romantic comedy whose decline has already been the subject of any number of articles--is concerned. (Is Hays Code-era Ernst Lubitsch too edgy for today's puritanical Hollywood? Quite likely.) And of course, those inside the Hollywood bubble, utterly insulated from the lives and concerns of ninety-nine percent-plus of the planet, are likely neither inclined nor equipped to tackle any real social satire (with the right-wing press ready to tear them apart if they even try).*

Naturally, I do not see the trend running its course any time soon.

* Again I cite David Graeber: "Look at a list of the lead actors of a major motion picture nowadays and you are likely to find barely a single one that can't boast at least two generations of Hollywood actors, writers, producers, and directors in their family tree. The film industry has come to be dominated by an in-marrying caste."

The Action Film's Transitional Years: Recalling the 1990s

Once upon a time the cop and, especially in the '80s, the commando were the characteristic figures of the action movie, a relatively grounded, often quite gritty, mid-budget and typically R-rated affair stressing gunplay and car chases. Today it seems that the genre is nearly synonymous with super-powered superheroes at the center of megabudgeted, Computer Generated Imagery-fueled science fiction extravaganzas, more family-friendly than not, but so big and so fast that looking right at them one can scarcely tell what is happening on the screen for much of the time. I would argue that where this was concerned, the 1990s was the key transitional period.

A Generation of Action Films: From the 1960s to the 1980s
In the 1960s the first half dozen or so Bond films largely established the essentials of the genre, from their filmic structure (specifically, the packing in of more thrills, substantially through the use of multiple set pieces, if necessary at the expense of narrative logic), to the range of essential material (the variety of fight and chase scenes, from frogmen-with-knives to ninjas to pursuits on skis), to the associated cinematographic and editing techniques (the use of close shots, short takes, jump cuts and exaggerated sound effects in portraying the fight scenes), as well as developing much of the required technology (techniques for underwater and aerial photography), and probing the limits to the intensification or scaling up of the pattern. In pace and spectacle You Only Live Twice represented a maximum that later Bond films only occasionally and marginally exceeded for decades afterward, while only once was a Bond film to cost more to make before the 1990s.



Still, it must be admitted that Hollywood was slow to replicate their success. It certainly cashed in on the Bond films' popularity with a rush of spy-themed movies, but generally went the cheaper, easier route of parody, while showing little alertness to what might be called the "poetics of the action movie"—saving its resources for splashy musicals, while the younger, fresher talents capable of walking a new path generally seem to have been more interested in edgier material. For all that, Hollywood did produce contemporary-set films with action in them, but a good look at the crime films and disaster films of the '60s and '70s shows that they were just regular crime dramas that might have a big car chase in the middle (Bullitt, The French Connection, Dirty Harry), or variations on Grand Hotel in which things periodically blew up or crashed or burned down (Airport, The Poseidon Adventure, The Towering Inferno).

As the years progressed Hollywood continued its movement in this direction (Magnum Force and still more The Enforcer are rather more action-packed than the original Dirty Harry), but American filmmaking only really assimilated the pattern, let alone added anything to it, with Star Wars (1977). George Lucas' blend of action movie mechanics with space operatic imagery and revolutionary special effects (principally, the use of a computer-controlled camera that permitted a greater precision and replicability in effects shots, making the process cheaper, and allowing the production of more complex, denser, more elaborate shots of the type) was, where this kind of filmmaking was concerned, a revolution for Hollywood, and the world.



Once again Hollywood was not particularly quick to learn the lesson. Just as in the heyday of the Bond films Hollywood imitated the spy theme rather than the action movie mechanics, it imitated the space theme of Star Wars rather than what the movie did with it—as a viewing of those first two great follow-ups to Star Wars' success, Alien and Star Trek: The Motion Picture, makes clear. Still, more action movies did follow, not least by way of Lucas, who scripted and produced the Steven Spielberg-helmed Raiders of the Lost Ark (1981), initiating a wave of H. Rider Haggardesque action-adventure, while encouraging the trend toward faster pacing, more gunplay and the rest—the second big-screen Star Trek film reflecting the logic in opting for a battle between Kirk and his most storied enemy, Khan (1982).

The same year also saw John Rambo make his way to the screen in a battle with the authorities that escalated into a one-man war on the National Guard, First Blood. The finale had him cutting loose on the town that mistreated him with a machine gun, but it was really James Cameron's mix of science fiction and paramilitary action, The Terminator (1984), that established the pattern for films where the central figure, rather than shooting down opponents with a sidearm, takes them out in greater numbers with a machine gun—a pattern prevailing in the subsequent '80s action classics Commando (1985); that same year's Rambo: First Blood, Part II (1985); Cameron's sequel to what had previously been a haunted house horror movie in space, Alien, the ingeniously titled Aliens (1986); and Predator (1987), which actually begins with Arnold Schwarzenegger participating in a conventional enough commando raid in Latin America, such that one could easily mistake it for a sequel to Commando. Heavier firepower of the kind also became increasingly characteristic of the similarly prolific cop movies, like Lethal Weapon (1987), Robocop (1987) and Beverly Hills Cop II (1987).

Of course, the tendency had already hit something of a natural limit, and even self-parody, in Commando, where at the climax the protagonist mows down line after line of oncoming attackers just by facing in their direction as the machine gun he has on his hip blazes away. So did it also go for the genre more generally, a fact harder to ignore as the decade continued.* Over-the-top as the second Rambo film was, the gargantuan Rambo III (1988), at the time the most expensive production in movie history, tried to top it, and simply looked foolish, while the finale of Stallone's next action film, 1989's gadget, assault vehicle and explosion-filled Tango & Cash, reinforced the impression. It was the case, too, that the angst to which the paramilitary action genre spoke increasingly felt like yesterday's issue. (Rambo was a spectacle of post-Vietnam anguish—fifteen years after the troops came home.) The box office grosses made this harder to ignore as the genre's two most iconic franchises each flopped, the fifth Dirty Harry film, The Dead Pool, virtually ignored by moviegoers, while Rambo III made a mere third of what the second Rambo movie had in North America.

The Action Movie is Dead. Long Live the Action Movie!
One flop, or two, has never been enough to make Hollywood shift gears all by itself, the more so as at any given moment the movies to hit theaters two years on have probably already been greenlit—and while entertainment journalists, displaying their usual lack of historical memory for the business they write about and their tendency to read into any little up or down an epoch-defining trend, made noises about the end of the action film soon enough, this was not to be so. Indeed, Hollywood continued to make '80s-style Stallone and Schwarzenegger films, the two stars scoring a number of successes between them (Total Recall, Terminator 2, Cliffhanger, True Lies, Eraser), while milking those franchises that still seemed viable—with relative success in the cases of Die Hard (the original reasonably successful during that same summer in which Rambo III flopped) and Lethal Weapon (Lethal Weapon 2 having done very well in 1989, and leading to two more successful sequels in 1992 and 1998).

However, the fact that the action movie actually became more rather than less prominent at the box office during the 1990s was due to Hollywood's willingness to try other ideas. Not by any means did they all pan out. The decade saw quite a few video game-based action films, for example, like Super Mario Bros. (1993), Double Dragon (1994) and Street Fighter (1994). (In these years only the relatively low-budget, August dump-month release Mortal Kombat (1995) was a big enough hit to warrant a sequel, and only one (1997).) There was, too, quite an effort to produce female-centered action films, with an American remake of Luc Besson's La Femme Nikita, Point of No Return (1993); and Geena Davis' Cutthroat Island (1995) and The Long Kiss Goodnight (1996). Things turned up for both endeavors with Charlie's Angels (2000) and Lara Croft: Tomb Raider (2001), but only to a slight degree. Instead the real successes derived from other approaches.

Reuse the Die Hard Formula . . . Endlessly
In the course of squeezing those extant franchises, there was the repetition of their ideas. After the second Die Hard film the actual Die Hard franchise generally moved away from the pattern of the original—a situation in which "terrorists" or somesuch seize a building or other structure and take everyone in it hostage for the sake of executing some scheme; miss an individual who remains loose and who, despite their isolation, and perhaps because they have loved ones at stake, undertakes a one-man guerrilla war against the bad guys, with the clock ticking, the authorities outside trying to help and often making a botch of it, and likely some final twist awaiting us which will show the villains' plan was not quite what we thought it was at the beginning. However, no one else moved on, the imitators numerous and often shameless, as the innumerable lists of them on the web show.

Steven Seagal's career-high film, Under Siege (1992), was "Die Hard on a battleship." Speed (1994) was "Die Hard on a bus." The Rock (1996) was "Die Hard on Alcatraz." Air Force One was . . . "Die Hard on Air Force One," released the same summer that The Rock star Nicolas Cage wound up in a similar situation on a plane full of convicts, Con Air (1997). We even got Die Hard out on the "final frontier" in the eighth Star Trek film, Star Trek: First Contact (1996), where the Borg took over the Enterprise, and Jean-Luc Picard, initially single-handed, had to recover the ship and save the day and the universe from their control.

And of course, alongside these big-budgeted hits there were a good many less celebrated efforts, like those other "Die Hard on a plane" films, Passenger 57 (1992) and Executive Decision (1996); the "Die Hard in a school" movies Toy Soldiers (1991) and Masterminds (1997) (where Patrick Stewart was the villain rather than the day-saving hero this time); "Die Hard in a stadium," Sudden Death (1995); the less successful sequels to Under Siege and Speed, Under Siege 2: Dark Territory (1995) (train), and Speed 2: Cruise Control (1997); and an abundance of straight-to-video stuff, like the Anna Nicole Smith vehicle Skyscraper (1996), which stuck with the skyscraper idea of the original because, why not, and anyway, no one was really watching that for the plot, were they? Meanwhile, one can see more modest influences of the '88 classic in more mobile action films like Cliffhanger (1993) and Broken Arrow (1996), or even the family comedy Home Alone (1990).

When American Movies Aren't Enough, Look Abroad
Besides imitating Die Hard, there was a measure of borrowing from abroad, some more successful than others, with France an obvious case. Hollywood's remake of Besson's Nikita was a critical and commercial disappointment, and the reception to Besson's own English-language film The Professional (1994) was cold, but the biggest action hit of 1994, True Lies, was a remake of Claude Zidi's La Totale! (1991).

Moreover, the borrowing from France was minor next to that from Hong Kong. Hardly a new idea for action filmmakers (Tango & Cash "borrowed" its opening from Jackie Chan's Police Story), the borrowing intensified drastically, with recent Hong Kong hits getting the kind of highly publicized wide releases virtually unheard of for foreign productions, starting with the Jackie Chan vehicle Rumble in the Bronx in early 1996. More significantly, Hollywood also copied the fight choreography of the films, particularly the tight, choppy martial arts style made famous by Jackie Chan, the "heroic violence" of director John Woo's films, and the "wire fu" of directors like Yuen Woo-ping, to such a degree that they would soon become standard. Hollywood also brought many of the figures themselves into its own movies, with Woo making American films like Hard Target (1993), Broken Arrow and Face/Off (1997), and Jet Li, Michelle Yeoh and Tsui Hark not long in following them. (Sammo Hung even got a prime time Big Three network show for two seasons, Martial Law (1998-2000) on CBS.)

Often the tropes, and the figures, were used to inject new life into tiring or idea-starved franchises. The Bond series, serving up yet another, and particularly nonsensical, do-over of The Spy Who Loved Me (itself a do-over of You Only Live Twice) in Tomorrow Never Dies (1997), sought a measure of novelty in having Bond work with a Chinese agent this time, in an adventure set in Vietnam, with Hong Kong star Yeoh cast as said agent and getting a martial arts sequence in which to do her thing. The next year the fourth installment of the Lethal Weapon series (1998) featured Jet Li in a supporting role that was hyped far out of proportion to his actual screen time, while the sequel to the 1996 film adaptation of the old Mission: Impossible TV series (2000) had director John Woo bringing his trademark "heroic violence" into the franchise, epitomized by the climactic fight scene in which protagonist Ethan Hunt and his current adversary raced at each other on their motorcycles, then leaped off them to fight in mid-air.

More subtly, the pairing of Jackie Chan with Chris Tucker permitted Hollywood a significant success with the long-flagging buddy-cop action-comedy genre, Rush Hour (1998), and a more minor one with the even less vibrant Western, Shanghai Noon (2000). More significant as a pop cultural moment than any one of these films, however, a more than usually entertaining use of what Hong Kong martial arts films had to offer was part of the package that made the original The Matrix (1999) one of the decade's biggest pop cultural phenomena.

When You Run Out of Movies to Copy, Turn to TV
The 1990s, in general, was a decade of nostalgia for earlier periods, with Hollywood in particular seizing on a good many old TV shows as inspiration for big screen films—like The Addams Family, The Flintstones and The Brady Bunch, each of which was followed up by a sequel. Where the action genre was concerned the possibilities were more limited than with comedy, because of television's limitations in producing action, especially in its earlier days. Still, there were some efforts, the two great successes among which were, of course, The Fugitive (1993) and Mission: Impossible (1996). Notable, too, were the space opera Lost in Space (1998), and the big-screen take on the James Bond-meets-Western series, Wild Wild West (1999).

By and large the "adaptations" were superficial, relying more on nostalgic evocation of brand names than anything else where the source material was concerned, and deriving much of their interest from elsewhere, even when they were being thoroughly derivative. Mission: Impossible director Brian De Palma, whose previous transformation of an old TV show into a feature film had borrowed its most famous scene from a classic film (The Untouchables, which worked in the baby carriage rolling down the steps from Sergei Eisenstein's The Battleship Potemkin), likewise turned to another classic movie for MI's most memorable sequence, the hanging-by-a-rope break-in scene from Topkapi (1964). (Much imitated since, it is a scene everyone seems to think Mission: Impossible did first, testifying to the brevity of most cinemagoers' memories.)

Unsurprisingly, there was not much of a basis for a major franchise in these. The Fugitive, whose plot about the innocent Dr. Richard Kimble's attempt to survive and clear his name was scarcely repeatable, was nonetheless a big enough hit that the producers endeavored to continue the series on the basis of Tommy Lee Jones' well-received turn as Sam Gerard, in U.S. Marshals (1998), and did not succeed, the series ending there, while there was insufficient enthusiasm to attempt even that with Lost in Space or Wild Wild West. However, the Mission: Impossible series has, with the aid of new borrowings, soldiered on into the present (numbers seven and eight being shot back-to-back at last report).

More Military Hardware
As the cop-and-commando films of the '80s showed, there was a tendency to incorporate more military hardware into the adventure to up the level of action, while this heyday of the military techno-thriller saw a still broader fascination with high-tech weaponry. Where the military bildungsroman had been a reliable basis for box office success with films like Private Benjamin (1980), Stripes (1981), and An Officer and a Gentleman (1982), mixing F-14 Tomcats in aerial showdowns with foreign enemies into such a story made Top Gun the biggest hit of the year in 1986, while that same year's Iron Eagle (where Louis Gossett Jr. again played the mentor) enjoyed a more modest success.

This did not immediately lead to a slew of military-themed action movies. However, the successful adaptation of Tom Clancy's submarine thriller The Hunt for Red October (1990) for the big screen changed that, not only making a multi-movie franchise out of the adventures of its protagonist Jack Ryan (followed up in 1992 with Patriot Games and 1994 with Clear and Present Danger), but also leading to such hits as Under Siege, Crimson Tide, Broken Arrow, Executive Decision and Air Force One. (As noted earlier, many of them were variants on the Die Hard formula, and it does seem worth noting that the confined spaces and destructive potential of warships and military aircraft made convenient settings for such plots, while the fact that such scenarios were smaller in scale may have made them more approachable for audiences.) At the same time there was more hardware on display in movies not built around this theme, with True Lies, the biggest action hit of its year, incorporating a Harrier jet fighter into the climax, while it seems worth noting that the Bond films of the period, particularly GoldenEye (1995) and Tomorrow Never Dies, made especially lavish use of combat aircraft, warships and the like in their action scenes.

Intensified Editing
Besides recycling classic movie stunts so old that to many they seemed new, adding in Hong Kong-style martial arts and gunplay, and continuing to up the military-grade firepower, Hollywood action films worked to squeeze more effect out of such spectacle as they served up, not least through a more intensive use of editing. That approach, of course, was not wholly without precedent, in some cases with precedent going far back into film history. Montage pioneer Sergei Eisenstein's penchant for short shot lengths is strikingly contemporary, while it is worth noting that more action-oriented film has always tended to be more tightly edited than other kinds (John Ford or Howard Hawks, for instance, tending to shorter takes than Billy Wilder). Later the advent of television and the television commercial provided an arena for ultra-tight editing, facilitated by the shortness of the product, which in the 1960s became a major influence on cinema, by way of those formative James Bond films (so much so that they may be said to have helped create the action movie), and more widely evident in moviemaking in the work of Richard Lester on films like A Hard Day's Night (1964). In the 1970s Don Simpson began his famous and infamous promotion of "high concept"-style filmmaking, which turned a movie into a two-hour commercial or music video (an approach encouraged by his enlistment of makers of commercials and music videos, like Ridley and Tony Scott and Adrian Lyne, as feature film directors), and which, becoming routine across the industry, intensified this yet again.

Still, technological change made this easier and more commonplace, notably the commercial availability of digital editing technology from 1989 on, which contributed to a sharp drop in average shot lengths. Indeed, where feature film was concerned the possibilities were rather rapidly exploited, most notoriously by Michael Bay, yet another director of commercials and music videos who made a major splash in feature film with Bad Boys (1995), The Rock (1996) and Armageddon (1998), though others were more aggressive still, not least Paul W.S. Anderson. It was the latter who, in short order, took the trend to what appears a natural limit in Resident Evil: Apocalypse (2004) (with an average shot length of a mere 1.7 seconds), virtually no movie pushing the envelope further in the fifteen years since.
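For a sense of what that 1.7-second figure means in practice, average shot length is simply running time divided by the number of shots; a quick back-calculation (the runtime here is a round assumption of mine, not the film's exact length):

    # Average shot length (ASL) = running time / number of shots.
    # Back-calculating the shot count a 1.7-second average implies
    # for a roughly 90-minute feature (an assumed round figure).
    runtime_seconds = 90 * 60
    average_shot_length = 1.7
    shots = runtime_seconds / average_shot_length
    print(f"~{shots:.0f} shots")   # ~3,176 -- versus the several hundred
                                   # a classical-era average of ~10 seconds gives

That is thousands of cuts in a single feature, an editing pace unimaginable before digital workstations.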

More Science Fiction. Much, Much More Science Fiction—and Much, Much Less of Some Other Things . . .
As the late '80s already demonstrated, the desire of filmmakers and audiences alike for bigger and better action movie spectacle was exceeding the limits of the frameworks provided by the old cop-and-commando stories, even with the additions discussed above. And so there was a turn to the fantastic—to displays of more than superhuman ability and outright superheroics (Batman), to exotic creatures (Jurassic Park), to the depiction of disaster (Twister), the cast-of-thousands epic (Braveheart), and the space opera that brought all these together (Independence Day, and above all Star Wars: Episode I—The Phantom Menace).

None of these ideas was new, of course. Indeed, there was a certain amount of nostalgia evident here as well, whether for old-fashioned superheroes, B-movie monsters, Irwin Allen disaster films, historical epics or Star Wars and its imitators. Nonetheless, new technology—above all, Computer Generated Imagery (CGI)—permitted such visions to be realized with a new ease, polish and flair. It was this that made the dinosaurs in Jurassic Park what they were, this that created those casts of thousands in a movie like Braveheart—while at the end of the decade, in The Phantom Menace, it permitted George Lucas to conjure all-CGI worlds, such as was hardly dreamed of back when he made the original Star Wars. Detractors of that film and the other prequels were harshly critical of the approach, but there is no denying that it was something new, that at least some found pleasure in the results, and that it did change the way movies were made.

Going along with the shift toward science fiction, and at that, science fiction films of these particular kinds, there were other, subtler shifts. One, reflecting the fact that science fiction's attraction was its affording room for bigger and more exotic action, was the rise in movie budgets. As late as the mid-1980s a summer blockbuster could be made for between $10 and $20 million, as was the case with Commando, or the science fiction-tinged-but-still-thoroughly-paramilitary Aliens and Robocop. By the late 1990s budgets on the order of $100 million (and often much more) were standard.

Another was the way the movies served up their thrills. Where paramilitary adventures emphasized bloody mayhem, these larger-budgeted, more briskly edited films were broadly staged effects extravaganzas, less apt to stress gore—or grit of any kind. Along with popular film's generally declining tendency toward "gratuitous" nudity (in contrast with the bits Commando or Die Hard served up), this meant less reason for an R rating, which in any case became a riskier bet when such large budgets were at stake. The result was that where any list of '80s action movies mostly consists of movies advertised with the caution that "People under 17 years may only be admitted if accompanied by a parent or guardian," R-rated action movies generally slipped from their genre's top ranks in the '90s.

To be sure, there was still a very great deal of the '80s action film in those machine gun-packing Die Hard retreads and last Schwarzenegger films and the rest. Yet, it seems worth noting that the science fiction films outperformed them with increasing consistency and by widening margins, even without a Lucas or a Spielberg at the helm. In 1992 Batman Returns was the year's biggest action movie, in 1993 Jurassic Park the highest-grossing film of the year, and in 1995 the next Batman sequel, Batman Forever, the biggest action success. The trend became still more pronounced in 1996, with The Rock and Eraser doing well, but nowhere near so well as Twister and Independence Day. The following year Air Force One was a hit, but well behind Men in Black and the Jurassic Park sequel The Lost World. In 1998 Lethal Weapon 4 was a moneymaker, but one well short of Godzilla, Deep Impact and Armageddon, while in 1999 The Phantom Menace, The Matrix and The Mummy were the year's action hits, all safely placing among the top ten earners of the year. By contrast, the highest-ranked non-science fiction action film of the year was the latest entry in the perennial Bond series at #13, while Mel Gibson's Payback was down in the #26 position, and Schwarzenegger's End of Days (in spite of its supernatural and end-of-the-millennium theme) was all the way down at #33.

Unsurprisingly, the Die Hard retreads puttered to a halt sometime around 1997, and so did the techno-military action (Air Force One appears a last hurrah for both those genres as summer box office staples), while Stallone and then Schwarzenegger became steadily more marginal performers—all of this coming to look like rather B-movie stuff. Hong Kong flavoring remained very much part of any old-fashioned hand-to-hand combat that still took place, but was an element in the mix rather than the star of the show.

Meanwhile the colossally-budgeted newer-style science fiction epics were the titans fighting for the box office's commanding heights. In all that, The Matrix and its sequels seem to merit special attention. An R-rated, bullet-riddled mix of sci-fi and machine gun-packing paramilitary action that starts off in--where else?--L.A., it looked backward in that respect, while in its accent on superpowers it looked forward, and in being so Janus-faced, appears to have been very much of its moment.

* The parody of this kind of action film in Jim Abrahams' Hot Shots! Part Deux did not really have to do much more than imitate what Commando already did.

Looking Past the Hardcovers: Techno-Thrillers in Other Media

Writing about the military techno-thriller, I generally focused on print works.

This was for several reasons: because this is where the over-century-old tradition has been longest and strongest; because my training and experience were overwhelmingly in analyzing literary texts; and because I had spent a lot of time reading techno-thriller novels back in the 1990s and wanted to write about that, the more so as very little has been written about them.

Still, recently reading David Sirota's cultural history of the '80s Back to Our Future, the strongest part of which was its discussion of the marketing of pop militarism in the '80s in response to the "Vietnam Syndrome," I was struck by how little the print techno-thriller came up. Tom Clancy was, for two years running, the author of the year's top-selling novel (1988 and 1989), and the decade's top-selling novelist, on the strength of The Cardinal of the Kremlin and Clear and Present Danger (and The Hunt for Red October, Red Storm Rising and Patriot Games too), but save for an explanatory note for a single, brief, rather offhand, reference to Jack Ryan, the name simply does not come up. (Still less is there any reference to, for instance, Stephen Coonts or Larry Bond or the rest.)

Instead the discussion focuses on a number of films and TV shows--in particular, Red Dawn and Top Gun and G.I. Joe--and video games. This does, to some extent, reflect the fact that where his recounting of the relevant political history relies on solid scholarship to produce a robustly comprehensive picture of that side of the period, there is an element of personal reminiscence in his discussion of the pop culture. And Sirota was, as Goldbergs viewers remember, a young person at the time, much more likely to rewatch Red Dawn on VHS or play Contra on Nintendo than plow through one of Clancy's doorstops. However, that is exactly the point. A lot more people, of all ages, watched movies and TV, and played video games, than read books, even the bestselling books of the day, and this was all the more relevant with his focus in that portion of his study on the molding of the outlook of the young.

This does not seem to me to make the print techno-thriller less important or interesting. But, content for the time being with what I have written about that, it did suggest to me where my research might go from here, in the direction of all those works with which I was not unfamiliar (his generation is more or less my generation, and I sure did play Contra and the rest well before I picked up a book by Clancy), but to which I had been less attentive academically.

In fact, I have found myself revisiting the body of writing relevant to this side of film, television, and especially gaming history (because a "big picture" seems most elusive here). So far as I can tell it is much like those bodies of work on anything else to do with pop culture.

On the one hand there is journalism, more interested (often too much so) in retelling the personal stories of "industry players" in that manner so thrilling to devotees of the cult of the tech entrepreneur than in offering comprehensive understandings of the subject matter, or much cultural or political insight; often downright sloppily researched and written; and frequently passing off material suitable for a decent article as a doorstop of a book (the easier for them to do because they have set their sights on telling a story, and telling it sloppily).

On the other hand there is scholarship, tending in the other direction, toward the overly narrow rather than the overly broad, normally in line with the fashionable academic obsessions (with regard to subject matter and methodology) I find . . . well, less than illuminating, especially when undertaking a project like this.

Still, that suggests to me that there is more to be said about the matter--and I hope to say something of it when I get around to revisiting the subject.

Crazed Fruit: Some Thoughts

The screen adaptation of Shintaro Ishihara's second novel, Crazed Fruit, ran on TCM some time ago.

I understand the movie has been recognized with a place in the Criterion Collection.

I have not read the book. But if the film is at all representative of it, it seems that Ishihara, who has screenwriting credit, simply reshuffled the elements of his first book, Season of the Sun--snotty rich college kids who spend their time playing on the beach and with boats, going to nightclubs, trying to get laid and getting into fights; some "tough" talk about generation gaps and the worthlessness of ideas next to action that comes off as the pseudo-macho, pseudo-intellectual drivel of posers; a conflict between brothers over a girl with more than a little history, and a proneness to manipulate and use men, with the younger brother the more susceptible and the older the more tough-minded; and just so we know the author is serious, the culmination of all that in sudden, violent death. We also see plenty of certain unattractive elements more pronounced in his other writing, and out of which he made a political career--like the racism and xenophobia with which a certain amount of sexual baggage is nakedly mixed. (Yes, there is a "The foreigners are here to take away our women!" element in this particular mix.)

The result is a piece of sh--excuse me, melodrama, inadequate to support the movie's 86-minute running time.

An interesting note: in addition to furnishing the source material and the screenplay, Ishihara also appears in the movie. And not just appears. With the camera focused on his face he loudly declares "I'M ISHIHARA!"--just in case we didn't know. And this alter ego of his, also named Ishihara, part of a pack of snotty rich college kids (in this case, also members of the Tokyo U wrestling team) having a run-in with the aforementioned pack of snotty rich college kids, participates in beating up the film's star, Shintaro Ishihara's real-life younger brother, the "James Dean of Japan," Yujiro Ishihara.

So basically Ishihara, going far beyond a subtle Alfred Hitchcock-like cameo, announces on camera that he is indeed Ishihara, then physically attacks his younger but more celebrated sibling on camera.

Make of that what you will.

In light of, well, everything the man has done since.

Interestingly, it seems Francois Truffaut was impressed, so much so that his seeing this film led to a collaboration between the French New Waver and the ultra-rightist who would call French a "failed language" (1962's Love at Twenty).

Make of that what you will, too.

I know I do.

Tuesday, May 28, 2019

George Orwell's The Road to Wigan Pier and Obesity

Considering the unending debate on obesity my thoughts go back to what George Orwell had to say of the dietary habits of the poor in his journalistic classic, The Road to Wigan Pier. As he remarked, rather than fruits and vegetables, whole grains, lean meat, the "basis of their diet . . . is white bread and margarine, corned beef, sugared tea, and potatoes," which he admits to finding "appalling." However,
the peculiar evil is this, that the less money you have, the less inclined you feel to spend it on wholesome food. A millionaire may enjoy breakfasting off orange juice and Ryvita biscuits; an unemployed man doesn't.
Why is that? Simply put,
When you are unemployed, which is to say when you are underfed, harassed, bored, and miserable, you don't want to eat dull wholesome food. You want something a little bit "tasty."
And then as now there was plenty of the cheap but tempting available--white bread, tinned beef, sugared tea, and the rest--because, after all, "[u]nemployment is an endless misery that has got to be constantly palliated," and such food does a rather better job of that than what the conventional wisdom recommends as nutrition.

That the observation is so little known, or appreciated, attests to how much more cited than really read Orwell, like all great writers, happens to be--and to the utter alienness of what he really thought from the mainstream today, whose perspective is, as much as ever, that of complacent "millionaires." Considering the problem of obesity, the mainstream reduces the whole matter to personal, consumption-level, "lifestyle" choices, without the slightest respect for context. It has no interest in the class angle, no knowledge, curiosity, or sympathy in regard to the hardships faced by those less well-off. The factors that incline the poor to cheap temptation do not enter into its calculations; the only thought is of taking away the palliatives. Thus the crackdowns on foods with sodium and transfat and on large sodas, while poverty, economic insecurity, and the rest are not even seen as relevant.

I will not go so far as to say that matters like advertising or portion sizes are irrelevant, or that action on them has been completely unhelpful. Business does, after all, take advantage of the consumer, in this way as in others, through advertising, and by foisting larger portion sizes on the consumer. (When you buy a 20 ounce soda bottle because none of the machines in your vicinity are offering 12 ounce cans, that has consequences that add up.) Equally there may be factors that are even less a matter of personal choice than the response of the deprived and stressed to temptations--not least, the effect of industrial chemicals or viruses on our bodies (for which a strong case has been made, sadly to little effect thus far). Yet the failure is significant in itself, telling about the larger problem, and a significant reason why obesity has been yet another entry in that long, long list of problems--like the wreck of the natural environment, the health care system, the financialization of the economy--that are eternally talked about, and acted on ineffectively or not at all, such that it has just got worse and worse, and which can be expected to go on doing so barring a change in the way society thinks and acts about these matters.

Saturday, April 27, 2019

On John Rhys-Davies' Tantrum

Recently I chanced upon John Rhys-Davies' nasty behavior toward Green Party MP Caroline Lucas on the BBC's Question Time--in particular, the bit where he responded to her questioning of Donald Trump's legitimacy by reference to his having lost the popular vote by a wide margin with "Oh woman! Have you not read Kenneth Arrow and the Arrow's theorem? Any system has its problems."

As far as I can tell the criticism of his behavior has generally focused on his rudeness and sexism. I do not dispute any of that, but am in this post more interested in the manner in which he argued, or appeared to argue. In not only citing Kenneth Arrow's "impossibility theorem," but presuming that his work was common knowledge which he could expect anyone he meets to know (it is not), he implied (unsubtly) that he is not only vastly more knowledgeable, but altogether on a higher intellectual plane than his opponent.

However, he also gave away his superficiality. Arrow's theorem is a piece of neoclassical economics, after all--the same neoclassical economics that for the last century and a half has been an exercise in physics envy, Scholasticism, and the dressing up of simple-minded but ideologically convenient (read: extreme right-wing) ideas in mathematical symbols and language (hence, "theorem") to lend them the illusion of having all the force and immutability of physical law (and, of course, to intimidate the callow). Its value has been virtually nil where understanding actual economic life (you know, an economist's job) is concerned--and naturally the approach in general, and Arrow's theorem in particular (vulnerable to every one of the criticisms listed here), are, at the least, rather more open to question than Rhys-Davies implied (that reading it one could not but accept the matter as settled now and forever). It certainly is not a mike-dropping answer to a specific criticism of the outcome of a specific election.

The best that one could say for Rhys-Davies was that he answered Lucas' criticism with a platitude ("You can't have everything!" "You can't please everyone!") passed off as not being a platitude by its being dressed up with a famous name and the aforementioned mathematical language. The ungenerous would suggest that Rhys-Davies' answer was the non sequitur of a bully who substitutes name-dropping for genuine thought and argument--and that brings me to something I have had much occasion to think about recently: how Internet trolls behave.

Bluntly put, how Rhys-Davies spoke is not how intelligent people speak, but how stupid people imagine intelligent people speak.

Why People With High IQs Don't Go Around Bragging About Their High IQs

I have been struck time and again by the number of trolls who, in their profiles and handles, claim to be intellectually superior to anyone they are likely to meet. One claimed in his profile to have a "top IQ," and insisted that "you won't find anyone who knows more about life than me." Another, less subtle, incorporated the word "genius" into her handle.

Anyone who has spent time among actual "high IQ" people would recognize these people as fakes instantly, simply for their having done so: people who actually have high IQs do not go around telling everyone they have a high IQ.

I think one reason is that they know how little IQ really means. By that I do not necessarily mean that they think the test is meaningless (though they are likely to realize that a test administered on a single occasion, attempting a general estimate of the extremely complex phenomenon that is intelligence, is apt to offer a result more precise than accurate), but that they know how little it counts for in real life. I recall, for example, that when his reported IQ of 170 came up in an interview, Herbert Stempel remarked that it "and two dollars can get you on the subway."

It is a reminder that in real life, however much the addicts of the propaganda for meritocracy insist upon it, society is not ruled by the smartest, nor inclined to lavish its rewards upon them. Picking your parents well and playing the dirty game of getting ahead stand one in far, far better stead there.

They know how little it means even where purely intellectual life is concerned. I doubt a high IQ has completely saved anyone from the painful experiences of working harder than they should for longer than they should, making mistakes, getting frustrated, and outright failing. (Often high intelligence is an obstacle to getting good grades in a school system more concerned that students take direction well than think for themselves, and that they score well on standardized tests rather than acquire deeper capability.)

Even the highest intelligence ever possessed by a human being is not automatic, effortless omnicompetence, a free pass to a life of Faustian adventure in which one gets to do and to be everything. Quite the contrary: where the stupid have the Dunning-Kruger effect to insulate their egos, the intelligent have a fairly good idea of the limits of their competence, even where they are competent--the narrow specialist all too alert to how little they know about their own specialty, and how much less they know about everything else, the bar set that much higher for them. In fact, that capacity for self-criticism, without which they would have accomplished little, can leave them underestimating themselves.

Because they have worked, they have known burn-out (and the deflation of confidence that goes with it). Because they have been recognized as something, they have wondered if they have not been impostors (the more so in the wake of those inevitable frustrations and failures). Because people have seen potential in them, they have wondered if they have lived up to it. (You have a high IQ, you say? Well, what did you do with it?)

This is also the more poignant as they are likely to have spent much of their lives around other, similarly intelligent individuals, whose failings and insecurities are less visible to them than their own, and who may therefore seem the more formidable. They know there is always someone "better" out there. And they are smart enough to know that no one likes a braggart--not least because they, like everyone else, have had to endure someone else's bragging.

Indeed, in that particular milieu most closely identified with conspicuous intellectual demands and achievement, they are, far from being the super-individualists so beloved of bad science fiction, all too aware that they "stand on the shoulders of giants," and manage even that only with the help of others. (Never mind winning Nobel Prizes--publishing a run-of-the-mill paper is likely to be a highly collaborative effort.) Robert Merton put it well--the more so for how provocative the term sounds to conventional ears--when he named "communism" a cornerstone of the ethos of the scientific enterprise.

Does this mean that the intelligent are incapable of arrogance, whether in taking an exaggerated view of their powers or in possessing a sense of superiority to others? Of course not. Indeed, one can picture all the sources of insecurity described here driving them to overcompensate. But at the least it makes the IQ test score-flogging crudity of bad fiction, and of much inane real-life behavior by the fakers, a rarity among "the real thing."

Sunday, April 14, 2019

Review: The Chimera Vector, by Nathan Farrugia

Nathan Farrugia's The Chimera Vector being the sort of thriller where plot twists and "big reveals" are critical to the entertainment, it is difficult to discuss in spoiler-free fashion. Without giving away much, though, I can say that the essentials are familiar--a covert operative betrayed and forced on the run to survive, with the result that they trot the globe with the enemy close behind, discovering behind the mysteries of their own past, and the dangers they face in the present, a conspiracy of global proportions.

However, Farrugia's handling of the material makes it feel fresh, helped by his clean, crisp prose--a far cry from, for example, Robert Ludlum's often bombastic verbosity. It seems significant, too, that while action-packed for the most part, Farrugia eschews the over-the-top, physics-defying set pieces exemplified by Matthew Reilly in favor of relatively brief, relatively grounded fight and chase scenes that fit in well with his narrative style. (For all the gunplay, I didn't catch myself pausing and backtracking to work out what was going on at that level--which was especially handy given that the edition supplied for review was an e-book rather than a print work.) It helps, too, that there is more than the usual canniness and substance in the thinking about the reasons behind it all. (Indeed, a good many readers will come away from this book feeling that behind the science fiction trappings, the story is fairly close to the truth about the times in which we live.)

High-Speed Pulp Writing: How Did People Do It?

Those who delve into the history of the more commercial kinds of fiction--the old pulps and paperbacks and so on--encounter what seem like extraordinary feats of sheer output on the part of writers, authors cranking out whole novels in days (Lester Dent, for example, being particularly famous for this approach).

The prospect of being able to do the same is, for many, enormously appealing. After all, who hasn't wanted to finish a piece of writing more quickly? Who doesn't dream of being more productive?

Just how did they do it? The answer is that there were several enabling factors, not least in regard to the sort of books they were writing:

1. They were shooting for short works--50,000 words or so (see the bit of arithmetic just after this list).
2. Those short books tended to be formulaic--the writer simply coming up with different factors to plug in rather than thinking up a narrative arc from scratch.
3. These short, formulaic books were also simple books, with relatively straightforward structures and straightforward prose, and minimalist in such respects as detail. This meant that they didn't have to do so much fleshing out of their scenes, which was another time-saver.
4. There was a certain tolerance for roughness and repetitiveness in the product--like a good deal of serviceable but awkward phrasing.
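
Purely as an illustration of the scale involved (the 1,200 words an hour is my assumption for the sake of the example, not a documented pace): a writer sustaining that rate needs only

$$\frac{50{,}000 \text{ words}}{1{,}200 \text{ words/hour}} \approx 42 \text{ hours}$$

of actual typing for such a book--a few of the marathon sessions described below, rather than the months a longer, more demanding work requires.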

There was, too, the manner in which they worked.

1. They were able to work without interruption--working through the night and into the next day being a common feature of the approach.
2. They had someone ready to take the work off their hands when they were done, so that they could move on to the next thing, rather than being stuck poring over what they'd finished.
3. They had the opportunity to do this again and again and again, putting such work out as much as they were physically capable, building up experience at this kind of endeavor.

The combination of the kind of project, with its particular demands (and the things it didn't demand), and the opportunity to work without interruption and then let go of a project to start a new one of a similar type over and over and over again, is a very far cry from how today's aspiring novelist is likely to work. They are expected to shoot for works of 100,000 words or more--books which are not just longer but more complex. While producing this longer, more complex book they are apt to be holding down a day job, which not only commands a large and perhaps critical part of their time and energy, but chops the rest to pieces, with fatal consequences for any sort of intuitive flow. And they are, of course, very unlikely to have a publisher waiting to take a work off their hands as soon as they are done. (Especially if they are unpublished and unagented, they might have it on their hands for years, and maybe forever.) If a writer feels that they must strive for originality, get the facts right, polish their prose; if they have been subjected to lots and lots of form rejection letters that wound enormously while teaching nothing, leaving them insecure and self-conscious; then the approach of a Dent is that much further outside their reach.

Those of us who would find their seemingly unselfconscious, spontaneous, intuitive approach a relief, or even liberating, can only breathe a sigh; but it may be some comfort to consider the reality that in today's market we would have a very tough time selling the result.

The '80s Never Ended

Recently reading David Sirota's Back to Our Future I thought again that the 1980s do define our own time--though the explanation I find satisfying, I dare say, runs somewhat deeper than his: specifically, that it was the time when the right-wing cultural counter-offensive underway since the 1940s (an adjunct to the more central political-economic counter-offensive we think of as neoliberalism) finally dominated the mainstream.

Why then and not earlier? Simply put, as the stimulus of post-war reconstruction and automobile culture ran its course; as the war economy ground down the economic machinery sustaining it (the Cold War killed the gold standard and the Great Society together); as Europe and Japan's assimilation of American-style mass manufacturing made the world market very crowded; and as the pace of technological innovation and its associated productivity increases slowed while the old resource profligacy proved unsustainable--Big Business already felt its profits squeezed in the 1960s. In the 1970s the post-war boom turned altogether to post-war bust, worsening matters, while inflation may have made manufacturers uneasy but was positively toxic for finance. Meanwhile the leftward shift of society was felt as intensely threatening, even in the core Western industrialized nations. (Remember, for instance, the hysteria of British right-wingers about some imminent union/leftist takeover of the country, which had them cooking up coup plots against Harold Wilson?)

At the same time, amid danger there was also opportunity, the decay of the post-war consensus making going on the offensive seem not only more urgent but like it had a chance of success--as by getting the public to think the trouble was that things had gone too far left, rather than not left enough. Meanwhile, both sides of the countercultural moment provided material to work with--not just the much-ballyhooed backlash against the counterculture (which, in a demonstration of the validity not of the horseshoe theory by which they set so much store but of the fishhook theory they have marginalized, shifted ever-shaky "liberal" centrists rightward into its ranks in practice and even in name), but the counterculture itself (certainly as it appeared in the U.S.). That counterculture being more emotional than intellectual in its foundations, strongly inclined to individualism, and never very strong on the issues of economics and class, it was possible to exploit its impulses on behalf of not just consumerism (as Thomas Frank has made all too clear), but the full-blown neoliberal economic agenda--turning the hippies into yippies and just plain yuppies.

What we have lived through in the last four decades has been the epilogue. But, perhaps, also the prologue to a new story . . .
