Thursday, June 6, 2019

The Action Film's Transitional Years: Recalling the 1990s

Once upon a time the cop and, especially in the '80s, the commando were the characteristic figures of the action movie, a relatively grounded, often quite gritty, mid-budget and typically R-rated affair stressing gunplay and car chases. Today the genre seems nearly synonymous with super-powered superheroes at the center of megabudgeted, Computer Generated Imagery-fueled science fiction extravaganzas, more family-friendly than not, but so big and so fast that, looking right at them, one can scarcely tell what is happening on the screen much of the time. I would argue that where this transition was concerned, the 1990s was the key period.

A Generation of Action Films: From the 1960s to the 1980s
In the 1960s the first half dozen or so Bond films largely established the essentials of the genre, from their filmic structure (specifically, the packing in of more thrills, substantially through the use of multiple set pieces, if necessary at the expense of narrative logic), to the range of essential material (the variety of fight and chase scenes, from frogmen-with-knives to ninjas to pursuits on skis), to the associated cinematographic and editing techniques (the use of close shots, short takes, jump cuts and exaggerated sound effects in portraying the fight scenes), as well as developing much of the required technology (techniques for underwater and aerial photography), and probing the limits of the intensification or scaling up of the pattern. In pace and spectacle You Only Live Twice represented a maximum that later Bond films only occasionally and marginally exceeded for decades afterward, while only one Bond film before the 1990s cost more to make.

Still, it must be admitted that Hollywood was slow to replicate the formula. It certainly cashed in on the Bond films' success with a rush of spy-themed movies, but generally went the cheaper, easier route of parody, while showing little alertness to what might be called the "poetics of the action movie"—saving its resources for splashy musicals, while the younger, fresher talents capable of walking a new path generally seem to have been more interested in edgier material. Hollywood did produce contemporary-set films with action in them, but a good look at the crime films and disaster films of the '60s and '70s shows that they were just regular crime dramas that might have a big car chase in the middle (Bullitt, The French Connection, Dirty Harry), or variations on Grand Hotel in which things periodically blew up or crashed or burned down (Airport, The Poseidon Adventure, The Towering Inferno).

As the years progressed Hollywood continued its movement in this direction (Magnum Force and still more The Enforcer are rather more action-packed than the original Dirty Harry), but American filmmaking only really assimilated the pattern, let alone added anything to it, with Star Wars (1977). George Lucas' blend of action movie mechanics with space operatic imagery and revolutionary special effects (principally, the use of a computer-controlled camera that permitted a greater precision and replicability in effects shots, making the process cheaper, and allowing the production of more complex, denser, more elaborate shots of the type) was, where this kind of filmmaking was concerned, a revolution for Hollywood, and the world.

Once again Hollywood was not particularly quick to learn the lesson. Just as in the heyday of the Bond films it imitated the spy theme rather than the action movie mechanics, so it imitated the space theme of Star Wars rather than what the movie did with it—as a viewing of the first two great follow-ups to Star Wars' success, Alien and Star Trek: The Motion Picture, makes clear. Still, more action movies did follow, not least by way of Lucas, who scripted and produced the Steven Spielberg-helmed Raiders of the Lost Ark (1981), initiating a wave of H. Rider Haggardesque action-adventure, while encouraging the trend toward faster pacing, more gunplay and the rest—the second big-screen Star Trek film reflecting the logic in opting for a battle between Kirk and his most storied enemy, Khan (1982).

The same year also saw John Rambo make his way to the screen in a battle with the authorities that escalated into a one-man war on the National Guard, First Blood. The finale had him cutting loose on the town that mistreated him with a machine gun, but it was really James Cameron's mix of science fiction and paramilitary action, The Terminator (1984), that established the pattern for films in which the central figure, rather than shooting down opponents with a sidearm, takes them out in greater numbers with a machine gun—a pattern prevailing in the subsequent '80s action classics Commando (1985); that same year's Rambo: First Blood, Part II (1985); Cameron's sequel to what had previously been a haunted house horror movie in space, Alien, the ingeniously titled Aliens (1986); and Predator (1987), which actually begins with Arnold Schwarzenegger participating in a conventional enough commando raid in Latin America, such that one could easily mistake it for a sequel to Commando. Heavier firepower of the kind also became increasingly characteristic of the similarly prolific cop movies, like Lethal Weapon (1987), Robocop (1987) and Beverly Hills Cop II (1987).

Of course, the tendency had already hit something of a natural limit, and even self-parody, in Commando, where at the climax the protagonist mows down line after line of oncoming attackers just by facing in their direction as the machine gun he has on his hip blazes away. So did it also go for the genre more generally, a fact harder to ignore as the decade continued.* Over-the-top as the second Rambo film was, the gargantuan Rambo III (1988), at the time the most expensive production in movie history, tried to top it, and simply looked foolish, while the finale of Stallone's next action film, 1989's gadget, assault vehicle and explosion-filled Tango & Cash, reinforced the impression. It was the case, too, that the angst to which the paramilitary action genre spoke increasingly felt like yesterday's issue. (Rambo was a spectacle of post-Vietnam anguish—fifteen years after the troops came home.) The box office grosses made this harder to ignore as two of the genre's most iconic franchises flopped, the fifth Dirty Harry film, The Dead Pool, virtually ignored by moviegoers, and Rambo III making a mere third of what the second Rambo movie had in North America.

The Action Movie is Dead. Long Live the Action Movie!
One flop, or two, has never been enough to make Hollywood shift gears all by itself, the more so as at any given moment the movies to hit theaters two years on have probably already been greenlit—and while entertainment journalists, displaying their usual lack of historical memory for the business they write about and their tendency to read an epoch-defining trend into any little up or down, soon enough made noises about the end of the action film, this was not to be. Indeed, Hollywood continued to make '80s-style Stallone and Schwarzenegger films, with the two stars scoring a number of further successes (Total Recall, Terminator 2, Cliffhanger, True Lies, Eraser), while milking those franchises that still seemed viable—like Die Hard (reasonably successful in the same summer in which Rambo III flopped), and Lethal Weapon (Lethal Weapon 2 having done very well in 1989, and leading to two more successful sequels in 1992 and 1998).

However, the fact that the action movie actually became more rather than less prominent at the box office during the 1990s was due to Hollywood's willingness to try other ideas. Not by any means did they all pan out. The decade saw quite a few video game-based action films, for example, like Super Mario Bros. (1993), Double Dragon (1994) and Street Fighter (1994). (In these years only the relatively low-budget, August dump month release Mortal Kombat (1995) was a big enough hit to warrant a sequel, and only one (1997).) There was, too, quite an effort to produce female-centered action films, with an American remake of Luc Besson's La Femme Nikita, Point of No Return (1993); and Geena Davis' Cutthroat Island (1995) and The Long Kiss Goodnight (1996). Things turned up a bit for both endeavors with Charlie's Angels (2000) and Lara Croft: Tomb Raider (2001), but only to a slight degree. Instead the real successes derived from other approaches.

Reuse the Die Hard Formula . . . Endlessly
In the course of squeezing those extant franchises, there was the repetition of their ideas. After the second Die Hard film the actual Die Hard franchise generally moved away from the pattern of the original—a situation in which "terrorists" or somesuch seize a building or other structure and take everyone in it hostage for the sake of executing some scheme; miss an individual who remains loose and, despite their isolation, perhaps because they have loved ones at stake, undertakes a one-man guerrilla war against the bad guys with the clock ticking, the authorities outside trying to help and often making a botch of it, and likely some final twist awaiting us which will show the villains' plan was not quite what we thought it was at the beginning. However, no one else moved away from it, the imitators numerous and often shameless, as the innumerable lists of them on the web show.

Steven Seagal's career high film, Under Siege (1992), was "Die Hard on a battleship." Speed (1994) was "Die Hard on a bus." The Rock (1996) was "Die Hard on Alcatraz." Air Force One was . . . "Die Hard on Air Force One," the same summer that The Rock star Nicolas Cage wound up in a similar situation on a plane full of convicts, Con Air (1997). We even got Die Hard out on the "final frontier" in the eighth Star Trek film, Star Trek: First Contact (1996), where the Borg took over the Enterprise, and Jean-Luc Picard, initially single-handed, had to recover the ship and save the day and the universe from their control.

And of course, alongside these big-budgeted hits there were a good many less celebrated efforts, like those other "Die Hard on a plane" films, Passenger 57 (1992) and Executive Decision (1996); the "Die Hard in a school" movies Toy Soldiers (1991) and Masterminds (1997) (where Patrick Stewart was the villain rather than the day-saving hero this time); "Die Hard in a stadium," Sudden Death (1995); the less successful sequels to Under Siege and Speed, Under Siege 2: Dark Territory (1995) (train), and Speed 2: Cruise Control (1997); and an abundance of straight-to-video stuff, like the Anna Nicole Smith vehicle Skyscraper (1996), which stuck with the skyscraper idea of the original because, why not, and anyway, no one was really watching that for the plot, were they? Meanwhile, one can see more modest influences of the '88 classic in more mobile action films like Cliffhanger (1993) and Broken Arrow (1996), or even the family comedy Home Alone (1990).

When American Movies Aren't Enough, Look Abroad
Besides imitating Die Hard, there was a measure of borrowing from abroad, some more successful than others, with France an obvious case. Hollywood's remake of Besson's Nikita was a critical and commercial disappointment, and the reception to Besson's own English-language film The Professional (1994) was cold, but the biggest action hit of 1994, True Lies, was a remake of Claude Zidi's La Totale! (1991).

Moreover, the borrowing from France was minor next to that from Hong Kong. Hardly a new idea for action filmmakers (Tango & Cash "borrowed" its opening from Jackie Chan's Police Story), this intensified drastically, with recent Hong Kong hits getting the kind of highly publicized wide releases virtually unheard of for foreign productions, starting with the Jackie Chan vehicle Rumble in the Bronx in early 1996. More significantly, Hollywood also copied the fight choreography of the films, particularly the tight, rapid-fire martial arts style made famous by Jackie Chan, the "heroic bloodshed" gunplay of director John Woo's films, and the "wire fu" of directors like Yuen Woo-ping, to such a degree that these would soon become standard. Hollywood also brought many of the figures themselves into its own movies, with Woo making American films like Hard Target (1993), Broken Arrow and Face/Off (1997), and Jet Li, Michelle Yeoh and Tsui Hark not long in following them. (Sammo Hung even got a prime time Big Three network show for two seasons, Martial Law (1998-2000) on CBS.)

Often the tropes, and the figures, were used to inject new life into tiring or idea-starved franchises. The Bond series, serving up yet another, and particularly nonsensical, do-over of The Spy Who Loved Me (itself a do-over of You Only Live Twice) in Tomorrow Never Dies (1997), sought a measure of novelty in having Bond work with a Chinese agent this time, in an adventure set in Vietnam, with Hong Kong star Yeoh cast as said agent and getting a martial arts sequence in which to do her thing. The next year the fourth installment of the Lethal Weapon series (1998) featured Jet Li in a supporting role hyped far out of proportion to his actual screen time, while the sequel to the 1996 film adaptation of the old Mission: Impossible TV series (2000) had director John Woo bringing his trademark "heroic bloodshed" style into the franchise, epitomized by the climactic fight scene in which protagonist Ethan Hunt and his adversary raced at each other on their motorcycles, then leaped off them to fight in mid-air.

More subtly, the pairing of Jackie Chan with Chris Tucker permitted Hollywood a significant success with the long-flagging buddy-cop action-comedy genre, Rush Hour (1998), and a more minor one with the even less vibrant Western, Shanghai Noon (2000). More significant as a pop cultural moment than any one of these films, however, was The Matrix (1999), whose more than usually entertaining use of what Hong Kong martial arts films had to offer was part of the package that made it one of the decade's biggest pop cultural phenomena.

When You Run Out of Movies to Copy, Turn to TV
The 1990s, in general, was a decade of nostalgia for earlier periods, with Hollywood in particular seizing on a good many old TV shows as inspiration for big screen films—like The Addams Family, The Flintstones and The Brady Bunch, each of which was followed up by a sequel. Where the action genre was concerned the possibilities were more limited than with comedy, given television's constraints on this kind of spectacle, especially in the medium's earlier days. Still, there were some efforts, the two great successes among which were, of course, The Fugitive (1993) and Mission: Impossible (1996). Notable, too, were the space opera Lost in Space (1998), and the adaptation of the James Bond-meets-Western series, Wild Wild West (1999).

By and large the "adaptations" were superficial, relying more on nostalgic evocation of brand names than anything else where the source material was concerned, and deriving much of their interest from elsewhere, even when they were being thoroughly derivative. Mission: Impossible director Brian De Palma, whose previous transformation of an old TV show into a feature film had borrowed its most famous scene from a classic film (The Untouchables, which worked in the baby carriage rolling down the steps from Sergei Eisenstein's The Battleship Potemkin), likewise turned to another classic movie for MI's most memorable sequence, the hanging-by-a-rope break-in scene from Topkapi (1964). (Much imitated since, the scene is one everyone seems to think Mission: Impossible did first, testifying to the brevity of most cinemagoers' memories.)

Unsurprisingly, there was not much of a basis for a major franchise in these. The Fugitive, whose plot about the innocent Dr. Richard Kimble's attempt to survive and clear his name was scarcely repeatable, was nonetheless a big enough hit that the producers endeavored to continue the series on the basis of Tommy Lee Jones' well-received turn as Sam Gerard in U.S. Marshals (1998), and did not succeed, the series ending there, while there was insufficient enthusiasm to attempt even that with Lost in Space or Wild Wild West. However, the Mission: Impossible series has, with the aid of new borrowings, soldiered on into the present (numbers seven and eight being shot back-to-back at last report).

More Military Hardware
As the cop-and-commando films of the '80s showed, there was a tendency to incorporate more military hardware into the adventure to up the level of action, while this heyday of the military techno-thriller saw a still broader fascination with high-tech weaponry. Where the military bildungsroman had been a reliable basis for box office success with films like Private Benjamin (1980), Stripes (1981), and An Officer and a Gentleman (1982), mixing F-14 Tomcats in aerial showdowns with foreign enemies into such a story made Top Gun the biggest hit of the year in 1986, while that same year's Iron Eagle (where Louis Gossett Jr. again played the mentor) enjoyed a more modest success.

This did not immediately lead to a slew of military-themed action movies. However, the successful adaptation of Tom Clancy's submarine thriller The Hunt for Red October (1990) for the big screen changed that, not only making a multi-movie franchise out of the adventures of its protagonist Jack Ryan (followed up in 1992 with Patriot Games and in 1994 with Clear and Present Danger), but also leading to such hits as Under Siege, Crimson Tide, Broken Arrow, Executive Decision and Air Force One. (As noted earlier, many of them were variants on the Die Hard formula, and it does seem worth noting that the confined spaces and destructive potential of warships and military aircraft made convenient settings for such plots, while the fact that such scenarios were smaller in scale may have made them more approachable for audiences.) At the same time there was more hardware on display in movies not built around this theme, with True Lies, the biggest action hit of its year, incorporating a Harrier jet fighter into the climax, while the Bond films of the period, particularly GoldenEye (1995) and Tomorrow Never Dies, made especially lavish use of combat aircraft, warships and the like in their action scenes.

Intensified Editing
Besides recycling classic movie stunts so old that to many they seemed new, adding in Hong Kong-style martial arts and gunplay, and continuing to up the military-grade firepower, Hollywood action films worked to squeeze more effect out of such spectacle as they served up, not least through a more intensive use of editing. That approach, of course, was not wholly without precedent, in some cases precedent going far back into film history. Montage pioneer Sergei Eisenstein's penchant for short shot lengths is strikingly contemporary, and more action-oriented film has always tended to be more tightly edited than other kinds (John Ford or Howard Hawks, for instance, tending to shorter takes than Billy Wilder). Later the advent of television and the television commercial provided an arena for ultra-tight editing, facilitated by the shortness of the product, which in the 1960s became a major influence on cinema, by way of those formative James Bond films (so much so that they may be said to have helped create the action movie), and more widely evident in moviemaking such as the work of Richard Lester on films like A Hard Day's Night (1964). From the late 1970s Don Simpson's famous and infamous promotion of "high concept"-style filmmaking, which turned a movie into a two-hour commercial or music video (an approach encouraged by his enlistment of makers of commercials and music videos as feature film directors, like Ridley and Tony Scott, and Adrian Lyne), intensified this yet again as it became routine across the industry.

Still, technological change made this easier and more commonplace, notably the commercial availability of digital editing technology from 1989 on, which contributed to a sharp drop in Average Shot Length (ASL). Indeed, where feature film was concerned the possibilities were rather rapidly exploited, most notoriously by Michael Bay, yet another director of commercials and music videos who made a major splash in feature film with Bad Boys (1995), The Rock (1996) and Armageddon (1998), though others were more aggressive still, not least Paul W.S. Anderson. It was the latter who, in short order, took the trend to what appears a natural limit in Resident Evil: Apocalypse (2004) (with an ASL of a mere 1.7 seconds), virtually no movie pushing the envelope further in the fifteen years since.

More Science Fiction. Much, Much More Science Fiction—and Much, Much Less of Some Other Things . . .
As the late '80s already demonstrated, the desire of filmmakers and audiences alike for bigger and better action movie spectacle was exceeding the limits of the frameworks provided by the old cop-and-commando stories, even with the additions discussed above. And so there was a turn to the fantastic—to displays of more-than-human ability and outright superheroics (Batman), to exotic creatures (Jurassic Park), to the depiction of disaster (Twister), the cast-of-thousands epic (Braveheart), and the space opera that brought all these together (Independence Day, and above all Star Wars: Episode I—The Phantom Menace).

None of these ideas was new, of course. Indeed, there was a certain amount of nostalgia evident here as well, whether for old-fashioned superheroes, B-movie monsters, Irwin Allen disaster films, historical epics or Star Wars and its imitators. Nonetheless, new technology—above all, Computer Generated Imagery (CGI)—permitted such visions to be realized with a new ease, polish and flair. It was this that made the dinosaurs in Jurassic Park what they were, this that created those casts of thousands in a movie like Braveheart—while at the end of the decade, in The Phantom Menace, it permitted George Lucas to conjure all-CGI worlds, such as was hardly dreamed of back when he made the original Star Wars. Detractors of that film and the other prequels were harshly critical of the approach, but there is no denying that it was something new, that at least some found pleasure in the results, and that it did change the way movies were made.

Going along with the shift toward science fiction, and science fiction films of these particular kinds at that, there were other, subtler shifts. One, reflecting the fact that science fiction's attraction was its affording room for bigger and more exotic action, was a rise in movie budgets. As late as the mid-1980s a summer blockbuster could be made for between $10 and $20 million, as was the case with Commando, or the science fiction-tinged but still thoroughly paramilitary Aliens and Robocop. By the late 1990s budgets on the order of $100 million (and often much more) were standard.

Another was the way the movies served up their thrills. Where paramilitary adventures emphasized bloody mayhem, these larger-budgeted, more briskly edited films were broadly staged effects extravaganzas, less apt to stress gore—or grit of any kind. Along with popular film's generally declining tendency toward "gratuitous" nudity (in contrast with the bits Commando or Die Hard served up), this meant less reason for an R rating, which in any case became a riskier bet when such large budgets were at stake. The result was that where any list of '80s action movies mostly consisted of movies advertised with the caution that "People under 17 years may only be admitted if accompanied by a parent or guardian," R-rated action movies generally slipped from the genre's top ranks.

To be sure, there was still a very great deal of the '80s action film in those machine gun-packing Die Hard retreads and later Schwarzenegger films and the rest. Yet, it seems worth noting that the science fiction films outperformed them with increasing consistency and by widening margins, even without a Lucas or a Spielberg at the helm. In 1992 Batman Returns was the year's biggest action hit, in 1993 Jurassic Park the highest-grossing film of the year outright, in 1995 the next Batman sequel, Batman Forever, the biggest action success. The trend became still more pronounced in 1996, with The Rock and Eraser doing well, but nowhere near so well as Twister and Independence Day. The following year Air Force One was a hit, but well behind Men in Black and the Jurassic Park sequel The Lost World. In 1998 Lethal Weapon 4 was a moneymaker, but one well short of Godzilla, Deep Impact and Armageddon, while in 1999 The Phantom Menace, The Matrix and The Mummy were the year's action hits, all safely placing among the top ten earners of the year. By contrast, the highest-ranked non-science fiction action film of that year was the latest entry in the perennial Bond series at #13, while Mel Gibson's Payback was down in the #26 position, and Schwarzenegger's End of Days (in spite of its supernatural and end-of-the-millennium theme) was all the way down at #33.

Unsurprisingly, the Die Hard retreads puttered to a halt sometime around 1997, and so did the techno-military action (Air Force One appears a last hurrah for both those genres as summer box office staples), while Stallone and then Schwarzenegger became steadily more marginal performers—all of this coming to look like rather B-movie stuff. Hong Kong flavoring remained very much part of any old-fashioned hand-to-hand combat that still took place, but was an element in the mix rather than the star of the show.

Meanwhile the colossally-budgeted newer-style science fiction epics were the titans fighting for the box office's commanding heights. In all that, The Matrix and its sequels seem to merit special attention. An R-rated, bullet-riddled mix of sci-fi and machine gun-packing paramilitary action that starts off in--where else?--L.A., it looked backward in that respect, while in its accent on superpowers it looked forward, and in being so Janus-faced, appears to have been very much of its moment.

* The parody of this kind of action film in Jim Abrahams' Hot Shots! Part Deux did not really have to do much more than imitate what Commando already did.

Looking Past the Hardcovers: Techno-Thrillers in Other Media

Writing about the military techno-thriller, I generally focused on print works.

This was for several reasons: because this is where the tradition, now over a century old, has been longest and strongest; because my training and experience were overwhelmingly in analyzing literary texts; and because I had spent a lot of time reading techno-thriller novels back in the 1990s and wanted to write about that, the more so as very little has been written about them.

Still, recently reading David Sirota's cultural history of the '80s, Back to Our Future, the strongest part of which was its discussion of the marketing of pop militarism in the '80s in response to the "Vietnam Syndrome," I was struck by how little the print techno-thriller came up. Tom Clancy was, for two years running, the author of the year's top-selling novel (1988 and 1989), and the decade's top-selling novelist, on the strength of The Cardinal of the Kremlin and Clear and Present Danger (and The Hunt for Red October, Red Storm Rising and Patriot Games too), but save for an explanatory note for a single brief, rather offhand reference to Jack Ryan, the name simply does not come up. (Still less is there any reference to, for instance, Stephen Coonts or Larry Bond or the rest.)

Instead the discussion focuses on a number of films and TV shows--in particular, Red Dawn and Top Gun and G.I. Joe--and video games. This does, to some extent, reflect the fact that where his recounting of the relevant political history relies on solid scholarship to produce a robustly comprehensive picture of that side of the period, there is an element of personal reminiscence in his discussion of the pop culture. And Sirota was, as Goldbergs viewers remember, a young person at the time, much more likely to rewatch Red Dawn on VHS or play Contra on Nintendo than plow through one of Clancy's doorstops. However, that is exactly the point. A lot more people, of all ages, watched movies and TV, and played video games, than read books, even the bestselling books of the day, and this was all the more relevant with his focus in that portion of his study on the molding of the outlook of the young.

This does not seem to me to make the print techno-thriller less important or interesting. But, content for the time being with what I have written about that, it did suggest to me where my research might go from here, in the direction of all those works with which I was not unfamiliar (his generation is more or less my generation, and I sure did play Contra and the rest well before I picked up a book by Clancy), but to which I had been less attentive academically.

In fact, I have found myself revisiting the body of writing relevant to this side of film, television, and especially gaming history (because a "big picture" seems most elusive here). So far as I can tell it is much like those bodies of work on anything else to do with pop culture.

On the one hand there is journalism, more interested (often too much so) in retelling the personal stories of "industry players" in that manner so thrilling to devotees of the cult of the tech entrepreneur than in offering comprehensive understandings of the subject matter, or much cultural or political insight; often downright sloppily researched and written; and frequently passing off material suitable for a decent article as a doorstop of a book (the easier for them to do because they have set their sights on telling a story, and telling it loosely).

On the other hand there is scholarship, tending in the other direction, toward the overly narrow rather than the overly broad, normally in line with the fashionable academic obsessions (with regard to subject matter and methodology) I find . . . well, less than illuminating, especially when undertaking a project like this.

Still, that suggests to me that there is more to be said about the matter--and I hope to say something of it when I get around to revisiting the subject.

Crazed Fruit: Some Thoughts

The screen adaptation of Shintaro Ishihara's second novel, Crazed Fruit, ran on TCM some time ago.

I understand the movie has been recognized with a place in the Criterion Collection.

I have not read the book. But if the film is at all representative of it, it seems that Ishihara, who has screenwriting credit, simply reshuffled the elements of his first book, Season of the Sun--snotty rich college kids who spend their time playing on the beach and with boats, going to nightclubs, trying to get laid and getting into fights; some "tough" talk about generation gaps and the worthlessness of ideas next to action that comes off as the pseudo-macho, pseudo-intellectual drivel of posers; a conflict between brothers over a girl with more than a little history and a proneness to manipulate and use men, with the younger brother the more susceptible and the older the more tough-minded; and, just so we know the author is serious, the culmination of all that in sudden, violent death. We also see plenty of certain unattractive elements more pronounced in his other writing, and out of which he made a political career--like a racism and xenophobia with which a certain amount of sexual baggage is nakedly mixed. (Yes, there is a "The foreigners are here to take away our women!" element in this particular mix.)

The result is a piece of sh--excuse me, melodrama, inadequate to support the movie's 86 minute running time.

An interesting note: in addition to furnishing the source material and the screenplay, Ishihara also appears in the movie. And not just appears. With the camera focused on his face he bullyingly declares "I'm Ishihara!"--just in case we didn't know. And this alter ego of his, also named Ishihara, happens to be part of a pack of snotty rich college kids having a run-in with the aforementioned pack of snotty rich college kids (in this case, also members of the Tokyo U wrestling team), and participates in beating up the film's star, Shintaro Ishihara's real-life younger brother, the "James Dean of Japan," Yujiro Ishihara.

So basically Ishihara, going far beyond a subtle Alfred Hitchcock-like cameo, announces on camera that he is indeed Ishihara, then physically attacks his younger but more celebrated sibling on camera.

Make of that what you will.

In light of, well, everything the man has done since.

Interestingly, it seems Francois Truffaut was impressed, so much so that his seeing this film led to a collaboration between the French New Waver and the ultra-rightist who would call French a "failed language" (1962's Love at Twenty).

Make of that what you will, too.

I know I do.

Tuesday, May 28, 2019

George Orwell's The Road to Wigan Pier and Obesity

Considering the unending debate on obesity my thoughts go back to what George Orwell had to say of the dietary habits of the poor in his journalistic classic, The Road to Wigan Pier. As he remarked, rather than fruits and vegetables, whole grains, lean meat, the "basis of their diet . . . is white bread and margarine, corned beef, sugared tea, and potatoes," which he admits to finding "appalling." However,
the peculiar evil is this, that the less money you have, the less inclined you feel to spend it on wholesome food. A millionaire may enjoy breakfasting off orange juice and Ryvita biscuits; an unemployed man doesn't.
Why is that? Simply put,
When you are unemployed, which is to say when you are underfed, harassed, bored, and miserable, you don't want to eat dull wholesome food. You want something a little bit "tasty."
And then as now there was plenty of the cheap but tempting available--white bread, tinned beef, sugared tea, and the rest--because, after all, "[u]nemployment is an endless misery that has got to be constantly palliated," and such food does a rather better job of that than what the conventional wisdom recommends as nutrition.

That the observation is so little known, or appreciated, attests to how much more cited than really read Orwell, like all great writers, happens to be--and to the utter alienness of what he really thought from the standpoint of the mainstream today, whose perspective is, as much as ever, that of complacent "millionaires." Considering the problem of obesity, the mainstream reduces the whole matter to personal, consumption, "lifestyle" choices, without the slightest respect for the matter of context. It has no interest in the class angle, no knowledge of, curiosity about, or sympathy for the hardships faced by those less well-off. The factors that incline the poor to cheap temptation do not enter into its calculations; the only thought is of taking away the palliatives. Thus it cracks down on foods with sodium and transfat and on large sodas, but poverty, economic insecurity, and the rest are not even seen as relevant.

I will not go so far as to say that matters like advertising or portion sizes are irrelevant, or that action on them has been completely unhelpful. Business does, after all, take advantage of the consumer, in this way as in others, through advertising, and by foisting larger portion sizes on the consumer. (When you buy a 20 ounce soda bottle because none of the machines in your vicinity offers a 12 ounce can, that has consequences that add up.) Equally there may be factors that are even less a matter of personal choice than the response of the deprived and stressed to temptations--not least, the effect of industrial chemicals or viruses on our bodies (for which a strong case has been made, sadly to little effect thus far). Yet the failure is significant in itself, telling about the larger problem, and a significant reason why obesity has been yet another entry in that long, long list of problems--like the wreck of the natural environment, the health care system, the financialization of the economy--that are eternally talked about and acted on ineffectively or not at all, such that the problem has just got worse and worse, and can be expected to go on doing so barring a change in the way society thinks and acts about these matters.

Saturday, April 27, 2019

On John Rhys-Davies' Tantrum

Recently I chanced upon John Rhys-Davies' nasty behavior toward Green Party MP Caroline Lucas on the BBC's Question Time. In particular the bit where he responded to her questioning of Donald Trump's legitimacy through reference to his having lost the popular vote by a wide margin with "Oh woman! Have you not read Kenneth Arrow and the Arrow's theorem? Any system has its problems."

As far as I can tell the criticism of his behavior has generally focused on his rudeness and sexism. I do not dispute any of that, but am in this post more interested in the manner in which he argued, or appeared to argue. In not only citing Kenneth Arrow's "impossibility theorem," but presuming that his work was common knowledge which he could expect anyone he meets to know (it is not), he implied (unsubtly) that he is not only vastly more knowledgeable, but altogether on a higher intellectual plane than his opponent.

However, he also gave away his superficiality. Arrow's theorem is a piece of neoclassical economics, after all--the same neoclassical economics that for the last century and a half has been an exercise in physics envy, Scholasticism, and the dressing up of simple-minded but ideologically convenient (read: extreme right-wing) ideas in mathematical symbols and language (hence, "theorem") to lend them the illusion of having all the force and immutability of physical law (and, of course, to intimidate the callow). Its value has been virtually nil where understanding actual economic life (you know, an economist's job) is concerned--and naturally the approach in general, and Arrow's theorem in particular (vulnerable to every one of the criticisms listed here), are at the least rather more open to question than Rhys-Davies implied (that reading it one could not but accept the matter as settled now and forever). It certainly is not a mike-dropping answer to a specific criticism of the outcome of a specific election.

The best that one could say for Rhys-Davies was that he answered Lucas' criticism with a platitude ("You can't have everything!" "You can't please everyone!") passed off as not being a platitude by its being dressed up with a famous name and the aforementioned mathematical language. The ungenerous would suggest that Rhys-Davies' answer was the non-sequitur of a bully who substitutes name-dropping for genuine thought and argument, and that brings me to something I have had much occasion to think about recently, how Internet trolls behave.

Bluntly put, how Rhys-Davies spoke is not how intelligent people speak but rather how stupid people imagine intelligent people speaking.

Why People With High IQs Don't Go Around Bragging About Their High IQs

I have been struck time and again by the number of trolls who claim to be intellectually superior to anyone they are likely to meet in their profiles and handles. One claimed in his profile to have a "top IQ," and insisted that "you won't find anyone who knows more about life than me." Another, less subtle, incorporated the word "genius" into her handle.

Anyone who has spent time among actual "high IQ" people would recognize these people as fakes instantly, simply for their having done so--as people who actually have high IQs do not go around telling people they have a high IQ.

I think one reason is that they know how little IQ really means. By that I do not necessarily mean that they think the test is meaningless (even if they are likely to realize that a test administered on a single occasion which attempts to offer a general estimate of the extremely complex phenomenon that is intelligence is apt to offer a result more precise than accurate), but that they know how little it counts for in real life. I recall, for example, that when his reported IQ of 170 came up in an interview, Herbert Stempel remarked that that "and two dollars can get you on the subway."

It is a reminder that in real life, however much addicts of the propaganda for meritocracy insist upon it, society is not ruled by the smartest, nor inclined to lavish its rewards upon them. Picking your parents well and playing the dirty game of getting ahead stand one in far, far better stead there.

They know how little it means even where purely intellectual life is concerned. I doubt a high IQ has completely saved anyone from the painful experiences of working harder than they should for longer than they should, making mistakes, getting frustrated, and outright failing. (Often high intelligence is an obstacle to getting good grades in a school system more concerned that students take direction well than that they think for themselves, and that they score well on standardized tests rather than acquire deeper capability.)

Even the highest intelligence ever possessed by a human being is not automatic, effortless omnicompetence, a free pass to a life of Faustian adventure in which one gets to do and to be everything. Quite the contrary: where the stupid have the Dunning-Kruger effect to insulate their egos, the highly intelligent have a fairly good idea of the limits of their competence, even where they are competent--even the narrow specialist is all too alert to how little they know about their specialty, and how much less they know about the rest, the bar set that much higher for them. In fact, that capacity for self-criticism, without which they would have accomplished little, can leave them underestimating themselves.

Because they have worked, they have known burn-out (and the deflation of confidence that goes with it). Because they have been recognized as something, they have wondered if they have not been impostors (the more so in the wake of those inevitable frustrations and failures). Because people have seen potential in them, they have wondered if they have lived up to it. (You have a high IQ, you say? Well, what did you do with it?)

This is also the more poignant as they are likely to have spent much of their life around other, similarly intelligent individuals, whose failings and insecurities are less visible to them than their own, and who may therefore seem the more formidable. They know there is always someone "better" out there. And they are smart enough to know that no one likes a braggart, not least because they, like everyone else, have had to endure someone else's bragging.

Indeed, in that particular milieu most closely identified with conspicuous intellectual demands and achievement, they are, far from the super-individualist so beloved of bad science fiction, all too aware that they "stand on the shoulders of giants," and manage that only with the help of others. (Never mind winning Nobel Prizes--publishing a run-of-the-mill paper is likely to be a highly collaborative effort.) Robert Merton put it well, the more so for how provocative the term sounds to the ears of the conventional--"communism" is a cornerstone of the ethos of the scientific enterprise.

Does this mean that the intelligent are incapable of arrogance, whether in taking an exaggerated view of their powers or in possessing a sense of superiority to others? Of course not. Indeed, one can picture all the sources of insecurity described here driving them to overcompensate. But at the least it makes the IQ test score-flogging crudity of bad fiction and much inane real-life personal behavior by the fakers a rarity among "the real thing."

Sunday, April 14, 2019

Review: The Chimera Vector, by Nathan Farrugia

Nathan Farrugia's The Chimera Vector being the sort of thriller where plot twists and "big reveals" are critical to the entertainment, it is difficult to discuss in spoiler-free fashion. Without giving away much, though, I can say that the essentials are familiar--a covert operative betrayed and forced on the run to survive, sent trotting about the globe with the enemy close behind them, and discovering behind the mysteries of their own past, and the dangers they face in the present, a conspiracy of global proportions.

However, Farrugia's handling of the material makes it feel fresh, helped by his clean, crisp prose--a far cry from, for example, Robert Ludlum's often bombastic verbosity. It seems significant, too, that while action-packed for the most part Farrugia eschews over-the-top, physics-defying set pieces (as exemplified by Matthew Reilly) in favor of relatively brief, relatively grounded fight and chase scenes that fit in well with his narrative style. (For all the gunplay, I didn't catch myself pausing and backtracking to work out what is going on at that level--which was especially handy given that the edition supplied for review was an e-book rather than a print work.) It helps, too, that there is more than the usual canniness and substance in the thinking about the reasons behind it all. (Indeed, a good many will come away from this book feeling that behind the science fiction trappings, the story is fairly close to the truth about the times in which we live.)

High-Speed Pulp Writing: How Did People Do It?

Those who delve into the history of the more commercial kinds of fiction--the old pulps and paperbacks and so on--encounter what seem like extraordinary feats on the part of writers from the standpoint of sheer volume of output, authors cranking out whole novels in days (a Lester Dent, for example, being particularly famous for this approach).

The prospect of being able to do the same is, for many, enormously appealing. After all, who hasn't wanted to finish a piece of writing more quickly? Who doesn't dream of being more productive?

Just how did they do it? The answer is that there were several enabling factors, not least in regard to the sort of books they were writing:

1. They were shooting for short works--50,000 words or so.
2. Those short books tended to be formulaic--the writer simply coming up with different factors to plug in rather than thinking up a narrative arc from scratch.
3. These short, formulaic books were also simple books, with relatively straightforward structures and straightforward prose, and minimalist in such respects as detail. This meant that they didn't have to do so much fleshing out of their scenes, which was another time-saver.
4. There was a certain tolerance for roughness and repetitiveness in the product--like a good deal of serviceable but awkward phrasing.

There was, too, the manner in which they worked.

1. They had to be able to work without interruption, with working through the night, and the next day, a common feature of the approach.
2. They had someone ready to take the work off their hands when they were done, so that they could move on to the next thing, rather than being stuck poring over what they'd finished.
3. They had the opportunity to do this again and again and again, putting such work out as much as they were physically capable, building up experience at this kind of endeavor.

The combination of the kind of project with its particular demands (and the things it didn't demand), their opportunity to work without interruption and then let go of a project to start a new one of a similar type over and over and over again, are a very far cry from how today's aspiring novelist is likely to work. They are expected to shoot for works of 100,000 words or more, books which are not just longer but more complex. Producing this longer, more complex book they are apt to hold down a day job, which by itself not just commands a large and perhaps critical part of their time and energy, but chops the rest to pieces, with fatal consequences for any sort of intuitive flow. And they are, of course, very unlikely to have a publisher waiting to take a work off their hands as soon as they are done. (Especially if they are unpublished and unagented, they might have it on their hands for years, and maybe forever.) If a writer feels that they must strive for originality, get the facts right, polish their prose; if they have been subjected to lots and lots of form rejection letters that wound enormously while teaching nothing, leaving them insecure and self-conscious; then the approach of a Dent is that much further outside their reach.

Those of us who would find their seemingly unselfconscious, spontaneous, intuitive approach a relief, or even liberating, can only breathe a sigh; but it may be some comfort to consider the reality that in today's market we would have a very tough time selling the result.

The '80s Never Ended

Recently reading David Sirota's Back to Our Future I thought again that the 1980s do define our own time, though the explanation I find satisfying, I dare say, runs somewhat deeper than his--specifically, that the decade was when the right-wing cultural counter-offensive underway since the 1940s (an adjunct to the more central political-economic counteroffensive we think of as neoliberalism) finally dominated the mainstream.

Why then and not earlier? Simply put, as the stimulus of post-war reconstruction and automobile culture ran their courses, as the war economy ground down the economic machinery sustaining it (the Cold War killed the gold standard and the Great Society together), as Europe and Japan's assimilating American-style mass-manufacturing made the world market very crowded; and the pace of technological innovation and its associated productivity increases slowed, while the old resource profligacy proved unsustainable; Big Business already felt its profits squeezed in the 1960s. In the 1970s post-war boom altogether turned to post-war bust, worsening matters, while inflation may have made manufacturers uneasy but was positively toxic for finance. Meanwhile the leftward shift of society was felt as intensely threatening, even in the core, Western, industrialized nations. (Remember, for instance, the hysteria of British right-wingers about some imminent union/leftist takeover of the country that had them cooking up coup plots against Harold Wilson?)

At the same time, amid danger there was also opportunity, the decay of the post-war consensus making their going on the offensive seem not only more urgent, but like it had a chance--as by getting the public to think the trouble was that things had gone too far left, rather than not left enough. Meanwhile, both sides of the counterculture provided material to work with--not just the much ballyhooed backlash against it, which, in a demonstration of the validity not of the horseshoe theory by which they set so much store but the fishhook theory they have marginalized, shifted ever-shaky "liberal" centrists right into its ranks in practice and even in name, but the counterculture itself (certainly, as it appeared in the U.S.). More emotional than intellectual in its foundations, strongly inclined to individualism, never very strong on the issues of economics and class, it was possible to exploit its impulses on behalf of not just consumerism (as Thomas Frank has made all too clear), but the full-blown neoliberal economic agenda--turning the hippies into yippies and just plain yuppies.

What we have lived through in the last four decades has been the epilogue. But, perhaps, also the prologue to a new story . . .

Why Was Hollywood So Slow to Capitalize on the Action Film?

The original, EON-produced James Bond movies of the early 1960s are generally regarded as the starting point for the modern action film--the movies that established the genre's characteristic filmic structure, its range and density of set pieces, and its cinematographic and editing techniques (and even the "blockbuster" mode of releasing such films). Yet Hollywood seems to have only properly begun to assimilate this practice in the mid-1970s, and it was apparently another decade before such films really became a staple product for the industry. In other words, a full generation passed between those foundational Bond films and the rise of the Hollywood action film. Just why was that?

One might imagine that such a question had already been asked and answered numerous times, but so far as I can tell, this is very far from being the case. One reason, I suppose, is that the sorts of serious, rigorous scholars of film who would do the hard work of tracing the development have little time for action-oriented blockbusters, whose rise many of them even seem to regard as an annoying aberration, or even a disaster for film. Another would seem to be the extent to which action films are defined by form rather than the kind of content to which they prefer to devote their analytical energies, in part because it is easier to talk about themes than it is the subtleties of cinematic technique--while the fact that the form is of a kind they do not respect does not encourage them in such efforts.

The question is a tough one, too, because it raises the question of why something was not done, a necessarily trickier matter than explaining why something was. Still, the well-known history of Hollywood, and the Bond films, offers a possible explanation.

When the early Bond films came along they were thrillers tinged with science fiction--at a time when thrillers and science fiction were both B-movie stuff, and B-movies looked like they were dying. It might be added, too, that spy films were a relative exoticism for American culture--a genre rather less well-established than, for example, hard-boiled crime. A survey of the reviews written about the films at the time also makes it clear that observers were not so sure about what to make of them--not so sure that they were looking at the rise of a new kind of film. Rather it seemed to them that they were looking at merely a parody of existing material--a parody of Hitchcock, a parody of science fiction, with such a view the more understandable because of the illogic and self-parody that was part of the warp and woof of those movies from the start.

None of this stopped Hollywood from imitating the Bond films. Yet, the imitation was for the most part superficial, and lightweight. The movies imitated the theme of the globe-trotting, gadget-packing superspy, for the most part parodically, rather than displaying a real grasp of the action movie mechanics, because the former was all that could be done with the slight understanding of or interest in those movies. The superficial, parodic route was also much less expensive. After all, in less than half a decade the Bond films upped their budgets tenfold, You Only Live Twice running a hefty near $10 million--and it would have cost Hollywood far more to make such a production at home. It is safe to assume that this was rather more than it was ready to risk in trying to compete head to head with the Bond brand by making a glorified B-movie with the kind of A-movie budget they were used to spending on other, very different kinds of film.* Finally, the reality is that in the mid-1960s the Bond films were already passing the peak of their popularity, and by the 1970s looking increasingly repetitive, derivative of other material and outright decadent, lessening the interest in the movies and what they may have had to offer (that still unmastered action movie form), turning attention from them toward other directions.

And even before that point there were two other trends that galvanized Hollywood's attention, one more natural given its history, the other more radical. The first was an extension of the turn to splashy musicals evident in the '50s, with the years of Bond's greatest successes in America overshadowed each year by a hugely successful film of the type. In 1964 Goldfinger was #2 at the American box office, and From Russia With Love #5, but My Fair Lady was #1. In 1965 Thunderball proved the series' biggest success ever, but wound up at only #3, overshadowed by, above all, the biggest box office phenomenon of the whole decade--The Sound of Music, which took in twice as much in America and globally. In the next several years Hollywood sank much of its money into trying to repeat that success without ever quite accomplishing it, Camelot doing all right, Oliver! better than all right, but 1969's Sweet Charity, Hello, Dolly! and Paint Your Wagon a trio of big-budget ($20 million) letdowns that largely ended the trend.

The other trend was, of course, the "New Hollywood." Its pioneering directors, less habituated to Hollywood tradition, had relatively little interest in the kind of movie the Bond films represented. Their sensibility, after all, was oriented toward the arthouse, toward youth and the counterculture, toward social criticism and urban grit, toward aesthetically and politically edgier fare, rather than Bond-style crowd pleasers--even if, eventually, it was a New Hollywood director who brought the action film to Hollywood: George Lucas, in Star Wars, for which the Bond films had been an influence.

At that point Hollywood repeated the pattern of superficial imitativeness with Star Wars, imitating the space theme more than grasping the mechanics of action moviemaking, such that the Alien and Star Trek franchises did not begin as action movie series, and only became such over the course of the '80s. The original Star Wars trilogy ran its course in 1983, and the larger fashion for space operatic movies waned with it (few such films appearing until the mid-'90s), but by then Hollywood had had the chance to pick up the pattern and routinize it, in a period when it might have been more open to doing so--because it was more prepared to learn from its own successes than those of others, because B-movies were becoming the new A-movies, and because "high concept" had become king; even if high concept was not yet closely identified with action-adventure, Lucas and his cohorts had given the studio heads the strongest foundation for high concept success of all.

* The 1967 Casino Royale wound up pricier than any Bond movie made to date, with a $12 million budget, but this was the result of a chaotic production process that led to cost overruns; and in any event, that film was very much an exception.

Darko Suvin and H.G. Wells

Darko Suvin's collection Metamorphoses of Science Fiction includes such classics of his criticism as "Estrangement and Cognition" (available at Strange Horizons).

It also includes a piece dealing with "Wells As the Turning Point of the SF Tradition." In it Suvin remarked that Wells was the "first significant writer who started to write SF from within the world of science, and not merely facing it" (220); whose scientific romances were the "privileged form in which SF was admitted into an official culture that rejected socialist utopianism" (220); and who "endowed later SF with a basically materialist look back at human life and a rebelliousness against its entropic closure" (221). "For such reasons," he declares, "all subsequent significant SF can be said to have sprung from Wells' Time Machine" (221).

Others are often credited with founding the genre, sometimes Mary Shelley, sometimes Jules Verne, but when one considers science fiction from the standpoint of what made it really unique within twentieth century fiction, from the standpoint of what might be considered its cultural niche or even its special purposes that it alone and not other kinds of fiction could perform, it seems to me inarguable that, as Suvin contends, Wells was by far the most important of these three.

Of Assholes and Bullshit

The last time I checked one still could not say the words "asshole" or "bullshit" on network television. Yet academics have been writing books about these very words. Aaron James recently published Assholes: A Theory. Even before that, Harry G. Frankfurt paved the way with On Bullshit.

This may seem like a matter of writers trying to sell books by grabbing attention with a few swear words where, in even this vulgarized era, they might be least expected. However, that strikes me as not just ungenerous but inaccurate. The plain and simple truth is that these words have no precise synonyms in the English language.

James contends in his book that an "asshole" is a person whose sense of entitlement translates to their immunity to others' criticisms, reflected, not least, in their systematically exploiting advantages they may have over others--an idea no other word in English captures equally well.

This is even more the case with "bullshit." Bullshit, as Frankfurt makes clear, is not a lie. A liar, after all, is attentive to the truth, enough to know it and for some reason wish to conceal it. For all their faults, they at least know what the truth is and in some way respond to reality. The bullshitter, by contrast, is indifferent to the truth or falsity of what they say. They simply do not care. It is a whole other level of bad faith--and again, no other word captures this equally well.

These terms have accordingly been used in serious social science analysis. James applied what he discussed in Assholes: A Theory to the analysis of a then-presidential candidate in Assholes: A Theory of Trump, while David Graeber successfully used the concept of "bullshit" to offer one of the most original and significant critiques of the neoliberal economy in decades, Bullshit Jobs.

All that being the case, will these "dirty" words gain greater acceptance within daily usage? Somehow, I doubt it will happen anytime soon. Still, they seem to me indispensable, and unless we come up with satisfactory equivalents usable in polite society, we will have no choice but to go on speaking of assholes, and bullshit--especially in an age that has made an idol of the asshole, and unavoidably seems in danger of being buried in bullshit as a result.

Saturday, April 13, 2019

The Politics of Trolling: A Few Reflections

It seems to me that the label "troll" is much misused. Certain lazy writers will apply it to any criticism. Straightforward political commentary or satire is routinely mislabeled this way.

A troll is not merely someone who disagrees with you, or who even happens to annoy or offend you in their disagreement. A troll is someone who makes it their business to annoy you or offend you with disagreement. They do not address themselves to the world at large, but make a point of going specifically where they are not wanted. They enter other people's civil conversations in uncivil fashion. "Stop talking among yourselves and address me!" they demand, putting others on the spot, and often demanding they answer absurdities and irrelevancies--often absurdities and irrelevancies which the troll themselves recognizes as such--or otherwise endure their attacks (though in fairness, they also mean to go on attacking you regardless of what you say).

This is because they are uninterested in an exchange of ideas. They are not even interested in persuading others. Rather they are the self-appointed police of what is acceptable, meting out punishment to those they deem unacceptable, or even disrupting the conversation outright. They cannot shut other people down by command, but they can waste their time, and still more, make them miserable. They can confuse and distract, disgust and offend, embarrass and wound. They can, in short, harass them, sabotage them, bully them.

That last word is critical. Bullying is something those with more power do to those with less; something those who feel less inhibited do to those who feel more inhibited. The exploitation of the asymmetry is critical. And the asymmetry itself points to a reality not often owned up to, which is that, in practice, bullying tends to go from right to left.

Writing those words, I could almost see the right-wingers foaming at the mouth with rage.

I remind them, however, of the way I have defined trolling here--not simply any critical or satirical statement, but a statement presented in this particular way, for this particular effect.

I remind them, too (or explain to those who have not read this blog before) that not everyone they think of as left really is of the left--with many a practitioner of "identity politics," for example, really a right-wing nationalist whose nationalism simply happens to attach to a traditionally marginalized group.

Finally I remind them that, as a rereading should suffice to confirm, my statement is not absolute. I do not say here that bullying can never go from left to right, just that it is less common (I suspect, much, much less common), because it is so much less consistent with the left's values, so much more dissonant with them; while their practical situation reinforces this.

The reason is that it is those on the left who value reason, freedom, openness, fairness. Many a self-identified conservative insists that they are all about these things, but conservatism is, at bottom, defined by a more guarded attitude toward them. Conservatism is founded on a skepticism about reason, especially in the realm of human affairs. Its respect for freedom is limited by its fear of what "too much" freedom might mean. It is, accordingly, more concerned with order--with enforcing rules for the sake of enforcing rules--and more accommodating of irrational bases for action, not least authority and prejudice (in the broad sense of the term), and the sacrifice of openness and fairness to them.

Moreover, those on the left are convinced that they have reason and fairness on their side. They want, need, to persuade others of their rightness. At the same time, those hostile to them can and do exploit their commitment to dialogue and fairness by sucking them into exactly the position where they wind up trying to defend themselves in a legitimate fashion against the unfair tactics of a troll.

One can go still further and consider certain attitudes, certain kinds of behavior--an admiration of self-assertion and aggression for their own sake, even when, or because, this goes along with callousness and meanness, with inegalitarian and even anti-egalitarian sentiments or presumptions. Being an "asshole" is hardly a deep or sophisticated basis for a political ideology, but there seems no shortage of those who treat it as one, and it would be tough to argue that the "cult of the asshole" is not more prevalent on the right, much of which, in the age of "greed is good," status politics and the rest, positively revels in such attitudes.

There is, too, the not insignificant reality of who actually has power. We live in a profoundly unequal society, with regard to wealth and status, and it is those who are on top, those who are comfortable with things as they are and ferociously hostile to those who would have them another way, who determine what is allowably respectable and mainstream in speech and expression. By and large, the further to the right one is, however much they may complain (and do they ever complain!), the safer and more secure they can feel; the further to the left one is, the greater the risk they take when they express themselves, the more easily are they put on the defensive and faced with the outrage of a mob. (Bill Maher may pal around on his show with Steve Bannon and the rest of the Breitbart crowd. Can you picture him similarly palling around with genuine Marxists? Of course not.)

There seems every reason to think that this does make a difference with regard to how often a person will act out in this way, whether they end up aggressor--or victim. And it seems pointless to deny this all in the name of that pseudo-fairness which assumes that both sides must be equally guilty or equally blameless in a given situation.

On the Word "Entitlement"

As James Cairns observes in his recent book The Myth of the Age of Entitlement, the term "entitlement" is defined by the Oxford English Dictionary as "a legal right or just claim to do, receive or possess something"--and this is its usual, conventional meaning. However, as he also points out, psychologists' different use of the word equates it with false claims to such rights and claims, reflective of narcissism--extreme selfishness and grandiose views of oneself to which they expect others to pander. "You are acting very entitled" one might chide another individual they think needs a "reality check" and a "privilege check"; someone who needs to "get over themselves"; someone they think needs to be brought "down from the clouds," and "welcomed" back "to the REAL world" (awful, awful phrase, which will probably one day get its own post here).

The second usage is not a description of a matter of fact, but an accusation. And that second, accusatory usage would seem to raise the danger of turning the first usage, any usage, into such an accusation; of turning a mere report or acknowledgment of legal rights or just claims into a way of undermining or denying such rights and claims by declaring the claimant of an entitlement to be a selfish, deluded narcissist. Moreover, given the way in which society misuses and abuses the language of "moralistic" scolding to legitimate inequality and injustice and worse (evident in the manner in which people use the word "deserve"), it would seem in danger of systematically misusing and abusing the word in exactly this way.

And indeed, that is exactly what has happened.

Consider how we in the United States refer to our social safety net--to the Social Security that provides a measure of protection to the elderly, the unemployed, the disabled. These and all other such programs are consistently called "entitlements" in common usage, and the double meaning is all too evident. They are undeniably legal rights for those for whom they are intended. Sane people would recognize that they are a just claim as well--not only because society is obligated to provide for its most vulnerable members, but because everyone pays into the system in the expectation that if and when they need these programs, they will be there for them.

However, from the very start a significant and influential swath of opinion has been fiercely opposed to the existence of any such program, and even the principle of mutual aid underlying it--begrudging the contributions they make, and even the idea that human beings have any obligations to each other (with the frothing-at-the-mouth meanness-wrapped-in-pseudointellectualism of the Ayn Rand-loving "libertarians" who befoul the Internet and every other public space only the most blatant such expression).

Of course, this swath of opinion has only grown more vocal and influential as the country has continued its rightward march from the 1970s to the present. And this, too, is at least as evident in every usage of the word "entitlement" in reference to a program like Social Security. In fact, the attitude, and the usage, have progressed so far that ostensible progressives use the word "entitlement" to beat down working class discontent with the erosion of their social protections, wages, benefits and prospects in the neoliberal era--as New York Times film critic A.O. Scott so shamefully does in his review of 2016's Manchester by the Sea (in which he "accuses" figures like the film's protagonist Lee Chandler of a sense of entitlement as "white working men" to a "working man's paradise").

What is as notable as such usages of the word "entitlement" is the ways in which the word is not used. Working class people, especially working class people whose ethnic background makes it clear that they are disadvantaged by class and not race in a society and a discourse neurotically hostile to acknowledging class at all, are charged with being "entitled" in this way--with being unreasonable narcissists for expecting to be able to find work, and then when putting in their hours on the job, live in something better than abject poverty.

However, those whose sense of entitlement is arguably greatest in terms of their expectation that society really will cater to their needs above all others, and who press hardest and most successfully for it, are, it is difficult to deny, not those who have least, but those who have most. In the neoliberal age that tells the poorest and weakest that they are "entitled" narcissists for having any expectations at all from life, the economic conventional wisdom is that all economic and social policy, all political life, must revolve around creating the good "business climate" conducive to profits. Which, in practice, means that the needs and wants of business (low or nonexistent taxes, massive subsidies, weak or nonexistent regulation) come before the wants and even the needs of everyone else (be it tax equity, public services, the protection of consumers, labor, the environment).

Yet, "no one" considers neoliberalism to be an expression of the sense of "entitlement" of the affluent.

In the 2008 financial crisis, as bailout followed after bailout--transferring the consequences of institutional behavior by banks and brokerages whose stupidity and irresponsibility beggar description from private balance sheets to public, government balance sheets, and in turn precipitating the fiscal and economic crises that rocked the political system to its foundation, with consequences we are still witnessing today--there was much criticism. Yet few thought, or dared, to say that the financial institutions which demanded these bailouts "acted entitled." (Indeed, in the seven hundred and twenty pages Adam Tooze devoted to this revolting history, he did not use the word, or any synonym for it, even once.)

Of course, those who hate welfare for working people have innumerable excuses for the corporate kind. Much of it, they would argue, is not corporate welfare at all--pointing to such things as the risibly lame "Tax breaks are not subsidies" argument (debunked here). Where such dodges are insufficient, they contend that, well, those corporate types getting the welfare--or as they prefer to call them, the entrepreneurs--are the "wealth creators," the ONLY wealth creators (the natural world on which we rely, the working people who actually staff their companies by the thousand and million, the public services supporting their activity are not wealth creators, only parasitic scum in contrast with the "God on Earth" that is the wealth-creating entrepreneur), and they have to do what they have to do.1 In any event, There Is No Alternative.

But don't you DARE call them entitled, you entitled plebs!

1. It does not seem excessive to say that the neoliberal's attitude toward the "entrepreneur" is a near-parody of Hegel's attitude toward the state in The Philosophy of Right--God on Earth.

The Writing Life in an Age of Bullshit Jobs

"What does it say about our society that it seems to generate an extremely limited demand for talented poet musicians but an apparently infinite demand for specialists in corporate law?" David Graeber asked in Bullshit Jobs.

One can equally ask--why so little demand for novelists and filmmakers, painters and sculptors, dancers and musicians?

My first thought regarding this issue was the answer C.P. Snow and John K. Galbraith gave to that question--that it reflects elite priorities in an industrial society, which are, above all, "growth," for which read profits, elites having more confidence that the "shock troops of capitalism" will deliver these than any arts types.1 More recently I considered the possibility that the explosion in the range and convenience of electronic, digital media was the key factor, bringing on a "post-scarcity" age in regard to word, and image, and sound, at the very time that readership collapsed--even as ours has been an age of scarcity in every other area.

Both these ideas still seem persuasive to me. Yet reading Bullshit Jobs had me rethinking the matter yet again, not least the possibility that this may be another instance of that perverse tendency Graeber identified in our economic system at this curious stage in its development: its rewarding people in inverse relation to the actual value of their work to other people.

"But artists aren't useful people!" you protest?

Okay, perform this simple thought experiment the next time you think of artists and the arts as somehow "unimportant": a world without them. For the moment, let us not concern ourselves with the fine arts and high culture that, it must be admitted, are enjoyed by a comparative few, and think of what even the "cultureless" many have. Picture the aesthetic element removed from every article we use, from our consumer electronics to our cityscapes. Picture your life without every note of music you have ever heard, every film or TV show you have ever watched, every video game you have ever played.

Would you not feel that your life has been impoverished, perhaps gravely so? So much so as to diminish the value of the technological achievements of which we make so much? Those worshipping at the Church of Apple foam at the mouth and reach for their torches and pitchforks if someone tells them their iPhone is not the telos of five billion years of evolution, but would people value their phones so much without the content they access through them--content created overwhelmingly by artists?

Does the existence of corporate law as such mean as much to you as what would have just been lost?

1. Anyone not taken in by the silly Edisonade propaganda and the idiot cult of the tech billionaires so beloved of the libertarian/neoliberal right (Atari Democrats as well as Reagan Republicans) knows that in real life the vast majority of working engineers and researchers are poorly treated. However, when that is how "late capitalism" treats what it supposedly regards as its most precious workers, one can well imagine how much more shabbily it treats artists.
