Tuesday, February 28, 2023

Bowdlerizing Bond?

Apparently there are plans to reissue the Ian Fleming Bond novels later this year to mark the series' (and thus the franchise's) seventieth anniversary. (The first Bond novel, Casino Royale, hit print in 1953.)

It seems to me that this is more a gesture toward the idea of the brand name "James Bond" attaching to an essentially coherent and thriving franchise than anything else, given the lack of evidence for any audience for the novels themselves. Bond continuation novel author Raymond Benson himself acknowledged that the literary prose style of the Fleming originals (Fleming's writing the adventures of James Bond as if he were writing Madame Bovary) by itself suffices to make the books unsalable today, while even when Benson and his colleagues produced more accessible, brisker, more action-packed books they did not exactly set the bestseller lists on fire. (Indeed, so far as I am aware John Gardner's Win, Lose or Die was the last Bond novel to grace the New York Times hardcover fiction bestseller list, way back in 1989, since which time there have been no fewer than eighteen more novels, none of which seem to have managed the feat.) And if anything the odds against a comeback only lengthen with time, as the public shifts away from print toward audiovisual media for its entertainment generally, and its dose of action-adventure especially.

It has also come to my attention that the books are being bowdlerized. Specifically there is an effort to edit out the more overtly racist material.

This may seem counterintuitive. After all, in the hierarchy of concern among the status politics-minded, sensitivity in matters of gender comes ahead of sensitivity in matters of race, and even were that not the case I would imagine that there is more offense taken at how Fleming handled gender than race. (Indeed, while our culture warriors love to think of the '50s as right-wing heaven the ultra-Establishment Edwardian Tory Fleming was already being called out as a reactionary by the feminists of his day, and his reaction to them was such that I suspect Fleming sometimes played the troll--as in Goldfinger.) Certainly it would seem to say something that, to cite one of those more recent continuation writers, Anthony Horowitz went to extreme, in fact wildly anachronistic, lengths to make the gender politics of Trigger Mortis conform to the expectations of what we now call the "woke" (to the point of undoing Bond's "conversion" of Pussy Galore to heterosexuality, and subjecting Bond to a speech on gay liberation), but kept Bond's casual racism. (This was not least in his remarks about Slavs as a race into which "cold-bloodedness and contempt . . . seemed to be built"--remarks which, no matter what Whoopi Goldberg thinks "racism" means, most certainly are racist, as we especially should not forget when discussing a book that hit print in a moment in which the Brexit-loving right whipped up hatred against Poles, and the most dangerous conflict of the post-Cold War era was brewing on the shores of the Black Sea along lines that make just how Westerners see Russia a matter not to be taken lightly.)

However, there is also the matter of--again--the way in which the books are written. The more inarguable racism in Fleming tends to consist of offhand remarks on the part of the characters or the narration, which can be easily cut out without affecting the larger narrative. By contrast what would appear offensive in the treatment of gender from the standpoint of entertainment-industrial complex standards in 2023 is deeply rooted in the stuff of the novels. A story like Casino Royale (and certainly Goldfinger) would be reduced to shreds if one tried to "fix" it, such that it would not be a matter of excising a little material here and there, but of a full-blown rewrite--after which pretending Fleming is the author of the result would be an empty piety by even the debased standard of these times. And so those are the adjustments they have elected to make.

Of course, to say that they can more easily make those adjustments is not the same as saying that they should do so--and I have to admit that I disapprove. While I certainly don't endorse Fleming's social attitudes (Bond's sneering attitude toward young people and the working class in the first pages of Thunderball, if not what causes controversy these days, was already an unpleasant surprise, and proved no anomaly) the fact remains that the books were written long ago by an author now in no position to make decisions regarding his work. To me that in itself seems enough reason to leave things as they are. At the same time the books have in their way become artifacts of cultural history--making the alteration not merely disrespect for an artist's creation (however we may judge the art), but an attack on the memory and understanding of the past, a thing even less forgivable.

The New York Times Hardcover Nonfiction Bestseller List--One Year into the War in Ukraine

I pretty much always find a look at the New York Times hardcover nonfiction bestseller list a nauseating experience.

My look at the list of February 26, 2023 was no exception.

Of the fifteen books, at least a dozen may be counted as giving the byline to celebrities or other public figures who did not "make their names" as authors of any type, whether as popular writers, scholars, scientists or anything else of the kind who have managed to reach a wider public through the salience of their work. Rather they made their names, and owe the salability of those names, to their having been the subjects of the news.

Indeed, no fewer than seven of the fifteen are officially "by" Hollywood celebrity-type folks, if one counts "reality" stars and the British royals who perform much the same function in their home country--while the figure would be nine if we add in Hollywood-loving Michelle Obama and Oprah Winfrey.

As one might guess, most of what appears to be on offer here, from the Hollywood types and others, is gossip, narcissism, self-justification--some of it mixed with self-help. The theme continues to some extent through the other books, as with Gabor and Daniel Maté's The Myth of Normal (concerned as it is with health). In other cases the self-justification is mixed with current affairs (as with the memoir of Mike Pompeo).

This leaves two books about anything and everything else approached in any and all other ways in all the great wide world. These might be classed as "history." One is a book by Brad Meltzer and Josh Mensch about a supposed Nazi plot to assassinate the heads of the "Big Three" allies. The "Meltzer" name by itself set off my crypto-history alarm (one of the riders of the post-Dan Brown wave of crypto-historical thrillers, he used to host a show on what people have come to joke about as the Used-to-Be-About-History Channel), and what I have seen of the reviews confirms to me that it was not a false alarm. The other book is The 1619 Project--which in the view of several prominent Civil War historians (who seem to me to actually know their subject area, unlike innumerable other experts), has no more foundation in history than Meltzer's attempt to offer up Jack Higgins' The Eagle Has Landed as history rather than fiction.

That's it, that's what the country is supposed to be reading, and it gets only more appalling on closer examination. I, for one, am disgusted by "Prince Harry's" public wallowing in self-pity (in this case, the word is entirely deserved), and his more absurd and obscene behavior (like publicly bragging about the body count he thinks he personally racked up in a manner more like that of an Xbox player than a soldier who has actually been to war). I am astonished that the story of a young woman known to the world mainly for playing a character on a pair of Nickelodeon children's sitcoms that ended almost a decade ago has been on the list for 27 weeks (and still at #2 after all that time!), and that other people are not as weary as I am of hearing from and about Viola Davis (whose book has been on the list for the same length of time).* That anyone still cares about the guy who played Chandler on Friends also astonishes me, and that the book by and about Dazed and Confused actor Matthew McConaughey has been on the list for 86 weeks is positively mind-boggling.

A few of the books seem to merit the benefit of the doubt. The book by the Matés at least sounds as if it offers something more than the usual claptrap, and--even if I admit to generally finding it harder to be annoyed with her than with many of the others discussed here--it does genuinely seem to me to be the case that Pamela Anderson may have something more, and more challenging, to say than people may expect, given that she has taken her activism beyond the usual "safe" topics (veganism, etc.), and stood her ground in the face of shameless, thuggish bullying on national television. But on the whole the picture is pretty bleak.

Considering this situation it seems only fair to acknowledge that it did not "just happen." We all know that in spite of the ubiquity of the lame cop-out that the consumer is king and the purveyors of trash are merely "giving people what they want," in publishing as elsewhere it is the oligopolistic producer who is king and the consumer who takes from what the producer is prepared to offer--and even then the consumer has not always really done the taking, exactly. As if the crass and politicized gatekeeping and the advertising dollars were not enough, it is now common practice to buy a book's way onto the bestseller list--with Forbes reporting that Mr. Pompeo's Political Action Committee spent $42,000 on books the day his memoir came out.

Still, even allowing for all that I find very little hopeful here--certainly from the standpoint of testimony to the intelligence of the general public, or its being mentally equipped to confront the issues of our era of polycrisis. Indeed, never have I had more reason to hope that the bestseller list, which has ever been a highly imperfect measure of the public's book-buying, is wide of the mark in regard to what it is actually reading, let alone thinking.

* I remember some time ago citing a couple of episodes of Sam & Cat in a blog post about the techno-hype of the mid-'10s and wondering if it wasn't too obscure to have been worthwhile. I suppose I need not have worried.

Remembering Unhappily Ever After

Not very long ago the Guardian caught up with Bobcat Goldthwait, and discussed with him what he's been up to lately. Apparently he's back doing stand-up, with the fan base coming out for his shows significantly a carry-over from earlier in his career--his part as Zed in the Police Academy films (movies two through four) apparently an important draw here.

I noticed that unmentioned was the sitcom Unhappily Ever After, in which Mr. Goldthwait voiced "Mr. Floppy," the stuffed bunny that the show's protagonist, now living in the basement of the house out of which his wife kicked him, believes talks to him and with whom he has conversations.

The show in question would seem to have a number of claims to a place in pop cultural history (apart from its odd premise, of course, of which Mr. Floppy was just a part). Created by Married . . . With Children (MWC) co-creator Ron Leavitt with fellow MWC veteran Arthur Silver (who wrote and produced for the show over a long stretch of its run), it was an obvious variant on the theme of, and complement to, the original pop cultural phenomenon. It was (like The Wayans Bros.) part of the "starting line-up" of Wednesday night sitcoms with which Warner Brothers launched its broadcast network, "The WB," in 1995 (now defunct, but not going before it made its mark on pop culture). It was also a significant early credit for a number of its cast members, who went on to other, bigger things--like Kevin Connolly (Entourage), Justin Berfield (Malcolm in the Middle) and Nikki Cox (Las Vegas), while the same might be said of Kristanna Loken (who appeared in nine episodes as the nemesis of Nikki Cox's character). And it lasted the hundred episodes that were then the target for show-runners, because that was the magic number that opened up the possibility of syndication, where the really big money was.

I found the show worth a watch, and stuck with it down to the finale. Of course, few others did so, the show never acquiring the cachet of the original. (The story goes that when John Milius cast Ed O'Neill as a Navy JAG in the film version of The Flight of the Intruder test audiences laughed so much just seeing O'Neill that they decided to cut out the bit and reshoot it with Fred Thompson--a problem that Unhappily's Geoff Pierson does not seem to have had.)

One may chalk up the show's weaker impression to MWC's having got there first, and having more easily shocked audiences in the more staid '80s than the "extreme" '90s. Alternatively one can attribute it to the show's having got too extreme--been a little too dark and weird to really find a strong echo with a wide audience. Or one may be inclined to think the failing had nothing to do with the premise or other content and everything to do with its promotion by the network. (The WB definitely made its mark with shows like Buffy the Vampire Slayer and other young adult-oriented contemporary fantasy, as well as young adult soap operas like Dawson's Creek--but I cannot think of a single WB sitcom that achieved much popular recognition.) And there was not much of a chance for Unhappily to find, for example, a "cult" audience or anything else of the sort after finishing its original run. (Wikipedia reports that due to poor ratings in its mere two years in syndication the show has not been on the air in North America since 2001, was never released "on physical media," and is unavailable from any streaming service in the U.S., though it seems to have done better abroad, perhaps particularly in Germany.)

In any event, it's all but forgotten now--another show of yesteryear dropped down the memory hole, living on nowhere but in the memories of those of us old enough to personally recollect its first crack at the airwaves.

The Vancouver Sun on Susan Neiman's Latest

On Thursday the Vancouver Sun newspaper ran a piece by Douglas Todd regarding philosopher Susan Neiman and her latest book, Why Being Woke is Not Left Wing.

Going by the article Ms. Neiman's case regarding the difference between being left and being "woke" (the Enlightenment-based left stands for universalism, justice and progress; the "postmodernized" and therefore anti-Enlightenment woke for a tribalism and pessimism that easily leads rightward) will be familiar to many. Certainly leftists have been making this case for as long as the identity politics now traveling under the "woke" label have been around, with an obvious example the World Socialist Web Site. Their distinguishing the left from the practitioners of identity politics has been a significant part of their criticism of what they call not the left but the "pseudoleft," with their stance toward the latter reflected in their attitude toward the #MeToo campaign and the New York Times' 1619 Project.

Still, the center's tendency to take an "any publicity is good publicity" attitude toward the left and accordingly deny it any (in contrast with its attitude toward the right, constantly lavishing right-wing figures, opinions, publications and organizations with publicity, often respectful and usually at least civil publicity, even where it claims not to endorse them) means that the criticism of identity politics, "wokeness," etc. from the left is virtually unseen in the mainstream. Indeed, it is the kind of thing that "fake news" alarmists endlessly call on search engines and social media sites to censor (with the WSWS, for its part, claiming that along with its website generally its criticisms of the 1619 Project specifically have been subject to such censorship). Of course, one should not have to be a leftist to come to the conclusion that wokeness is not a left ideology--only conversant with the most basic ideas of modern political philosophy--but to go by all the evidence that is something that those who have access to major media platforms most definitely are not. Accordingly, rather than any shocking originality of the fundamental conception, it is Ms. Neiman's stating the obvious and indisputable before people who do not often get to hear it that makes Douglas Todd's discussion of her views in the Sun of interest. However, whether her book will make any more difference than the other attempts to correct the profound misapprehension about postmodernist identity politics somehow being a left ideology (rather than the Counter-Enlightenment ultra-right ideology it actually is) remains to be seen.

Monday, February 27, 2023

Remembering Predator 2

Like Robocop 2, Predator 2 is a 1990 sequel to a classic 1987 sci-fi action film depicting a cop fighting crime in a major urban area in a near-future America descending into chaotic cyberpunk nightmare--and one which was less successful with critics and audiences than the original. As with Robocop 2, I enjoyed it all the same, and looking back now find it an interesting time capsule--reminding us of where people thought the world was going, starting with the opener here, which combines tabloid "trash TV" with a city collapsing into failed state levels of violence, Morton Downey Jr. (as "journalist" Tony Pope, though really we're supposed to recognize Downey Jr. here) reporting from a gang battle in Los Angeles being fought with grenade launchers.

Of course, in contrast with Robocop's social satire (it was, after all, a movie about the privatization of the public sphere, corporate power, and the links between big business and the equally business-like business of crime), Predator 2 had little to offer, merely wallowing in the image of breakdown--such that Downey Jr.'s inclusion in the film plays more like endorsement of the figure, his politics and the mean-spirited idiot sensationalism of which he was then the face than criticism of the cultural direction he represented. Indeed, the commonplace that the city/country/world is "going to hell in a handbasket" was here merely taken for granted--the idea that the "urban" jungle of 1997 Los Angeles would be just as much a war zone as the original Predator's civil war-torn Central American setting simply a jumping-off point for another round of the aliens at their gruesome sport.

Why are the New York Times and Fortune Writing About Clarkesworld?

Ordinarily science fiction magazines are far below the radar of the general press. Indeed, even within the world of print science fiction--a far smaller world than most people realize, quite at odds with how big science fiction has become in the audiovisual media--they are far from their peak of significance to the field. As with publishing generally the center of the action shifted from periodicals to books, and the magazines have increasingly struggled to stay relevant since, more often than not failing.

Yet Clarkesworld has been much in the news lately, discussed even in places like the New York Times and Fortune--because magazines like it are pretty much all that remains of the paying opportunities to publish short-form fiction. The fact has put them on the "front-line" with regard to the use--and abuse--of chatbots, as seen in Clarkesworld being inundated with chatbot-generated short story submissions, and as a result ending its open submissions policy.

As it happened, much of the content the Clarkesworld staff received was of very poor quality, and frequently visibly plagiarized--the would-be authors apparently thinking they could have a chatbot whip up any old thing and then send it along without even giving it a decent going-over. But all the same, people tried it.

Perhaps in this age of digital submissions we will see editors get hold of algorithms designed to spot at least the crummier submissions. However, it is also the case that many expect that chatbots will get better. This may be yet another of those expectations that comes to nothing, as so much of the hype about artificial intelligence-based technologies has. Still, it is worth noting that artificial intelligence developers have as yet had their hardest struggle getting to the point at which AI-powered devices could perform everyday tasks requiring the ability to navigate complex physical environments, eye-hand coordination and manual dexterity, from driving through urban environments to flipping burgers. They seem to have been more successful with tasks not making those demands--such that I can picture them replacing coders before they replace truck drivers, or even restaurant workers. And so it does not seem a great stretch to think that what Clarkesworld has seen may be merely the beginning of that shift that long ago already seemed to me underway, where in a publishing market dominated by long series, tie-ins, "co-authored" books and the rest, rather than just requiring writers to produce a standardized product like machines publishers will simply let the machines churn out the product for the dwindling audience there seems to be for anything that actually has to be read.

Thursday, February 23, 2023

Remembering the Saturday Night Fever We Forget

It seems that Lenny's Pizza in Bensonhurst, Brooklyn, is closing down.

I have to admit that I didn't even know about any Lenny's Pizza until the news story about its closure, occasioned principally by its appearance in Saturday Night Fever. Still, all the talk about the film did remind me of how the film struck me when I first saw it, long after its initial release.

I remember being surprised, very surprised, by just how bleak the movie was. The innumerable evocations of the film I'd seen in other films (like Airplane! or Short Circuit) all emphasized the disco stuff. But the movie was not just about Saturday Night. It was also about "Sunday morning," and all the other mornings, noons and nights of the rest of the week as experienced by a bunch of working-class kids leading dead-end lives in an America that, as we see even more clearly with the hindsight of a half century, was facing diminished prospects after the end of a post-war boom that proved a singular event rather than a "new normal"--and the unexpected urban grit made more of an impression on me than anything else in the film.

Indeed, the fact that a movie like this was made as a big feature, and became a big blockbuster, struck me as itself indicative of its having come from a different time, and the bit of film history I have learned since has only affirmed that. In 1977 "high concept" film was already a-borning (Jaws had come out two years earlier, while Star Wars had made its debut six months earlier), but this was still the era of the "New Hollywood."

It was very different with that obvious counterpoint, the following year's screen adaptation of the hit stage musical Grease, where John Travolta played a figure very similar to the one he played in Saturday Night Fever (Danny Zuko, of similar social background and not much more intelligence) in a far more cheerful tale. The grit scrubbed out of the picture, it was a bright, upbeat, nostalgic story where the things that went wrong for the characters in Fever went right here. (Unlike Fever's Bobby C., it turns out that Kenickie did not get his girlfriend pregnant after all--and even had she been, he was ready to "make an honest woman" out of her, instead of performing suicidal stunts on a bridge and falling to his death.)

I suppose Hollywood was still able to take a hard look at the less pleasant facts of contemporary life--but it was increasingly inclined to make a principal product of looking at the past through rose-colored glasses, the New Hollywood starting to give way to "the New Old Hollywood" increasingly dominant since. Now anything like the working-class drama of Fever is apt to be the stuff of little indie movies almost no one will see, or feel-bad prestige TV that a certain kind of cowardly ideologue will sneer at as "poverty porn"--while we are barraged with commercials for the upcoming HBO Max prequel to Grease, Grease: Rise of the Pink Ladies.

Baywatch as a Piece of Pop Cultural History

Baywatch had its premiere on NBC back in 1989 and actually was a network show for its first year, but after its cancellation by the network's execs it continued in first-run syndication.

It didn't begin the syndicated TV bonanza of the late '80s and '90s. Star Trek: The Next Generation, which launched in 1987, can get more credit for really getting the trend going. But it was an especially large success (the "most watched show in history"), which, besides inspiring a good deal of direct imitation (like a four-season revival of Flipper), more broadly encouraged the boom (helping pave the way for such hits as Xena: Warrior Princess).

Watching it now one is reminded of other aspects of how TV used to be, with its lush opening credits sequences whose theme songs would become pop cultural standards; its episodic plots; its lightness of tone, even amid danger-filled action--not taking itself too seriously but also not being obnoxiously flippant. Even more striking in comparison with what we get in even today's lighter fare is how distinctly it lacked the "edginess" and meanness and pretension and the self-satisfaction that attends all these things we take for granted even in network sitcoms, the makers of the show endeavoring to please rather than offend. This all made it the kind of easy viewing one finds only in certain niche sorts of programming these days (like the movies Hallmark and its rivals put on the air, which are generally not quite like this). There is, too, the profound L.A.-ness of the show, and indeed L.A.-ness from when L.A. was the embodiment of the American fantasy of "the good life," which, again, befit that light entertainment approach--people wanting to imagine themselves someplace more alluring than where they happen to be.

Owing to how well it all went over the show stayed on the air so long that its trajectory was bound up with the changes in television a decade after its launch. Amid the increasing crowding of the market, followed by its peaking--and one might add, the general decamping of "Hollywood" Hollywood from what had so long been the default setting for American film and television production, and even the Lower Forty-Eight--the production relocated to Hawaii. This bought it no more than an additional two years, what was now called Baywatch: Hawaii airing its last episode on May 19, 2001.

In its way that date seems to me another little marker of the end of the "'90s," and the beginning of the twenty-first century, with the franchise's fortunes since confirming the distance pop culture traveled--by and large, not for the better. Unlikely material as it was for a big summer movie, that was exactly what Paramount offered up--with the script's writers opting to make the characters look like idiots in what was (mostly but not always) a parody of the original, heavy on frat-boy gross-out humor, rather less heavy on the appeal that drew viewers back week after week. (It was not even made in L.A., but instead in Florida and Georgia, and set in Florida's Broward County!)

It was no surprise that even the poptimists and outright claqueurs who now pass for "film critics" were unimpressed--and so far as I can tell, it was no hit with fans of the old show either, the movie being enjoyed or not enjoyed on its own terms, rather than as anything to do with that original. (Mostly "not enjoyed," to go by the ratings I have seen on the Internet Movie Database.)

Similarly a sign of the times would seem a comparison of Baywatch with that more recent fantasy of life in L.A., Entourage. With Entourage we have the celebration not of "golden Los Angeles" so much as "golden Los Angeles as experienced by Hollywood's favored few," courtesy of some of those few who again showed us the distance pop culture had traveled, and again, not for the better. The way this sort of thing is supposed to work is that we the audience feel ourselves there, vicariously living it up in this fictional world. Instead Entourage always made its viewers feel as if they had their noses pressed to the glass, looking in on someone else enjoying themselves at a party from which they were barred--with those they were watching enjoy themselves a pack of thoroughly unlikable idiots. Indeed, one of the more perceptive comments about Entourage described it as "the most taunting show in TV history . . . show[ing] us what we're missing, and rub[bing] our faces in" it--and so, for all the sun, the glamour and the rest, it made the viewer, just as much as any episode of Mad Men, think "I hate the world, and everything in it, myself included," depressing them without having earned the right by virtue of at least trying to say something important. Such does it go when a show comes from the "mind" of Mark Wahlberg, I guess, though in fairness he is a product of his time and milieu--all as he does that time and milieu no credit.

Tuesday, February 21, 2023

Remembering Baywatch

It has been a third of a century since Baywatch first hit the airwaves, and on streaming it is back, remastered and as bright and glossy as ever.

The show was unpretentious, light entertainment, and absolutely unashamed of the fact—indeed, made the most of it in a manner not unique to it (one finds that even action shows like this one, the original Magnum or MacGyver for instance, tended in the same direction), but arguably more successfully than its contemporaries. Helpful here was the setting and its associated imagery--the sun, the sand, the surf, and of course, the Baywatch beauties, every bit of it redolent with the mystique southern California had not yet begun to lose, and all of it shot in that glorious music video-style that the makers of television for the small screen were just then mastering as the makers of film for the big screen had before them. Helpful, too, was the fact that if the show did not wholly shut out reality (indeed, shut it out rather less than a good deal of "serious" drama praised to the skies by the critics) it rarely got gritty or brutal, and made the viewer feel that the characters inhabited a world where yanking Darwin Award aspirants out of the water was the biggest problem with which the world had to contend, and being a lifeguard on the beach accordingly the most important job in that world--a situation that, I think, most of us would regard as a vast improvement over the polycrisis-ridden world in which we had even then been living for as long as anyone could remember.

It was a winning combination--according to the legend, at any rate, getting well over a billion regular viewers globally at its height, making it the most-watched show in history--a title no show would seem likely to claim from it in this crowded and fragmented media universe, the more so as the strategy for surviving in the ever-more brutal attention economy seems to be to aim for intense appeal at the expense of wide appeal; and much of what made this show such a hit is now so unfashionable as to be virtually impossible (and in some cases, unlikely even to be tolerated were it attempted).

Still, the fact that it isn't quite like anything they make now is the more reason for its lingering in the public consciousness the way it has, its stars still pop cultural icons. Thus did I find when looking at a list of the 500 most searched-for keywords that, a generation after her departure from the show, Pamela Anderson is on that list (at #435), making her still one of the most searched-for celebrities in the world. Meanwhile, one of the very few ahead of her is her fellow Baywatch alum, Carmen Electra--at #87, putting her behind only Kim Kardashian!

Looking over her list of credits there can be no doubt about which of them contributed most to that standing.

Of Greedy and McTeague

I remember how back in March 1994 the film Greedy came out with what seemed somewhat more than the usual fanfare for a late winter/early spring release in those days.

As it happened the critics received the film as less than a complete success, while the box office was not boffo. (According to Box Office Mojo it was not even among the top hundred earners at the U.S. box office in 1994.) Unsurprisingly the movie seems to have been quickly forgotten afterward--and since then, never being rediscovered and acquiring a cult following the way some films do, remained in the obscurity into which it had passed. Still, I enjoyed it at the time and think it had its good points, not least a twist-filled plot more than usually ambitious for comedies of the day, and some memorable performances--not least, those of screen legend Kirk Douglas as the scheming family patriarch, Olivia d'Abo as his "nurse," and Phil Hartman as the nastiest of his would-be heirs.

The film's cinematic and literary allusions, which went over my head at the time, seem at least to be a point of interest. The title Greedy was a play on Erich von Stroheim's classic silent film Greed (1924)--which was an adaptation of Frank Norris' novel McTeague (1899), with all this, of course, reflected in "McTeague" being the name of the family in question.

Of course, that--and the fact that greed is indeed a theme of the film--is as far as it goes. Norris' novel, to the extent that it is remembered today, is recalled as a classic of Zolaesque naturalism in all its social insight, and the forcefulness, even brutality, with which it conveyed that insight. (McTeague centers on a love triangle involving two friends and the woman they both want, the three members of which may be fairly described as ruining and killing each other--with their deaths prefigured by the destruction of another couple before them.)

By all accounts Von Stroheim's Greed was faithful to that. By contrast Greedy was a high concept comedy, and not an especially dark one, which delivers the expected happy ending--with the result that in the end the title of the film and the choice of McTeague as the family name are just allusions, nothing more, with even a relatively favorably disposed viewer of the film like myself not ready to claim any clear evidence that the makers of the film had anything much deeper on their minds.

Mike Davis' City of Quartz: Excavating the Future in Los Angeles

The novelist Frank Norris remarked that the United States had only three "story" cities, and identified them as New York, New Orleans and San Francisco. I'm not sure that I would have ever wholly agreed with that list's being so limited. (What about, for example, Boston or Chicago, where Norris was actually born--and to which he did devote a major novel, the second installment in his Epic of the Wheat, The Pit?)

However, I certainly tended not to think of Los Angeles as on that list--even though it was the default setting of our screen stories from the sitcom to the action movie. Compared with those metropolises of which Norris wrote, and the other older American cities with them (like now nearly four-century-old Boston), L.A. seemed not so much "young" as "new," and, if in portions glittering, essentially hollow--a spot on the map rather than a place, one that (apart from a handful of Hollywood landmarks that I knew to be only a very small part of the whole sprawl, and remote from the everyday lives of almost everyone living there) lacked local character, or even simple visual distinctiveness. (I remember reading a review of 1997's disaster movie Volcano at the time of its release which lovingly detailed the succession of L.A. sights devastated amid the chaos. It just looked like generic modern cityscape to me.) Reflecting this, when toiling away on thriller plots in those days I put it very far down my list of potential settings. (Where North America was concerned I inclined to New York--with legacies of that to be seen in both Surviving the Spike and The Shadows of Olympus.)

It was only reading Mike Davis' City of Quartz that made me look at Los Angeles differently. As with many of Davis' books it is less a neatly unified text than a loose collection of essays examining such varied subjects as the emergence of the city's "power structure," the views European visitors took of the place, the penchant for fortification evident in the local architecture, the counter-insurgency-like policing of the Daryl Gates era, the locale's "homeowners'" associations, the Catholic Church's role in local life, the economic fortunes of nearby Fontana, California--and of course, how the city has been "marketed" nationally and internationally. Still, together they covered a very great deal of territory, from a great many angles, and in the process gave me a sense of Los Angeles as a "place" that it did not have for me before, helping me appreciate it not just in itself but as American myth, dream, fantasy and nightmare all at once, and in every one of those respects important in American history culturally as well as materially.

Of course, others remember the book differently. Davis, whose work since that time includes such titles as Late Victorian Holocausts, Planet of Slums and, in his other writing on L.A., Ecology of Fear: Los Angeles and the Imagination of Disaster, is the sort of public intellectual who gets little mainstream attention. (In line with its centrist prejudices the media is so much more fearful of the left than the right that in its case it more often hews to the view that "any publicity is good publicity," and denies it that, smothering it with silence.) And what attention Davis has got has often not been positive--sneers about a proneness to "catastrophism" a cliché of the dismissal of his work.

Also becoming a cliché in its way has been the tendency for those dismissals to look foolish later on. Anticipating a "social explosion" in Los Angeles in City of Quartz Davis looked inspired rather than scare-mongering in the wake of the 1992 Los Angeles riot. Likewise, as COVID-19 upended the world in ways few have the courage to acknowledge even three years on, his book about the danger of a pandemic meant a spike in "demand" for his comment--while one would imagine that the ongoing spread of avian flu that seems to have played a part (but only a part) in the soaring price of eggs to a degree conspicuous even in this moment of soaring prices for everything would have made him similarly in demand now were he still alive, that particular threat having been the focus of the aforementioned work.

The California Dream and its Waning

Reading Mike Davis' City of Quartz I was surprised to find the extent to which Los Angeles is a product of real estate boosterism--the vast metropolis emerging before its industrial base, rather than the industrial base emerging first to become a foundation for a metropolis (with, contrary to what we may imagine given California's association with aerospace, computers, and so much else, manufacturing only blooming late in the state). Surprising, too, was the extent to which that boosterism was politicized in the late nineteenth century and early twentieth century. Southern California was promoted as a "refuge" for old-stock WASPs from the newer waves of immigrants, with their very different ethnic and religious backgrounds, with an element of health faddishness mixed in to produce a vision of "racial revival under a Mediterranean sun." It was also marketed as an "open shop" alternative to trade-unionized San Francisco.

In short it was a thoroughly right-wing vision, compounding a multitude of Victorian prejudices. And indeed, while the "Left Coast" stereotypes have a long history, southern California has long had its right-wing streak, with this evident not only in San Diego or the "Inland Empire," but in Los Angeles itself (certainly evident when one considers its elites, its social divisions, its policing).

Of course, for many on the right the vision of southern California as an ideal has long since given way to a view of it as a "Blue" wasteland exemplifying all it dislikes about the direction in which the country has been moving, such that rather than refuge it has in their view become a place to flee (for "Redder" territory inland). Meanwhile they have not been alone in watching, wondering, worrying, if for different reasons. Davis, coming from the opposite end of the political spectrum, called into question the viability of a metropolis built in that precise geographical location, with its susceptibility to earthquake and fire. He pointed, too, to the distinctly human-caused problem of the deindustrialization of what had once been one of the country's, and the world's, greatest industrial centers (at its height reputedly the second-biggest carmaker in the world after Detroit in its fabled heyday), with all that implied for its social sustainability. Amid an epoch of (likely climate change-amplified) drought, with water short and each summer the fires more horrific than in the last--as deindustrialization and its associated social stresses have mounted--one wonders if that writer, so often sneeringly dismissed as a catastrophist but vindicated by events, was not as right on this score as he has been on so many others.

A Zombie (Firm) Apocalypse?

The term "zombie firm" apparently goes back to at least 1987 but, unsurprisingly, acquired its current popularity with economic and financial commentators amid the twenty-first century's pop cultural obsession with the phenomenon.

In considering it one should note that there is no generally accepted definition of the term "zombie firm." However, the usage generally denotes a firm that is "economically unviable" in a particular way, namely its surviving on the extension of credit it has virtually no hope of ever repaying--such that the firm persists beyond the end of its natural life in a condition that, being less than fully "alive," may be better understood as simply "undead."

With zombie firms' survival dependent on easy credit, the concern that there are many more of them about than usual (and, perhaps, that firms of greater weight than usual fit the zombie profile) naturally grew during the protracted period of extremely low interest rates which followed the 2007-2008 financial crisis--and with it the concern that should those unsustainably low, sub-market interest rates go up these firms would suddenly find credit no longer so easy, and the undead become simply the dead, the more so as such a surge in interest rates has been underway since early 2022. Should it be the case that there are more zombies about--that those zombies are larger and more central to the economy--then those zombies' being cut off from credit and finally dying out will mean that much more economic pain as a result of the tightening of credit.

Of course, trying to assess the matter in any rigorous way means trying to precisely establish a standard, and here there is, for the time being at least, room for individual judgment--which is to say, also interest and prejudice. Those looking to justify the monetary policy of the last two decades--to say that the ultra-loose monetary policy that prevailed for almost a decade and a half has been beneficent, and that the current rise in interest rates is also essentially beneficent--seem to be inclined to think that zombie firms are not a very big problem. (Indeed, that is what a 2021 Federal Reserve study of the matter argues, while I have seen nothing from this direction to indicate otherwise.*) By contrast others leerier of some part of this trajectory--for instance, Wall Street figures for whom the interest rate can never be low enough and who are unhappy with its current direction, perhaps especially those concerned with sectors that may be more vulnerable than the average (as with venture capital-funded tech, or the real estate sector that may be the country's "most zombified")--display less equanimity about the economic impact.

For the time being the U.S. is not, officially, in recession, and indeed there has been a recent burst of optimism about the possibility of a "soft landing" in the wake of a decade and a half of monetary chaos--but the present trend (which includes a further ratcheting up of the interest rate) suggests that in even the most optimistic view (to say nothing of a less optimistic one that would seem better founded) such companies will find their path getting steeper, in the process revealing just how many zombies really are walking the Earth among us.

* The cited Federal Reserve study used for its standard the classification of a zombie firm as one with "leverage above the sample annual median, interest coverage ratio . . . below one, and negative real sales growth over the preceding three years," which means a company in the unenviable situation of being in the top 50 percent with regard to the ratio of debt to equity; having earnings before interest and taxes that are less than the interest it must pay; and seeing shrinking sales for three years; such that it can't even keep up with its interest payments, while its earnings are moving in the wrong direction from the standpoint of its ability to pay, let alone get out from under its debt. (For what it is worth this seems to me a fairly reasonable standard.)
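For readers who find it easier to see such a test in code, here is a minimal sketch, in Python, of how the three-part screen described in that footnote might be applied to firm-level data. The field names and the sample figures are my own illustrative assumptions, not anything from the Fed study itself, which of course works from large financial databases rather than a hand-built list.

```python
from dataclasses import dataclass

@dataclass
class FirmYear:
    """A hypothetical firm-year observation carrying just the fields the test needs."""
    name: str
    leverage: float              # debt-to-equity ratio (the post's reading of "leverage")
    ebit: float                  # earnings before interest and taxes
    interest_expense: float
    real_sales_growth_3y: float  # real sales growth over the preceding three years

def is_zombie(firm: FirmYear, median_leverage: float) -> bool:
    """Apply the three-part screen summarized in the footnote: leverage above the
    sample median, interest coverage ratio below one, and negative real sales
    growth over the preceding three years."""
    coverage = firm.ebit / firm.interest_expense if firm.interest_expense else float("inf")
    return (
        firm.leverage > median_leverage
        and coverage < 1.0
        and firm.real_sales_growth_3y < 0.0
    )

# Illustrative use with made-up numbers (dollar figures in millions).
sample = [
    FirmYear("Acme Retail", leverage=3.2, ebit=5.0, interest_expense=8.0, real_sales_growth_3y=-0.12),
    FirmYear("Sound Industrial", leverage=0.9, ebit=40.0, interest_expense=10.0, real_sales_growth_3y=0.05),
]
median_leverage = 1.5  # in a real study, computed from the full sample each year
for firm in sample:
    print(firm.name, "->", "zombie" if is_zombie(firm, median_leverage) else "not a zombie")
```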

Friday, February 17, 2023

How Would Dostoyevsky's Underground Man Be Living Today?

Reading one's way through the classics of nineteenth century literature one can scarcely avoid Fyodor Dostoyevsky--and while he is perhaps best known through his thousand-page epic The Karamazov Brothers (one big book that justifies its length far, far better than most in this reader's opinion) I suspect many more who actually read him (rather than lyingly say they do) know him from shorter works. Among the more notable is his Notes From Underground--the first-person confession of a man who, today, might be described as a nineteenth-century hikikomori who has used a modest inheritance to isolate himself in an apartment in the Russian capital of Saint Petersburg for as long as his money will hold out.

Naturally I have found my thoughts turning back to it when taking up the subject of social isolation today. In the course of that I found myself thinking about how technology has changed the situation of someone of similar means attempting to live in such a manner--a point that can seem trivial next to some of the issues with which Dostoyevsky struggles, but one that has some importance.

In the book the "Underground Man" (as the character has come to be known) was obliged to keep a servant to relieve his household cares (somebody had to go out to get food). Today modern household appliances and online delivery make that rather less necessary.

Meanwhile, Dostoyevsky's Underground Man had little to do in his apartment but read. Today he would have the brave new world of electronic entertainment to keep himself busy--including all the books anyone could handle if he opted to have an e-reader, though I suspect he would have spent a lot of time playing video games. On those occasions when he got the itch to interact with others (which in the book means his barging into the company of people he hates, and who do not want him around, time and again leading to dreadfully humiliating episodes) he could have just done it online--while he would have probably opted for cybersex over his trips to a brothel.

All of this, I think, makes it easier for people to live in isolation--to minimize their contact with others while keeping themselves entertained. And I suspect that many more now do so because of that.

In saying so I render no judgment on "technology"--the availability of which I consider to at least have given people some measure of choice. What seems to me a problem is the condition of a society that makes so many find becoming an Underground Man the least-worst of the options open to them--which is absolutely no testament to its health.

Top Gun 2 vs. Avatar 2: A Note

In 2022 fans of Top Gun crowed at the reception to the sequel, with some justification. The movie was the #1 hit of the year at the North American box office.

But it was a different story in the global market, where Avatar 2 has been the biggest hit, by a long way. At this point the sequel seems likely to finish with a take about fifty percent higher than Top Gun: Maverick ($2.2 billion+ versus the other film's $1.5 billion), with the difference overwhelmingly made by the international market. (Avatar 2, closing in on $1.6 billion outside North America, made more than twice as much as Maverick in the international market, and more in those markets than Maverick made altogether.)
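For what it is worth, a quick back-of-the-envelope check of those proportions, using the approximate grosses cited above (Maverick's domestic/international split is my own rough assumption, added only for illustration), runs as follows:

```python
# Rough check of the box-office comparison above. Worldwide totals are the
# approximate figures cited in the post; Maverick's domestic share (~$0.7 billion)
# is an assumption added here for illustration, not a figure from the post.
avatar2_total = 2.2          # billions of dollars, worldwide
avatar2_international = 1.6  # billions, outside North America
maverick_total = 1.5         # billions, worldwide
maverick_domestic_assumed = 0.7
maverick_international = maverick_total - maverick_domestic_assumed  # roughly 0.8

print(f"Worldwide, Avatar 2 vs. Maverick: {avatar2_total / maverick_total:.2f}x")                      # ~1.5x
print(f"International, Avatar 2 vs. Maverick: {avatar2_international / maverick_international:.2f}x")  # ~2x
print(f"Avatar 2 abroad vs. Maverick everywhere: {avatar2_international / maverick_total:.2f}x")        # above 1
```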

It is a reminder that, even as the world economy seems to be fragmenting, the international markets still matter--and that Hollywood will come out ahead offering movies with broad global appeal, with Avatar arguably an easier sell worldwide than the flag-waving Top Gun.

The principle would seem to apply elsewhere, too, with the Chinese film industry offering up jingoistic action-adventure and war movies that play well at home (Wolf Warrior 2, The Battle at Lake Changjin) but probably do not help Chinese film very much in reaching that bigger, global market, adding obstacles to a path already cluttered with them.

The (D)Evolution of Keir Starmer's Rhetoric: From the Ten Pledges to His 2023 New Year's Speech

Back during the Labour Party's leadership contest in 2020 Keir Starmer presented to the world a social democratic platform of "ten pledges," promising higher taxes on the rich, the end of college tuition, "common ownership of rail, mail, energy and water," the repeal of the Trade Union Act, and "the Green New Deal at the heart of everything we do."

He won the contest of course--and then quickly distanced himself from the ten pledges. Already by the time of his speech in answer to the government's budget a year later he limited himself to milder criticisms than would have been expected given his purported vision for Britain, and the "effective opposition to the Tories" that was itself a promise (#10). Not long afterward he officially scrapped the pledges.

Subsequently, as his 2022 Labour Party Conference keynote speech made clear, Starmer has, if sprinkling his speeches with radical touches (throwing around references to "the 1 percent" as if he were an Occupy Wall Street activist), consistently stressed "fiscal responsibility" (for which purpose he repeatedly declares himself ready to defer the accomplishment of "good Labour things"), and only grown more emphatic in his expressions of a desire for "partnership" with business and his exaltation of entrepreneurs--about which this leader of the Labour Party has much more to say than he does about labor, toward which he is at best insultingly aloof. Even his most ambitiously leftish project of 100 percent "clean energy" by 2030, with this done partly through a state-owned firm which could facilitate the transition, appears a pious desire rather than a firm promise (whatever that would mean from the same man who not only made it clear that everything else is secondary to austerity economics, but so casually discarded his prior pledges). And to read his subsequent speeches is to be numbed by the repetitiveness of what have come to seem Starmer's clichés.

The result is that Starmer early on more than confirmed any suspicions that he was a Blairite neoliberal-centrist posing as a leftie to neutralize a leftward challenge, one who would, as soon as it was convenient, revert to his actual politics because, as he keeps saying, "I want to be Prime Minister." Indeed, in his justification of his scrapping of the pledges he exemplifies that particularly neoliberal habit of seeing the correct response to every crisis as a right turn (i.e. "disaster capitalism"), of always dealing with the troubles made by neoliberalism with . . . more neoliberalism.

Of course, Starmer might still "lead" Labour to victory in the next General Election for all that. At this point Britain has already seen over twelve years of continuous, catastrophic Tory governance (austerity, Brexit, a badly bungled handling of the pandemic in all its facets, a post-war era-style currency crisis of the sort that once ended in IMF bailouts, and now, as always, more austerity in its wake) under a string of unfortunate Prime Ministers about whom it seems almost impossible to say anything positive--David Cameron (who staged the Brexit referendum as a bluff on behalf of the regulation-haters in the City--some bluff that), Theresa May (whose tears were only ever for herself), Boris "let the bodies pile high in their thousands" Johnson (even if he didn't actually utter the words his actions spoke them), Liz Truss (who thinks imported cheese is a disgrace and in the wake of presenting her fiscal vision "got outlasted by a lettuce head"), and now Rishi Sunak (who in his youth bragged about having no working-class friends, and probably isn't making many now with his austerity on steroids). In the process they have made it very, very easy for the official opposition to beat them--especially when it is led by a neoliberal who will be treated far more gently by business and the press than his predecessor, after which victory he will, of course, get to work disappointing whatever slight hopes he managed to raise in the voters, in another turn of the widening gyre.

Wednesday, February 15, 2023

On the Entertainment Media's Coverage of "Bond 26"

It may seem odd that I am writing about the coverage of "Bond 26" rather than about the (as yet nonexistent but widely anticipated) movie itself. However, that coverage is itself odd--sufficiently so as to warrant some remark.

The reason is that the entertainment press keeps talking about the project--even though there is no actual news for them to report, let alone analyze. Either the producers are doing nothing at all, or the preparations are an extremely well-guarded secret. In either case the entertainment press is left with nothing to offer. Yet they keep talking--and in the process recycle the same speculations over and over and over again.

"Who will be the next James Bond?" they ask, and run through (mostly) the same list of candidates over and over again.

"What will the tone of the movies be like?" they ask. Will they stick with the more grounded material and darker, more "character"-oriented tone of the Daniel Craig era, or revert to the more flamboyant adventures of yore?

It was all actually talked to death around the time of Spectre--back in 2015. (You know, before the Trump presidency, before Brexit, before #MeToo, before the pandemic, before lots of things that can not unreasonably leave one feeling that the world has changed.) But the extreme delay in the release of the movie that from the first was expected to be Craig's last, No Time to Die, meant they just had nowhere else to go. And so has the fact that in the year since that movie hit theaters there has been so little for those claqueurs-posing-as-journalists tasked with sustaining public interest in the franchise to work with.

Might that start to change soon? Certainly the film industry seems to be stabilizing, and along lines not much different from what prevailed before--small dramas and comedies ever less likely to be a theatrical draw, and ever more consistently relegated to streaming; but the big franchise action-adventure doing well, as demonstrated over the last year by hits like Spider-Man: No Way Home, Top Gun: Maverick and Avatar: The Way of Water in particular, and likely to be further reconfirmed as we proceed through what looks like a thoroughly blockbuster-packed 2023. The result is that Hollywood will feel more confident about pouring its money into such projects as a new Bond film.

Still, that has only been part of the problem. It seems that No Time to Die, to the extent that it did less well than some hoped (especially in North America), was held back not merely by the pandemic (undeniable a factor as it was), but also by the franchise's plain and simple running down. This most venerable of high-concept action-adventure franchises--the one that can truly be credited with "starting it all"--is showing its considerable age, which if it were a person would have it looking forward to its pension a few years hence. Audiences were less interested, with this going especially for the younger age cohorts for which nostalgia plays less part, and the tropes are less resonant (having grown up on superheroes rather than secret agents), in an ultra-crowded market where 007 appears ever less special (indeed, perhaps even less their favored secret agent hero than the now similarly-approaching-retirement-age gang from the Fast and the Furious franchise). This has likely played its part in the hesitations of the producers about proceeding--and I have no idea how far they may be along in moving past that to settle on just what approach with Bond 26 will get the public excited about 007 again (assuming, of course, that is not a bridge too far), the more so as the budgets, and the grosses that alone can justify them, make such movies ever more a preposterous gamble for the backers.

Robert Heinlein's Friday, Again

In Robert Heinlein's novel Friday the character identified as the Boss remarked at one point in the novel that "a dying culture invariably exhibits personal rudeness. Bad manners. Lack of consideration for others in minor matters," and that this "symptom" of a dying culture "is especially serious in that an individual displaying it never thinks of it as a sign of ill health but as proof of his/her strength."

The remark was not a throwaway line, but relevant to the book's theme, precisely because the titular protagonist's problem was that of making a life for herself in a world that really was supposed to be falling apart.

Of course, people have always complained that their "culture was dying"--and as part of the package always complained about the decline of manners. It has especially been the case that the old have always complained about the young in this manner. And for precisely that reason I ordinarily discount this sort of thing.

Still, the passage caught my eye partly because there was a bit more nuance here, in particular the observation that this is something that worsens over time, with the bit about mistaking rudeness for "proof of strength" especially telling. A culture where people take such pride in being an "asshole" as ours does, in which people who act like "assholes" are worshipped by people whose greatest aspiration in life seems similarly to be acting like "assholes," is not necessarily "dying"--but it is far from healthy.

The Screen DC Universe Reboots: What Can We Expect?

The superhero movie genre has been booming from the very start of this century, with the entire period seeing the major studios barraging audiences with A-grade, big-budget blockbusters about the biggest household name characters from the pages of Marvel and DC Comics--accompanied by a smaller but still significant traffic on the small screen (in the age of streaming, increasingly an extension of the big-screen stuff, such that you had to watch WandaVision to fully appreciate Dr. Strange 2).

Unsurprisingly, after being so heavily exploited for so long, the genre is not looking its best.

Consider where things now stand. The Marvel Cinematic Universe (MCU) has gone from looking unstoppable in Phase Three to underwhelming in Phase Four; and the DC Extended Universe has been a letdown again and again, such that Warner Bros. Discovery is officially giving up on the old vision (all as it likewise seems to be giving up on the small-screen Arrowverse). Amid all that, lovers of superhero film and television desirous of more would seem to have had as their best hope for something fresh and interesting the reports of James Gunn and Peter Safran's work on a grand reboot of the DC Universe for after 2023 (setting aside the decision to make a sequel to The Batman, due out in 2025).

This month they have revealed part of what audiences can expect, including:

* A TV show all about Waller (the character Viola Davis first played in 2016's Suicide Squad), inventively named "Waller."

* A Green Lantern show "in the vein of True Detective."

* A Game of Thrones-style show (blood-soaked soap opera?) set on Wonder Woman's home island of Themyscira.

* A "much more hard-core" Supergirl pointedly produced by a horrific childhood spent getting to Earth.

* An "impostor syndrome"-themed show about Booster Gold.

* A Batman film (not starring Robert Pattinson) utilizing the content of The Brave and the Bold, complete with Damian Wayne in a "very strange father-and-son story."

That's not all of it, but I think you get the picture.

The entertainment press, of course, is responding with its usual enthusiasm.

I will admit to not sharing the feeling. Admittedly I have been losing enthusiasm for the genre for rather a long time--the genre seeming to me to keep going mainly on the form's convenient fit with the studios' particular marketing requirements rather than its having anything fresh and new to offer (and depending overmuch on such superficial tweaks as the "edginess" of Deadpool, etc. for any appearance of newness). But there is also something else evident in the pattern that has me not looking forward to any of these--namely that (as we ought to have expected from the writer who decided "Let's make Scrappy Doo the villain!", in a textbook example of the principle that the Teen Titans Go! episode "The Return of Slade" had to teach, and which the absolutely unteachable folks in Hollywood seem absolutely incapable of learning) Gunn is going all-in on "dark" and "edgy" and frankly pretentious. (A Green Lantern show "in the vein of True Detective?" Really?) Irksome enough in itself to those of us who think big dumb action movies should actually be fun, it is the more annoying because, rather than some dramatic shift-of-course in that respect at least, it actually sounds like just-more-of-the-same stuff that I, for one, didn't particularly care for in the first place.

Saturday, February 11, 2023

What Do The Fabelmans, Batgirl and Rust Tell Us About the Future of the Movies?

Especially as two of the three films I name in the title of this post have not even been released--one of them, in fact, unlikely to even be completed, let alone see the light of day--while the other movie has actually been very little-seen, it may be best to offer a word or two about each one of them.

The Fabelmans is a drama Steven Spielberg made on the basis of his experiences growing up that, if carrying the highly salable Spielberg name and much loved by the critics, was probably never going to play like Indiana Jones or Jurassic Park, or even Schindler's List or Lincoln. Still, box office-watchers had expected the film to do better than it has. (Its first weekend in wide release saw it take in a mere $3 million, while, even with the bump from its slew of seven Oscar nominations--including Best Picture, Best Director and two for the cast's performances--all it has taken in at the North American box office is a bit under $17 million--less than half the reported production budget of $40 million.)

Batgirl, as the reader has probably guessed, is a Warner Bros. Discovery production starring just that character intended for streaming. The production, which ran into trouble partly on account of the pandemic, and underwent a change of directors, saw its budget swell from $70 million to over $90 million. Subsequently there was a change of leadership at the company, which judged the resulting movie's prospects in either the streaming market or in theaters too poor for the movie to be worth completing, and, according to the prevailing reports, decided to just bury the unfinished movie and take the tax write-off.

Rust is an upcoming Western starring Alec Baldwin (who also has story and producer credits)--and if you have heard about this one it is probably because Baldwin, and the movie's armorer, Hannah Gutierrez-Reed, are up on charges of "involuntary manslaughter" as a result of his having fatally shot someone on the set with a gun he did not know to be loaded.

These are quite different films, with not much in common between them, but individually each has been recognized as indicative of a significant trend in film production. The Fabelmans has been taken as an indication of the continued decline of the salability of drama as a theatrical experience--people generally opting to watch such movies at home. Batgirl has been an indication of the media companies' drawing back from their earlier splurging on streaming in recognition of its limits as a revenue stream (no substitute for selling $15-$20 tickets in theaters, and indeed unlikely to justify the cost of a mid-to-high-budget production like Batgirl). And Rust, where amid talk of the on-set tragedy one constantly hears reference to extremely low budgets and corner-cutting (which, for his part, actor Michael Shannon has called out as part of a bigger pattern), is indicative of the penny-pinching tendency in the financing of those films that are not "the biggest."

The result is that, while the pandemic period saw some experimentation with streaming in unprecedented ways--Disney making grade-A blockbusters like Black Widow and Mulan available at home for a surcharge on their opening weekend in theaters--any radical shift in the making and marketing of such content seems to have been made implausible by the results, the studios focusing on those theatrical releases and their as yet irreplaceable $15-$20 a ticket revenue stream. These blockbusters (the enduring place of which is demonstrated not just by failures like The Fabelmans but by undisputed successes like Spider-Man: No Way Home, Top Gun: Maverick and Avatar: The Way of Water) are ever-more dominated by a very few genres, above all action-adventure (and to a lesser extent, splashy animated features dominated by musical comedy), to the point of everything else not only being crowded off the multiplex screens, but of people falling out of the habit of catching them there (arguably the more easily as the theatrical, giant-screen experience provides less "added value" with other movies than it does with the noisome spectacles).

Meanwhile, with producers disabused of their illusions about streaming, everything else is being made more penuriously, not only as Rust was, but such that WBD, and its rivals, would probably not authorize a project like Batgirl today (too costly to be a practical for-streaming project, but still too small a production for theaters, with this logic seemingly confirmed by the bigger budget for a Blue Beetle movie intended for theatrical release), and a movie like The Fabelmans, especially when not made by a Spielberg, has ever less chance of being authorized as a production on such a scale.

In short--giant action movies and cartoons at the theater, and the relegation of pretty much everything else to cut-rate streaming, with very little in between. For the film-maker and the film-lover alike it is not a happy situation--a hardening of the already oppressive constraints on what could be made. However, it seems far better to acknowledge that forthrightly than indulge in the mindless boosterism so tiresomely standard for the entertainment press--the more in as it is really just a confirmation of where most of those with any observational skills at all knew things were going way before the pandemic.

Are Commercials More Annoying Than They Used to Be? Yes, Yes They Are.

A certain sort of person shrugs off other people's observations about the ways in which the world may be changing with the view that everything is always the same, and any perception of change is subjective and meaningless. The bias of contemporary culture toward respect for ideas that look like "eternal verities," the frequency with which people offer superficial observations that are easily debunked, and the "air of superiority to the world at large" with which such statements tend to be made, cause that shrugging banality and cop-out to pass for great wisdom. But all the same, that objective, material world is out there, and we are undeniably part of it and affected by it. Yes, there are times when life probably really is getting harder for most people--and where such evidence goes it is hard to argue with something like life expectancy, with falling life expectancy proof of genuine hardship. Consider the trend of things in the U.S. this past decade, in which (according to CDC/World Bank data) male life expectancy (already eroding before the pandemic) fell by over three years in the 2014 to 2021 period alone (from 76.5 to 73.2 years), and female life expectancy by a more modest but still significant two years (from 81.2 to 79.1). All of this is the more the case as it is virtually undisputed that the deaths lowering that expectancy (the pandemic apart) are "deaths of despair" (alcoholism, drug abuse, suicide, etc.), while the averages conceal significant differences, with the poor suffering more than these figures indicate. Even before the trend had progressed so far the richest 1 percent of men lived 15 years longer than the poorest 1 percent--while the erosion of life expectancy in the years since would seem to have hit the poor harder than the rich (the portion of it related to the pandemic included) . . .

"Wait, wasn't this going to be about commercials?" some readers are doubtless asking.

Yes, in fact it already is. Specifically it seems to me that in a world where people are suffering so much--the actual deaths only the most extreme outcome of the untoward larger trend--even the little things are weighing on us more.

Like the annoyance caused by commercials.

Still, I do think the actual experience of those commercials is relevant. There is how advertisers strive harder and harder to penetrate an ever-more cacophonous media universe to reach a hyper-saturated audience that wants absolutely nothing to do with them, and has developed defenses against them. There are such disgusting "innovations" as autoplay for videos no one would choose to play, so that before you have even spotted the video you hear the commercial blaring and are left hurriedly trying to locate it and shut off the distraction so that you can get on with what you came for. There is the way that when we go online to some website we will be hit in the face with a popup demanding we disable our ad blockers--the element of coercion, and the fact of our putting ourselves to the inconvenience of disabling our protections (and then having to enable them again as soon as we leave), acquiescing in their harassing us, making it the more distasteful. There is their readiness to make their content annoying as the price of keeping us from shutting them out mentally--one tactic being the design of commercials so that after seeing them a hundred times you are still not sure what they are about, or even what they are selling, the approach of catching the viewer's attention in order to insidiously imprint some subconscious association unbelievably cynical and obnoxious (while in the case of some of us making us wonder just what it was about, so that we consciously pay attention, manipulated yet again).

Meanwhile there is the ever-greater attention people are encouraged to devote to "status politics." The result is that the cultural traditionalist will not only be annoyed by what they experience as a ceaseless barrage of "representation," "counter-stereotype" and the like, but the sight of it will be the more charged because of that larger context--every commercial, like everything else, a battle in the culture war. Others who are not averse to such politics may still be annoyed by what they see, objecting not to "wokeness" as such but to its "corporate kind." And still others who have no objection to even corporate wokeness will be the more annoyed by the occasions when a commercial is not so "woke." (In short, everyone walks away unhappy, which frankly is the point of elevating such politics.)

One may add that the politics can seem to have had a subtler impact on the commercials. Traditionally commercials relied on the casting of physically attractive persons in them. They relied on an element of glamour and, yes, sex appeal. Humor was important, too. It is not always clear that all this actually helped sell more goods. And often it annoyed as many as it pleased. But for those who were amenable to the content--frequently to the point of, every now and then, actually liking the commercial in question, even when they didn't like commercials generally--it made living in an advertising-soaked media universe easier to take, a single good commercial buying a measure of forgiveness for a lot of bad ones. Now--and I admit this is not the sort of opinion you would see uttered in the mainstream media--in the view of a non-negligible portion of the market, judged by traditional, conventional criteria, advertising has retreated from the use of attractive models, from glamour, from sex, while altering its use of humor (much of which now seems to revolve around subverting the old expectations by delivering precisely the opposite of what people would have expected--and where they would have preferred the expected thing, frankly annoying them rather than making them laugh). For those who feel that way all this removed the sweetener, leaving just the bitter pill that was an unwanted sales pitch being shoved down their throat--while those who saw the now vanished content as objectionable merely saw something objectionable diminished, commercials made less unpleasant than they were before in certain specific ways, which is a different thing from their actually being made pleasant, the bitter pill itself remaining just that. And the result, of course, has been an audience with still more reason to be annoyed by the multimedia onslaught on their senses and their wallets.

The "Other" Stan Lee

When one speaks of a writer named Stan Lee people will probably assume one means the Stan Lee associated with Marvel Comics, where, along with Jack Kirby, he has generally been credited as the driving creative force in the company's "Silver Age" ascendancy, co-creating such icons of the form as the Fantastic Four, Spider-Man, the Incredible Hulk and the X-Men (among many others).

However, it is notable that just as Stan Lee was making comics history another Stan Lee--Stanley R. Lee--was also making history. A "Mad Men"-era adman at the firm of DDB, he wrote the copy for one of the most famous political advertisements in American history, the "Daisy" ad aired during the 1964 presidential election race by Lyndon B. Johnson's campaign, alluding to the cavalier attitude toward nuclear war that Barry Goldwater had displayed in his public statements. (That attitude was a major theme of the campaign; opponents of Goldwater parodied his slogan "In your heart you know he's right" with "In your guts you know he's nuts!")

As it happened the theme of nuclear war was one that Stan Lee stuck with as, like a great many other ad-men, he turned to publishing fiction. When we think of the '80s--an era in which the Goldwaterite tendency within the Republican Party, defeated in '64, captured the party and increasingly set the tone for American politics, down to this day--we remember it as a militaristic period, with Rambo: First Blood, Part II and Top Gun among its biggest box office hits, and Tom Clancy not only edging out Robert Ludlum to become the leading thriller novelist of the day, but the highest-selling novelist of the decade, period. Still, the decade also saw a backlash against that tendency in a powerful anti-nuclear arms race movement that also made itself felt in pop culture, in such ways as the miniseries The Day After (watched by an estimated 100 million Americans) and hit films like WarGames and the Strategic Defense Initiative-satirizing Spies Like Us, while if writers like Clancy offered images of a "winnable" World War III in works like Red Storm Rising, writers like David Brin in The Postman offered a very contrary view.

This "Stan Lee" made a notable contribution to the stream with his novel Dunn's Conundrum (1985) (published under the name "Stan Lee"), a cleverly satirical "anti-techno-thriller" about an intelligence agency with a very dark and dangerous function in the nuclear arms race and the Cold War confrontation under a government that was clearly completely unhinged to allow such a thing--and the whistleblower intent on exposing and shutting down the operation. Five years later Mr. Lee followed up the book with his second novel, The God Project (1990), a similarly striking satire that, if still imagining the Soviet Union as a going concern in 1997, shifted its attention to a world of ubiquitous small wars and aspirations to making American intervention in them via automation, taken to its imaginative limit (with, to boot, the big secret bound up with a left-wing Christian movement).

So far as I can tell Lee did not publish a third novel before his early passing in 1997--while sadly the books seem to have sunk into obscurity (as the sheer scantiness of the Wikipedia page devoted to Lee reminds one--its single, two-sentence paragraph not quite filling three lines). It is a great shame, especially in an era in which few writers would dare to produce such works, not least out of pessimism about any publisher being willing to take them, leaving the niche unfilled--but the books, at least, remain "for your consideration."
