Tuesday, February 21, 2023

The California Dream and its Waning

Reading Mike Davis' City of Quartz I was surprised to find the extent to which Los Angeles is a product of real estate boosterism--the vast metropolis emerging before its industrial base, rather than the industrial base emerging first to become a foundation for a metropolis (with, contrary to what we may imagine given California's association with aerospace, computers, and so much else, manufacturing only blooming late in the state). Surprising, too, was the extent to which that boosterism was politicized in the late nineteenth and early twentieth centuries. Southern California was promoted as a "refuge" for old-stock WASPs from the newer waves of immigrants, with their very different ethnic and religious backgrounds, with an element of health faddishness mixed in to produce a vision of "racial revival under a Mediterranean sun." It was also marketed as an "open shop" alternative to trade-unionized San Francisco.

In short it was a thoroughly right-wing vision, compounding a multitude of Victorian prejudices. And indeed, while the "Left Coast" stereotypes have a long history, southern California has long had its right-wing streak, with this evident not only in San Diego or the "Inland Empire," but in Los Angeles itself (certainly evident when one considers its elites, its social divisions, its policing).

Of course, for many on the right the vision of southern California as an ideal has long since given way to a view of it as a "Blue" wasteland exemplifying all they dislike about the direction in which the country has been moving, such that rather than a refuge it has in their view become a place to flee (for "Redder" territory inland). Meanwhile they have not been alone in watching, wondering, worrying, if for different reasons. Davis, coming from the opposite end of the political spectrum, called into question the viability of a metropolis built in that precise geographical location, with its susceptibility to earthquake and fire. He pointed, too, to the distinctly human-caused problem of the deindustrialization of what had once been one of the country's, and the world's, greatest industrial centers (at its height reputedly the second-biggest carmaker in the world after Detroit in its fabled heyday), with all that implied for its social sustainability. Amid an epoch of (likely climate change-amplified) drought, with water short and each summer's fires more horrific than the last--as deindustrialization and its associated social stresses have mounted--one wonders if that writer, so often sneeringly dismissed as a catastrophist but vindicated by events, was not as right on this score as he has been on so many others.

A Zombie (Firm) Apocalypse?

The term "zombie firm" apparently goes back to at least 1987 but, unsurprisingly, acquired its current popularity with economic and financial commentators amid the twenty-first century's pop cultural obsession with the phenomenon.

In considering it one should note that there is no generally accepted definition of the term. However, the usage generally denotes a firm that is "economically unviable" in a particular way, namely its surviving on the extension of credit it has virtually no hope of ever repaying--such that the firm persists beyond the end of its natural life in a condition that, less than fully "alive," may be better understood as simply "undead."

With zombie firms' survival dependent on easy credit, the concern that there are many more of them about than usual (and, perhaps, that firms of greater weight than usual fit the zombie profile) naturally grew during the protracted period of extremely low interest rates that followed the 2007-2008 financial crisis--and with it the concern that should those unsustainably low rates go up these firms would suddenly find credit no longer so easy, and the undead become simply the dead, the more in as just such a surge in interest rates has been underway since early 2022. Should it be the case that there are more zombies about, and that those zombies are larger and more central to the economy, then their being cut off from credit and finally dying out will mean that much more economic pain as a result of the tightening of credit.

Of course, trying to assess the matter in any rigorous way means trying to establish a precise standard, and here there is, for the time being at least, room for individual judgment--which is to say, also interest and prejudice. Those looking to justify the monetary policy of the last two decades--to say that the ultra-loose monetary policy that prevailed for almost a decade and a half has been beneficent, and that the current rise in interest rates is also essentially beneficent--seem inclined to think that zombie firms are not a very big problem. (Indeed, that is what a 2021 Federal Reserve study of the matter argues, and I have seen nothing from this direction to indicate otherwise.*) By contrast, others leerier of some part of this trajectory--for instance, Wall Street figures for whom the interest rate can never be low enough, unhappy with its current direction, perhaps especially those concerned with sectors that may be more vulnerable than the average (as with venture capital-funded tech, or the real estate sector that may be the country's "most zombified")--display less equanimity about the economic impact.

For the time being the U.S. is not, officially, in recession, and indeed there has been a recent burst of optimism about the possibility of a "soft landing" in the wake of a decade and a half of monetary chaos--but the present trend (which includes a further ratcheting up of the interest rate) suggests that in even the most optimistic view (to say nothing of a less optimistic one that would seem better founded) such companies will find their path getting steeper, in the process revealing just how many zombies really are walking the Earth among us.

* The cited Federal Reserve study used for its standard the classification of a zombie firm as one with "leverage above the sample annual median, interest coverage ratio . . . below one, and negative real sales growth over the preceding three years"--which means a company in the unenviable situation of being in the top 50 percent with regard to the ratio of debt to equity; having earnings before interest and taxes that are less than the interest it must pay; and suffering shrinking sales for three years running. Such a company cannot even keep up with its interest payments, while its earnings are moving in the wrong direction from the standpoint of its ability to pay, let alone get out from under its debt. (For what it is worth this seems to me a fairly reasonable standard.)
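For those who find the standard easier to grasp as logic than as prose, the test reduces to three boolean conditions, one per criterion. The following is a minimal sketch of my own, purely illustrative--the pandas layout and the column names ('leverage', 'ebit' and so on) are assumptions of mine, not anything drawn from the study itself:

```python
# A toy illustration (not the Fed study's actual code) of its three-part test:
# leverage above the annual sample median, interest coverage ratio below one,
# and negative real sales growth over the preceding three years.
import pandas as pd

def flag_zombies(firms: pd.DataFrame) -> pd.Series:
    """Assumes one row per firm-year, with hypothetical columns:
    'year', 'leverage' (debt/equity), 'ebit', 'interest_expense',
    and 'real_sales_growth_3y' (real sales growth over the prior three years)."""
    median_leverage = firms.groupby("year")["leverage"].transform("median")
    high_leverage = firms["leverage"] > median_leverage
    # Interest coverage ratio = EBIT / interest expense; a ratio below one
    # means earnings before interest and taxes fall short of the interest bill.
    coverage_below_one = firms["ebit"] < firms["interest_expense"]
    shrinking_sales = firms["real_sales_growth_3y"] < 0
    return high_leverage & coverage_below_one & shrinking_sales
```

Note that on such a test a firm is flagged only when it fails on all three counts at once--a fairly conservative way of drawing the line.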

Friday, February 17, 2023

How Would Dostoyevsky's Underground Man Be Living Today?

Reading one's way through the classics of nineteenth-century literature one can scarcely avoid Fyodor Dostoyevsky--and while he is perhaps best known through his thousand-page epic The Karamazov Brothers (one big book that justifies its length far, far better than most in this reader's opinion) I suspect many more who actually read him (rather than lyingly say they do) know him from shorter works. Among the more notable is his Notes From Underground--the first-person confession of a man who, today, might be described as a nineteenth-century hikikomori, one who has used a modest inheritance to isolate himself in an apartment in the Russian capital of Saint Petersburg for as long as his money will hold out.

Naturally I have found my thoughts turning back to it when taking up the subject of social isolation today. In the course of that I found myself thinking about how technology has changed the situation of someone of similar means attempting to live in such a manner--a point that can seem trivial next to some of the issues with which Dostoyevsky struggles, but one that has some importance.

In the book the "Underground Man" (as the character has come to be known) was obliged to keep a servant to relieve his household cares (somebody had to go out to get food). Today modern household appliances and online delivery make that rather less necessary.

Meanwhile, Dostoyevsky's Underground Man had little to do in his apartment but read. Today he would have the brave new world of electronic entertainment to keep himself busy--including all the books anyone could handle if he opted to have an e-reader, though I suspect he would have spent a lot of time playing video games. On those occasions when he got the itch to interact with others (which in the book means his barging into the company of people he hates, and who do not want him around, time and again leading to dreadfully humiliating episodes) he could have just done it online--while he would have probably opted for cybersex over his trips to a brothel.

All of this, I think, makes it easier for people to live in isolation--to minimize their contact with others while keeping themselves entertained. And I suspect that many more now do so because of that.

In saying so I render no judgment on "technology"--the availability of which I consider to at least have given people some measure of choice. What seems to me a problem is the condition of a society that makes so many find becoming an Underground Man the least-worst of the options open to them--which is absolutely no testament to its health.

Top Gun 2 vs. Avatar 2: A Note

In 2022 fans of Top Gun crowed over the reception of the sequel, with some justification. The movie was the #1 hit of the year at the North American box office.

But it was a different story in the global market, where Avatar 2 has been the biggest hit, by a long way. At this point the sequel seems likely to finish with a take about fifty percent higher than Top Gun: Maverick ($2.2 billion+ versus the other film's $1.5 billion), with the difference overwhelmingly made by the international market. (Avatar 2, closing in on $1.6 billion outside North America, made more than twice as much as Maverick in the international market, and more in those markets than Maverick made altogether.)

It is a reminder that, even as the world economy seems to be fragmenting, the international markets still matter--and that Hollywood will come out ahead offering movies with broad global appeal, with Avatar arguably the easier sell worldwide than the flag-waving Top Gun.

The principle would seem to apply elsewhere, with the Chinese film industry offering up jingoistic action-adventure and war movies that play well at home (Wolf Warrior 2, The Battle at Lake Changjin) but probably do not help Chinese film very much in reaching that bigger, global market, adding obstacles to a path already cluttered with them.

The (D)Evolution of Keir Starmer's Rhetoric: From the Ten Pledges to His 2023 New Year's Speech

Back during the Labour Party's leadership contest in 2020 Keir Starmer presented to the world a social democratic platform of "ten pledges," promising higher taxes on the rich, the abolition of university tuition fees, "common ownership of rail, mail, energy and water," the repeal of the Trade Union Act, and "the Green New Deal at the heart of everything we do."

He won the contest of course--and then quickly distanced himself from the ten pledges. Already by the time of his speech in answer to the government's budget a year later he limited himself to milder criticisms than would have been expected given his purported vision for Britain, and the "effective opposition to the Tories" that was itself a promise (#10). Not long afterward he officially scrapped the pledges.

Subsequently, as his 2022 Labour Party Conference keynote speech made clear, Starmer has, even while sprinkling his speeches with radical touches (throwing around references to "the 1 percent" as if he were an Occupy Wall Street activist), consistently stressed "fiscal responsibility" (for which purpose he repeatedly declares himself ready to defer the accomplishment of "good Labour things"), and only grown more emphatic in his expressions of a desire for "partnership" with business and his exaltation of entrepreneurs--about whom this leader of the Labour Party has much more to say than he does about labor, toward which he is at best insultingly aloof. Even his most ambitiously leftish project, 100 percent "clean energy" by 2030, accomplished partly through a state-owned firm which could facilitate the transition, appears a pious desire rather than a firm promise (whatever a promise would mean from the same man who not only made it clear that everything else is secondary to austerity economics, but so casually discarded his prior pledges). And to read his subsequent speeches is to be numbed by the repetitiveness of what have come to seem Starmer's clichés.

The result is that Starmer early on more than confirmed any suspicions that he was a Blairite neoliberal-centrist who had posed as a leftie to neutralize a leftward challenge, only to revert to type as soon as it was convenient because, as he keeps saying, "I want to be Prime Minister." Indeed, in his justification of his scrapping of the pledges he exemplifies that particularly neoliberal habit of seeing the correct response to every crisis as a right turn (i.e. "disaster capitalism"), of always dealing with the troubles made by neoliberalism with . . . more neoliberalism.

Of course, Starmer might still "lead" Labour to victory in the next General Election for all that. At this point Britain has already seen over twelve years of continuous, catastrophic Conservative government (austerity, Brexit, a badly bungled handling of the pandemic in all its facets, a post-war era-style currency crisis complete with an IMF rebuke, and now, as always, more austerity in its wake) under a string of unfortunate Prime Ministers about whom it seems almost impossible to say anything positive--David Cameron (who staged the Brexit referendum as a bluff on behalf of the regulation-haters in the City--some bluff that), Theresa May (whose tears were only ever for herself), Boris "let the bodies pile high in their thousands" Johnson (even if he didn't actually utter the words his actions spoke them), Liz Truss (who thinks imported cheese is a disgrace and in the wake of presenting her fiscal vision got outlasted by a lettuce), and Rishi Sunak (who in his youth bragged about having no working-class friends, and probably isn't making many now with his austerity on steroids). In the process they have made it very, very easy for the official opposition to beat them--especially when it is led by a neoliberal who will be treated far more gently by business and the press than his predecessor, after which victory he will, of course, get to work disappointing whatever slight hopes he managed to raise in the voters in another turn of the widening gyre.

Wednesday, February 15, 2023

On the Entertainment Media's Coverage of "Bond 26"

It may seem odd that I am writing about the coverage of "Bond 26" rather than about the (as yet nonexistent but widely anticipated) movie itself. However, that coverage is itself odd--sufficiently so as to warrant some remark.

The reason is that the entertainment press keeps talking about the project--even though there is no actual news for them to report, let alone analyze. Either the producers are doing nothing at all, or the preparations are an extremely well-guarded secret. In either case the entertainment press is left with nothing to offer. Yet they keep talking--and in the process recycle the same speculations over and over and over again.

"Who will be the next James Bond?" they ask, and run through (mostly) the same list of candidates over and over again.

"What will the tone of the movies be like?" they ask. Will they stick with the more grounded material and darker, more "character"-oriented tone of the Daniel Craig era, or revert to the more flamboyant adventures of yore?

It was all actually talked to death around the time of Spectre--back in 2015. (You know, before the Trump presidency, before Brexit, before #MeToo, before the pandemic, before lots of things that can not unreasonably leave one feeling that the world has changed.) But the extreme delay in the release of No Time to Die, the movie that from the first was expected to be Craig's last, meant the entertainment press just had nowhere else to go. So, too, has the fact that in the year since that movie hit theaters there has been so little for those claqueurs-posing-as-journalists tasked with sustaining public interest in the franchise to work with.

Might that start to change soon? Certainly the film industry seems to be stabilizing, and along lines not much different from what prevailed before--small dramas and comedies ever less likely to be a theatrical draw, and ever more consistently relegated to streaming; but the big franchise action-adventure doing well, as demonstrated over the last year by hits like Spider-Man: No Way Home, Top Gun: Maverick and Avatar: The Way of Water in particular, and likely to be further reconfirmed as we proceed through what looks like a thoroughly blockbuster-packed 2023. The result is that Hollywood will feel more confident about pouring its money into such projects as a new Bond film.

Still, that has only been part of the problem. It seems that No Time to Die, to the extent that it did less well than some hoped (especially in North America), was held back not merely by the pandemic (undeniable as that factor was), but also by the franchise's plain and simple running down. This most venerable of high-concept action-adventure franchises--the one that can truly be credited with "starting it all"--is showing its considerable age, which if it were a person would have it looking forward to its pension a few years hence. Audiences were less interested, with this going especially for the younger age cohorts for which nostalgia plays less part, and the tropes are less resonant (having grown up on superheroes rather than secret agents), in an ultra-crowded market where 007 appears ever less special (indeed, perhaps even less their favored secret agent hero than the now similarly-approaching-retirement-age gang from the Fast and the Furious franchise). This has likely played its part in the hesitations of the producers about proceeding--and I have no idea how far along they may be in moving past that to settle on just what approach with Bond 26 will get the public excited about 007 again (assuming, of course, that is not a bridge too far), the more in as the budgets, and the grosses that alone can justify them, make such movies ever more a preposterous gamble for the backers.

Robert Heinlein's Friday, Again

In Robert Heinlein's novel Friday the character identified as the Boss remarks at one point that "a dying culture invariably exhibits personal rudeness. Bad manners. Lack of consideration for others in minor matters," and that this "symptom" of a dying culture "is especially serious in that an individual displaying it never thinks of it as a sign of ill health but as proof of his/her strength."

The remark was not a throwaway line, but relevant to the book's theme, precisely because the titular protagonist's problem was that of making a life for herself in a world that really was supposed to be falling apart.

Of course, people have always complained that their "culture was dying"--and as part of the package always complained about the decline of manners. It has especially been the case that the old have always complained about the young in this manner. And for precisely that reason I ordinarily discount this sort of thing.

Still, the passage caught my eye partly because there was a bit more nuance here than in the usual complaint--in particular the observation that the rudeness worsens as the culture declines, with the bit about mistaking it for "proof of strength" especially telling. A culture where people take such pride in being an "asshole" as ours does, in which people who act like "assholes" are worshipped by people whose greatest aspiration in life seems to be to become "assholes" themselves, is not necessarily "dying"--but it is far from healthy.

The Screen DC Universe Reboots: What Can We Expect?

The superhero movie genre has been booming from the very start of this century, with the entire period seeing the major studios barraging audiences with A-grade, big-budget blockbusters about the biggest household-name characters from the pages of Marvel and DC Comics--accompanied by a smaller but still significant traffic on the small screen (in the age of streaming, increasingly an extension of the big-screen stuff, such that you had to watch WandaVision to fully appreciate Dr. Strange 2).

Unsurprisingly, after being so heavily exploited for so long, the genre is not looking its best.

Consider where things now stand. The Marvel Cinematic Universe (MCU) has gone from looking unstoppable in Phase Three to underwhelming in Phase Four; and the DC Extended Universe has been a letdown again and again, such that Warner Bros. Discovery is officially giving up on the old vision (all as it likewise seems to be giving up on the small-screen Arrowverse). Amid all that, lovers of superhero film and television desirous of more would seem to have had, as their best hope for something fresh and interesting, the reports of James Gunn and Peter Safran's work on some grand reboot of the DC Universe for after 2023 (excepting the separate decision to make a sequel to The Batman, due out in 2025).

This month they have revealed part of what audiences can expect, including:

* A TV show all about Waller (the character Viola Davis first played in 2016's Suicide Squad), inventively named "Waller."

* A Green Lantern show "in the vein of True Detective."

* A Game of Thrones-style show (blood-soaked soap opera?) set on Wonder Woman's home island of Themyscira.

* A "much more hard-core" Supergirl pointedly produced by a horrific childhood spent getting to Earth.

* An "impostor syndrome"-themed show about Booster Gold.

* A Batman film (not starring Robert Pattinson) utilizing the content of The Brave and the Bold, complete with Damian Wayne in a "very strange father-and-son story."

That's not all of it, but I think you get the picture.

The entertainment press, of course, is responding with its usual enthusiasm.

I will admit to not sharing the feeling. Admittedly I have been losing enthusiasm for the genre for rather a long time--the genre seeming to me to keep going mainly on the form's convenient fit with the studios' particular marketing requirements rather than its having anything fresh and new to offer (and depending overmuch on such superficial tweaks as the "edginess" of Deadpool, etc. for any appearance of newness). But there is also something else evident in the pattern that has me not looking forward to any of these--namely that (as we ought to have expected from the writer who decided "Let's make Scrappy Doo the villain!" in a textbook example of the principle that the Teen Titans Go! episode "The Return of Slade" had to teach, and which the absolutely unteachable folks in Hollywood seem absolutely incapable of learning) Gunn is going all-in on the "dark," the "edgy" and the frankly pretentious. (A Green Lantern show "in the vein of True Detective?" Really?) Irksome enough in itself to those of us who think big dumb action movies should actually be fun, it is the more annoying because, rather than some dramatic shift-of-course in that respect at least, it actually sounds like just-more-of-the-same stuff that I, for one, didn't particularly care for in the first place.

Saturday, February 11, 2023

What Do The Fabelmans, Batgirl and Rust Tell Us About the Future of the Movies?

Especially as two of the three films I name in the title of this post have not even been released--one of them, in fact, unlikely to even be completed, let alone see the light of day--while the other movie has actually been very little-seen, it may be best to offer a word or two about each one of them.

The Fabelmans is a drama Steven Spielberg made on the basis of his experiences growing up that, if carrying the highly salable Spielberg name and much loved by the critics, was probably never going to play like Indiana Jones or Jurassic Park, or even Schindler's List or Lincoln. Still, box office-watchers had expected the film to do better than it has. (Its first weekend in wide release saw it take in a mere $3 million, while, even with the bump from its slew of seven Oscar nominations--including Best Picture, Best Director and two for the cast's performances--all it has taken in at the North American box office is a bit under $17 million--less than half the reported production budget of $40 million.)

Batgirl, as the reader has probably guessed, is a Warner Bros. Discovery production starring just that character intended for streaming. The production, which ran into trouble partly on account of the pandemic, and underwent a change of directors, saw its budget swell from $70 million to over $90 million. Subsequently there was a change of leadership at the company, which judged the resulting movie's prospects in either the streaming market or in theaters too poor for the movie to be worth completing, and, according to the prevailing reports, decided to just bury the unfinished movie and take the tax write-off.

Rust is an upcoming Western starring Alec Baldwin (who also has story and producer credits)--and if you have heard about this one it is probably because Baldwin, and the movie's armorer, Hannah Gutierrez-Reed, are up on charges of "involuntary manslaughter" as a result of his having fatally shot someone on the set with a gun he did not know to be loaded.

These are quite different films, with not much in common between them, but each has been recognized as indicative of a significant trend in film production. The Fabelmans has been taken as an indication of the continued decline of the salability of drama as a theatrical experience--people generally opting to watch such movies at home. Batgirl has been an indication of the media companies' drawing back from their earlier splurging on streaming in recognition of its limits as a revenue stream (no substitute for selling $15-$20 tickets in theaters, and indeed unlikely to justify the cost of a mid-to-high-budget production like Batgirl). And Rust, where amid talk of the on-set tragedy one constantly hears reference to extreme low budgets and corner-cutting (which, for his part, actor Michael Shannon has called out as part of a bigger pattern), is indicative of the penurious tendency in the financing of those films that are not "the biggest."

The result is that, while the pandemic period saw some experimentation with streaming in unprecedented ways--Disney making grade-A blockbusters like Black Widow and Mulan available at home for a surcharge on their opening weekend in theaters--any radical shift in the making and marketing of such content seems to have been made implausible by the results, the studios focusing on those theatrical releases and their as yet irreplaceable $15-20 a ticket revenue stream. These blockbusters (the enduring place of which is demonstrated not just by failures like The Fabelmans but undisputed successes like Spider-Man: No Way Home, Top Gun: Maverick, Avatar: The Way of Water) are ever more dominated by a very few genres, above all action-adventure (and to a lesser extent, splashy animated features dominated by musical comedy), to the point of everything else not only being crowded off the multiplex screens, but people falling out of the habit of catching such films there (arguably the more easily as the theatrical, giant-screen experience provides less "added value" with other movies than it does with the noisy spectacles).

Meanwhile, with producers disabused of their illusions about streaming, everything else is being made more penuriously, not only as Rust was, but such that WBD and its rivals would probably not authorize a project like Batgirl today (too costly to be a practical for-streaming project, but still too small a production for theaters, with this logic seemingly confirmed by the bigger budget for a Blue Beetle movie intended for theatrical release), and a movie like The Fabelmans, especially when not made by a Spielberg, has ever less chance of being authorized as a production on such a scale.

In short--giant action movies and cartoons at the theater, and the relegation of pretty much everything else to cut-rate streaming, with very little in between. For the film-maker and the film-lover alike it is not a happy situation--a hardening of the already oppressive constraints on what could be made. However, it seems far better to acknowledge that forthrightly than indulge in the mindless boosterism so tiresomely standard for the entertainment press--the more in as it is really just a confirmation of where most of those with any observational skills at all knew things were going way before the pandemic.

Are Commercials More Annoying Than They Used to Be? Yes, Yes They Are.

A certain sort of person shrugs off other people's observations about the ways in which the world may be changing with the view that everything is always the same, and any perception of change is subjective and meaningless. The bias of contemporary culture toward respect for ideas that look like "eternal verities," the frequency with which people offer superficial observations that are easily debunked, and the "air of superiority to the world at large" with which such statements tend to be made, allow that shrugging banality and cop-out to pass for great wisdom. But all the same, that objective, material world is out there, and we are undeniably part of it and affected by it. Yes, there are times when life probably really is getting harder for most people--and where such evidence goes it is hard to argue with something like life expectancy, with falling life expectancy proof of genuine hardship. Consider the trend of things in the U.S. this past decade, in which (according to CDC/World Bank data) male life expectancy (already eroding before the pandemic) fell over three years in just the 2014 to 2021 period (from 76.5 to 73.2 years), and female life expectancy a more modest but still significant two years (from 81.2 to 79.1). All of this is the more the case as it is virtually undisputed that the deaths lowering that expectancy (the pandemic apart) are "deaths of despair" (alcoholism, drug abuse, suicide, etc.), while the averages conceal significant differences, with the poor suffering more than these figures indicate. Even before the trend had progressed so far the richest 1 percent of men lived 15 years longer than the poorest 1 percent--while the erosion of life expectancy in the years since would seem to have hit the poor harder than the rich (the portion of it related to the pandemic included) . . .

"Wait, wasn't this going to be about commercials?" some readers are doubtless asking.

Yes, in fact it already is. Specifically it seems to me that in a world where people are suffering so much--the actual deaths only the most extreme outcome of the untoward larger trend--even the little things are weighing on us more.

Like the annoyance caused by commercials.

Still, I do think the actual experience of those commercials is relevant. There is how advertisers strive harder and harder to penetrate an ever-more cacophonous media universe to reach a hyper-saturated audience that wants absolutely nothing to do with them, and has developed defenses against them. There are such disgusting "innovations" as autoplay for videos no one would choose to play, so that before you have even spotted the video you hear the commercial blaring and are left hurriedly trying to locate it and shut off the distraction so that you can get on with what you came for. There is the way that when we go online to some website we will be hit in the face with a popup demanding we disable our ad blockers--the element of coercion, and the fact of our putting ourselves to the inconvenience of disabling our protections (and then having to enable them again as soon as we leave), acquiescing in their harassing us, making this the more distasteful. There is their readiness to make their content annoying as the price of keeping us from shutting them out mentally--one tactic the design of commercials so that after seeing them a hundred times you are still not sure what they are about, or even what they are selling, this approach of catching the viewer's attention to insidiously imprint some subconscious association unbelievably cynical and obnoxious (while in the case of some of us it makes us wonder just what the ad was about, so that we consciously pay attention, manipulated yet again).

Meanwhile there is the ever-greater attention people are encouraged to devote to "status politics." The result is that the cultural traditionalist will not only be annoyed by what they experience as a ceaseless barrage of "representation," "counter-stereotype" and the like, but the sight of it will be the more charged because of that larger context--every commercial, like everything else, a battle in the culture war. Others who are not averse to such politics may still be annoyed--objecting not to "wokeness" as such, but to the "corporate" kind. And still others who have no objection to even corporate wokeness will be the more annoyed on the occasions when a commercial is not so "woke." (In short, everyone walks away unhappy, which frankly is the point of elevating such politics.)

One may add that the politics can seem to have had a subtler impact on the commercials. Traditionally commercials relied on the casting of physically attractive persons in them. They relied on an element of glamour and, yes, sex appeal. Humor was important, too. It is not always clear that all this actually helped sell more goods. And often it annoyed as many as it pleased. But for those who were amenable to the content--frequently to the point of, every now and then, actually liking the commercial in question, even when they didn't like commercials generally--it made living in an advertising-soaked media universe easier to take, a single good commercial buying a measure of forgiveness for a lot of bad ones. Now--and I admit this is not the sort of opinion you would see uttered in the mainstream media--in the view of a non-negligible portion of the market, when judged by traditional, conventional criteria, advertising has retreated from the use of attractive models, from glamour, from sex, while altering its use of humor (much of which now seems to revolve around subverting the old expectations by delivering precisely the opposite of what people would have expected--and where they would have preferred it, frankly annoying them rather than making them laugh). For those who feel that way all this removed the sweetener, leaving just the bitter pill that was an unwanted sales pitch being shoved down their throat--while those who saw the now-vanished content as objectionable merely saw something objectionable diminished, commercials made less unpleasant than before in certain specific ways, which is a different thing from their actually being made pleasant, the bitter pill remaining a bitter pill. And the result, of course, has been an audience with still more reason to be annoyed by the multimedia onslaught on their senses and their wallets.

The "Other" Stan Lee

When we speak of a writer named Stan Lee people will probably assume we mean the Stan Lee associated with Marvel Comics, where, along with Jack Kirby, he has generally been credited as the driving creative force in the company's "Silver Age" ascendancy, co-creating such icons of the form as the Fantastic Four, Spider-Man, the Incredible Hulk and the X-Men (among many others).

However, it is notable that just as that Stan Lee was making comics history another Stan Lee--Stanley R. Lee--was also making history. A "Mad Men"-era adman at the firm of DDB, he wrote the copy for one of the most famous political advertisements in American history, the "Daisy" ad aired during the 1964 presidential election race by Lyndon B. Johnson's campaign, alluding to the cavalier attitude toward nuclear war that Barry Goldwater had displayed in his public statements. (That attitude was a major theme of the campaign, with opponents of Goldwater parodying his slogan "In your heart you know he's right" as "In your guts you know he's nuts!")

As it happened the theme of nuclear war was one that this Stan Lee stuck with as, like a great many other ad-men, he turned to publishing fiction. When we think of the '80s--an era in which the Goldwaterite tendency within the Republican Party, defeated in '64, captured the party and increasingly set the tone for American politics, down to this day--we remember it as a militaristic period, with Rambo: First Blood, Part II and Top Gun among its biggest box office hits, and Tom Clancy not only edging out Robert Ludlum to become the leading thriller novelist of the day, but the highest-selling novelist of the decade, period. Still, the decade also saw backlash against that tendency in a powerful movement against the nuclear arms race that also made itself felt in pop culture, in such ways as the miniseries The Day After (watched by an estimated 100 million Americans) and hit films like WarGames and the Strategic Defense Initiative-satirizing Spies Like Us, while if writers like Clancy offered images of a "winnable" World War III in works like Red Storm Rising, writers like David Brin in The Postman offered a very contrary view.

This "Stan Lee" made a notable contribution to the stream with his novel Dunn's Conundrum (1985) (published under the name "Stan Lee"), a cleverly satirical "anti-techno-thriller" about an intelligence agency with a very dark and dangerous function in the nuclear arms race and the Cold War confrontation under a government that was clearly completely unhinged to allow such a thing--and the whistleblower intent on exposing and shutting down the operation. Five years later Mr. Lee followed up the book with his second novel, The God Project (1990), a similarly striking satire that, if still imagining the Soviet Union as a going concern in 1997, shifted its attention to a world of ubiquitous small wars and aspirations to making American intervention in them via automation, taken to its imaginative limit (with, to boot, the big secret bound up with a left-wing Christian movement).

So far as I can tell Lee did not publish a third novel before his early passing in 1997--while sadly the books seem to have sunk into obscurity (as the sheer scantiness of the Wikipedia page devoted to Lee reminds one--the single, two-sentence paragraph not quite filling three lines). It is a great shame, especially in an era in which few writers would dare to produce such works, not least out of pessimism about any publisher being willing to take them, leaving the niche unfilled--but the books, at least, remain "for your consideration."

H.G. Wells and World War I

As I have remarked time and again H.G. Wells is these days little remembered but as a science fiction novelist--and at that, a producer of that handful of early novels that Hollywood and its foreign counterparts endlessly remake (The Time Machine, The Invisible Man, The Island of Dr. Moreau, The War of the Worlds). His later science fiction works generally go ignored, and still more his realist novels like Tono-Bungay (assailed by critics from Virginia Woolf to Mark Schorer in line with the unfortunate trend of letters over the century). Meanwhile as a historian and social and political thinker of whom George Orwell, even when criticizing him brutally, could say that in the Edwardian era he had seemed an "inspired prophet," such that he "doubt[ed] whether anyone who was writing books between 1900 and 1920, at any rate in the English language, influenced the young so much," it can seem that even less is left of him. (Indeed, it seems all too symbolic that Warehouse 13 recreated the character along lines of post-modernist, status politics-minded wish-fulfillment, while it seems that a great many are salivating at the thought of "canceling" him, and seizing on any excuse to do it.)

All of this has struck me as profoundly unfair, and indicative of profoundly unhealthy tendencies (including a contempt for history, and colossal hypocrisy in some quarters). Still, while much of his accomplishment stands the test of time it is clear that his contributions had their limitations as well. I think of how in works like The War of the Worlds, The War in the Air and The World Set Free Wells fiercely opposed imperialism, nationalism, militarism and much else associated with them, pointedly warning against the coming of a conflict like the First World War and its effects--but then, when that war actually broke out in 1914, propagandized for the Allied cause in works like Italy, France and Britain at War (1916), where he portrays the Allies as fighting to end the old evils and bring about a better world, in line with the narrative that this was a "war to end war."

Post-war Wells abandoned such conduct, assessing and condemning the conflict in line with his long-established principles, but his war-time propaganda was an undeniable betrayal of his intellectual and political principles. Indeed, one finds him having to reckon with the "extremely belligerent" journalism he produced during the conflict in his Experiment in Autobiography (1934). In discussing that lapse he acknowledged that "sanguine exhortation . . . swamped [his] warier disposition towards critical analysis and swept [him] along," with his mind not "get[ting] an effective consistent grip upon the war until 1916," and Wells requiring "months of reluctant realization . . . to face the unpalatable truth that this . . . 'war to end war' . . . was in fact no better than a consoling fantasy," that the reality was that the Allies were playing the same dirty old great power game "in pursuance of their established policies, interests, treaties and secret understandings . . . and that under contemporary conditions no other war was possible." Despite that, even in the last year of the war he kept hoping for an "indignant commonsense" that would sweep not simply the Hohenzollerns but the whole of the current political system out of existence--even as in his journalism he could still spew anti-German propaganda.

And remembering that I find myself remembering, too, the limitations of his vision. He saw rather clearly that the world was a mess, and he rather clearly pictured something better as a necessity, but he was not so strong on how the world could get from one state to the other. Indeed, in The World Set Free, written the very year before World War I's outbreak, he imagined that the leaders of the warring powers themselves, facing the destructiveness of the conflict they had unleashed, would themselves work toward ending the fighting and building that better order.* As he himself realized before the end of the conflict this was not happening, and indeed he looked in other directions--but more pessimistically, and not necessarily convincingly.** Thus was he susceptible to persuasion that the better order to which he looked forward could somehow emerge from the Allied victory--and go on to encourage others in such illusions.

I dare say that others who saw the situation more clearly were less vulnerable on this score (not just a John Maclean, but even a Keir Hardie, from whom the Labour Party's current Keir could not be more different in sensibility and style). However, a remembrance of the matter would not be complete without remembrance of how the war-time atmosphere so "swamped his warier disposition." The climate of a country at war, the anger and fear and hate, the authoritarianism and intolerance, all of which get exploited to the full by those who stand to gain from them, are not conducive to critical thought, or free discussion--and invariably, when some measure of the insanity passes, many are ashamed of how they acted, little good as their realization does anyone given how readily those who made the mistake the last time usually plunge into it again, certainly for having failed to learn anything from the experience--and, one may as well admit it, for lack of character too.

* The war in The World Set Free was a nuclearized conflict, which saw hundreds of cities hit with atomic weapons, and therefore the more severe and frightening--but all the same the actual world war may be taken as showing the falsity of his expectations.
** In The Shape of Things to Come the world's leaders fail to stop a war which, while not nuclear, saw the aero-chemical attacks on cities that were the interwar period's equivalent of the earlier book's atomic strikes, after which the war ran its course--to international breakdown (the war, a less vigorous thing than it might have been as a result of the combatants' having already been exhausted, followed by a pandemic readable as a more murderous equivalent of the post-World War I Spanish flu). The result was that here the World State emerged out of post-apocalyptic ruins, built by "practical middle-class types" who realized the old game was up.

Thursday, February 9, 2023

What Will TV Be Like in 2026? (And Beyond!)

CNBC recently sat down with a dozen senior figures in the TV industry and asked them a series of questions about just where they think that industry--and the experience it provides the consumer--is going, and published a "sampling" of the answers.

Going by what they published there was no consensus--the guesses these figures offered varying greatly. Candle Media CEO Kevin Mayer, for example, in response to the question of where "legacy TV" would be in three years held it to be in its "death throes"--while the rest were generally less pessimistic, with The Ringer founder Bill Simmons suggesting that instead TV, like radio, will linger on.

Still, it seems possible to distill the assorted answers into the following: regular, legacy TV will still be around, the content it produces from dwindling budgets watched by a dwindling, aging viewership, and kept alive by news and sports (the high fees charged by which are yet another reason why there is less to fund other stuff). It is also likely to have help from its syndication earnings, and from the very same streaming which is its rival, as streaming has, of course, been having its own problems funding content (as those who have seen the drama at Netflix, Disney+ and Warner Bros. Discovery are all aware). Indeed, streaming is likely to need help from legacy TV with content funding, and at the same time to increasingly sell its content to legacy TV, while there is likely to be consolidation in both legacy TV and streaming, with failures and mergers thinning out the ranks; and some aggregation, too, though as Barry Diller put it, the customer getting the convenience of "[o]ne central warehouse who deals with all the players and sends one bill" is unlikely.

The vision seems fairly plausible, but also fairly safe (reflecting, I think, more guarded expectations regarding streaming compared with the bullishness of a couple of years ago when people fancied streaming replacing theaters, etc.). Still, there are a few more exotic predictions in there--among them, the integration of betting into sports viewing (gamblers placing bets through their remotes using voice command), artificial intelligence (AI) facilitating the subbing and dubbing of content to make it easily marketable internationally ("without a third-party dubbing it for you"), and most exotic of all, the viewer getting a measure of editorial control over some of the content (citing Netflix's Kaleidoscope as an example).

What are we to make of these last? I suspect it will take a lot more states legalizing sports betting--and specifically online sports betting--to make the kind of technological set-up that former CNN President Jeff Zucker envisions worthwhile for the relevant businesses (though they may, of course, bet on this happening in short order). I think it will probably be a fair while before AI does much to speed along the subbing/dubbing process--and that if we had AI that could dub shows effectively, we might see AI not only replace a great deal of staff in this area (the dubbing business is not an unimportant source of jobs for actors, writers, directors, etc.), but perhaps more broadly take over the voicing of, for example, animation (which, along with other advances, might enable fans to easily and cheaply create their own). But I think that far from becoming a significant new way to watch customizable narratives (an idea we saw tried before on DVD, where you could include or exclude a few scenes out of the running time) it is likely to remain a gimmick for a very long time to come because that is simply not how storytelling works.

Is the Future Solitary?

From time to time the folks in the news media point out, in some fashion or another, the fact that more people are living alone than before--or at least, not with a partner (for instance, that more adults of this or that age category are living with their parents instead). In line with its long-established ideological tendency the media does not consider the matter very deeply or critically, but it does point it out, and even notes what it means for individual isolation and its consequences. After all, in a society where for a great many reasons extended family is a faint presence (Edward Luttwak was to comment that "Americans are . . . as poor in their family connections as Afghans or Sudanese are in money"), work is unlikely to provide much in the way of relationships (because it is insecure and tenuous, because organized labor and its associations are so much weaker in the U.S. than elsewhere, because everyone commutes to the job from so far away, etc.), and the sense of community generally has always been fragile (decades ago a sociologist could observe that we are "bowling alone"), the nuclear family has been critical to any kind of social connectedness. Moving out, setting up a household alone or with another person, and ultimately marrying and starting a family, for all its imperfections (divorce rates are high, even when people don't divorce they become alienated from one another, children move away and don't call, etc.), is pretty much the only thing that provides people with significant social connections for any length of time. (It is even critical to the non-familial connections, because of the prospect of association with other married couples, and parents, and their connections, in a way less accessible to single persons.)

But that, of course, has its economic requirements, especially if one means to do so on the at least "quasi-middle class" terms people have come to expect--and it has become much harder for most to meet those requirements, with the young taking a particularly hard hit.

Most obviously there is the increasingly crushing cost of living relative to income for the majority, with college degrees costing more and delivering less, and the burden of student debt and the cost of housing particularly notorious among the problems making life hard for the young--with this going even for those who do as they are told and opt for STEM instead of the humanities (as, contrary to the whining of some, more and more have been doing).

One reflection of the hard times would seem to be that, even with American cities as unwalkable as ever and as ill-provisioned as ever with public transport (even New York's subway system is crumbling), the young drive less, which in much of the country means an enormous restriction of their personal mobility, to the point of significantly diminishing any prospect of a social life (and certainly any dating opportunities).

And of course the pandemic has not helped. The media, again in line with its prejudices, lavishes its attention on the "anti-lockdown" and "anti-vaccine" groups and completely ignores the other end of the spectrum here--those who think government has not done enough, that individuals are not doing enough, to control the pandemic, and who as the pandemic keeps raging have changed their habits, perhaps especially to the extent that their particular living situation makes them more vulnerable. (Someone who lives in an apartment building with its narrow hallways and shared utilities, someone who cannot or does not drive and has to ride public transport instead, has had to work much harder to keep themselves safe, if so inclined.) Meanwhile, even those totally indifferent to the problem have seen the escalating price of essentials reduce "real," inflation-adjusted wages as, once more, the price of housing in particular exploded.

Yet alongside all these very material factors which it is scarcely possible to deny (though not for lack of trying on the part of an ever-downward-punching media ever-sneering about "millennials," and out-of-touch billionaires mouthing off about young people's supposed passion for "avocado toast") there are subtler issues. There is, for instance, what it means to not be doing well economically, and the ways in which this gets in the way of any attempt to "have a life" not just materially but psychologically, in a very unequal, very hierarchical society where conformists endlessly insist that the world is one big "meritocracy," and on that basis accord themselves carte blanche to be brutal to those who are not doing well. People get what they "deserve," such a society insists--and those who do not have enough don't deserve more, with no one having to be nice about it, and indeed everyone free to be as nasty as possible to the "loser" and the "failure"--while accusing any of these who would object at all to such treatment of having an unwarranted sense of "entitlement," and telling them that any emotional distress they are suffering is nothing more than "self-pity." Enough in itself to make them want to keep to themselves, society's inequities hit people especially hard in the "mating market," the "marriage market," which in a very important sense live up to the implication of the word "market" with all its brutal connotations--enough so that those whose experiences have been unpleasant can get very impatient indeed with the hypocrisies one is obliged to respect when raising the subject, the supposedly reassuring banalities that can only seem insultingly patronizing to anyone past a certain not-very-great age.

The result is that while marrying and having a family may be the principal alternative to aloneness, society's pressure on people to do so is not what it once was in an atomized, anonymous urban world, and those who (often as a result of much painful experience) expect not to do well in the market--to not find someone they want to be with, or who really wants to be with them (let alone someone they want to be with who also wants to be with them, with both thinking they might want that for life)--find it easier to resist the pressure; to, as Al Bundy would have us do, "Just Say No." And indeed, even were life to get somewhat better for many--the economic and other stresses to be alleviated--I suspect that we would still see more people than before "bowling alone," and in general living alone. Because even if being alone may be the first choice of only a very few, in anything like the situation we have now many more will find that it hurts less than being with others.

Wednesday, February 8, 2023

Is Blogging Dead? Not Exactly, But . . .

Typing the keywords "Is blogging dead?" into the principal search engine of the day invariably leads one to a long list of links to sites emphatically answering "No, of course not!" as they proceed to offer glib advice--as they try to sell you something (for instance, one of their blogs).

Thus does it go with today's Internet--ask a question, and get not an answer but a sales pitch--in a veritable monument to what Cory Doctorow so rightly calls the "enshittification" of the Internet.

Naturally, getting somewhere requires us to think more precisely. Clearly people are still writing blogs, and the evidence is that many blogs are still attracting readers, and so in that sense blogging is not dead.

But I do not think that is what most have in mind when they raise the question. People "blog" in many ways, for many reasons. There are blogs operated by major businesses and other institutions, for example, where prominent individuals put out statements of public consequence for a limited expert audience likely to retail what they have to say to a broader public.

However, for the vast majority of the world blogging is nothing like that. What they mean when they speak of blogging is a person who is not connected with a major platform, and who is not in their own right famous, setting up a platform of their own with little or no money spent, no assistance from anyone else, and no prospect of making their blogging a full-time activity, in the reasonable expectation of presenting their words to the world and somehow finding a readership. If they have something that resonates, maybe a big readership. Maybe even going viral!

And what they mean when they ask "Is blogging dead?" is the extent to which such a hope exists.

The boosters on the aforementioned web sites to which the search engines so relentlessly lead us have no interest whatsoever in the question "Is blogging dead?" when asked in that sense.

My answer is that, contrary to the impression given by an idiot media ever trafficking in aspirational garbage that if actually followed is apt to be self-destructive, the hope was always extremely slim. The most obvious reason is that the ratio of "producers" of online content to "consumers" has been extremely high from the start, making the scene something like The Hunger Games, times a million--with some much more disadvantaged than others, like those trying to offer something of substance via a medium not particularly well-equipped for its provision. (After all, as one writer who so far as I can tell never even tried to offer anything of substance was well-equipped to explain, "The Internet is a communication tool used the world over where people can come together to bitch about movies and share pornography with one another"--not have intelligent discussions.)

It has only got worse since. Much, much worse. I can think of at least four very well-attested reasons why.

1. The decline of free-range Internet searching. Instead people access the Internet through social media and its associated filters, making "discovery" less likely (even if you get your blog onto social media).

2. The increasingly pay-to-play character of the Internet entailed by the aforementioned "enshittification," where others' search engine optimization, ad-buying and the rest get them ahead in the search rankings at your expense--while any effort you make to get ahead without laying out the cash gets you penalized. (Kind of like how the people who hand out Oscars wanted to punish Andrea Riseborough for having got an Oscar nomination by way of low-budget promotion by people who were actually enthusiastic about her movie and her performance instead of a $30 million studio campaign.)

3. The favoritism search engines show to "authoritative" content in the name of keeping the minds of the masses unsullied by "fake news" and the like. This means that the non-"authoritative"--anyone not connected with a big, big-budget platform--gets pushed down in the search rankings. And "This means you!" (Probably.) Finally, there is

4. The feedback loop that amplifies all of these tendencies. Get demoted in the search rankings for any reason--or more likely still, for many of them--and fewer people will happen upon your blog. This will get you downgraded yet again, making that many fewer people happen on you again. And so on and so forth in a vicious circle toward a readership of exactly . . . one person, yourself. (The toy sketch below shows how quickly such compounding runs down.)
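By way of illustration, here is a toy model of that vicious circle--my own invention, with made-up numbers (the starting readership and the retention fraction are purely hypothetical), not a claim about how any actual search engine behaves:

```python
# Toy model of the search-ranking feedback loop: each period a demotion
# costs the blog some fraction of its incoming readers, and the smaller
# readership triggers a further demotion. All parameters are invented.
readers = 1000.0   # hypothetical starting monthly readership
retention = 0.7    # hypothetical fraction of readers kept after each downgrade

for month in range(1, 13):
    readers *= retention  # fewer readers -> lower ranking -> fewer readers
    print(f"month {month:2d}: ~{readers:.0f} readers")

# After a year of this compounding, ~1,000 readers dwindle to about 14--
# the vicious circle toward a readership of one, in miniature.
```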

As if this were not enough there are the other factors that make blogs generally less likely to land an audience--like a discourse adapted to 140 (or is it 280?) characters at a time, max, and people flocking to "vlogs" over blogs because, reinforced by declining literacy levels, vlogs don't require you to "Read stuff." And so on and so forth.

All of this has turned the Hunger Games-times-a-million into, I suspect, something more like the Super Hunger Games-times-a-billion, the "attention economy" becoming that much more brutal (and punitive), which has made things worse yet again by encouraging lowest common denominator content. Thus does it seem that those who do write independently incline ever more toward the clickbaity--to the gossipy and the confessional, to hot button-pushing, to meaningless lists and worse than meaningless self-help, to forgotten-in-five-minutes ultra-topicality. Meanwhile, if things do indeed still go viral (I am doubtful that they actually do, but people say it happens nonetheless), it is almost always the case that they are exceedingly stupid things, about which only stupid people would care.

Like stupid Ben Affleck looking constipated at a stupid awards show, and people turning it into a "meme," and commentators remarking on the face in the meme as somehow the very picture of the world's misery in 2023. (A world sinking ever deeper into a polycrisis of economic and ecological collapse, where hunger spreads and pandemic after pandemic rages, and war spreads and rages, with the fighting in Ukraine possibly already escalated into World War III--and the visage of this narcissistic mediocrity, whom life has treated obscenely well next to 99.999 percent of the other people on the planet, looking slightly less than pleased with life as he sits next to Jennifer Lopez at the damned Grammys is the "mask of unadorned misery" to these people? Even as irony, which is profoundly overrated, this is too much.)

Looking at such drivel it is very, very, very hard indeed for even someone as critical of the "Idiocracy" view of the world as I am to believe that the world has not already degenerated into Mike Judge's dystopia--the more in as any blogger who is not catering to complete nitwits has to compete with all that. The result is that where their chances of commanding an appreciable audience today are concerned, if the figure is not quite absolute zero it is awfully close to it--justifying the view that, for them, blogging, if not dead, may as well be.
