Saturday, March 18, 2023

Just How Well Would Shazam 2 Have to Do to Be a Winner?

What would it take for Shazam 2 (Shazam! Fury of the Gods) to be a hit?

None but the insiders can say with any great precision what it would take for this or any other film, given the opacity of film budgeting and revenue. Yes, we are often given figures for production budgets--but these both overstate and understate the expenditure. Dubious accounting can be used to either end--overstating a budget to make sure "there is no net," or understating it to make sure there is net enough for bonuses to the principals. On the side of overstatement one may add that as a practical matter government subsidies and product placement often cover part of the cost; and on the side of understatement, the bill for publicity--which in this utterly insane attention economy can match the cost of production.

My response to this is to play it safe and assume that, whatever the production budget was, somebody spent twice that much getting the movie made and marketed.

Meanwhile there is the matter of the revenue. We hear that a movie "grossed" so much. But that is the total in ticket sales, much of which may not go to the studio. Theaters keep their share of the revenue (typically getting a higher cut the longer they run a movie), while where overseas earnings are concerned there are foreign distributors to think of--all as the details of particular deals vary. (For instance, a studio with what everyone thinks is a sure-fire winner may demand an exceptionally large share of the opening weekend sales.) No one knows how the staggering number of not-so-little deals entailed in getting a movie screened all over the world will work out until they actually do work out. Still, the studio usually gets the equivalent of 40 to 50 percent of the gross.

Of course, there are other, non-theatrical revenue streams. Like merchandising. And TV rights. But the value of these is often tied to the expectations, and actuality, of a movie's theatrical run--the merchandise for a movie no one liked not likely to sell very well. Moreover, if those revenues are often hugely important, as the straight-to-streaming experiments of the last few years demonstrated there is little that can compete with a $20 ticket for cash-in-hand. The result is that a profitable theatrical run generally remains the standard of judgment, the movie that (these days, at least) only covers its cost after years of money trickling in from other sources understandably being deemed a failure financially.

So, back to the 40-50 percent figure. According to my rule of thumb that 40 percent should equal twice the production budget--so that one can say that the global gross should be five times that budget.

With Shazam 2 we have a movie with a budget reported as $100-$125 million. That could see the movie be profitable with $400 million taken in (or even less, depending on how it was financed). However, it would seem safer to call the movie profitable if it crosses the $600 million mark.
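The arithmetic here is simple enough to sketch out. The following is a minimal back-of-the-envelope calculation, using the text's rule of thumb (total spend roughly twice the reported budget, studio keeping 40-50 percent of the global gross) and Shazam 2's reported budget range; none of these figures should be taken as authoritative.

```python
# Rule-of-thumb profitability estimate: total spend ~= 2x the reported
# production budget, and the studio keeps ~40-50% of the global gross.

def breakeven_gross(budget, studio_share=0.40, spend_multiplier=2.0):
    """Global gross needed for the studio's cut to cover total spend."""
    return budget * spend_multiplier / studio_share

# Shazam 2's reported budget range ($ millions)
low, high = 100, 125
print(breakeven_gross(low))        # 500.0 -> ~$500M at the cautious 40% share
print(breakeven_gross(low, 0.50))  # 400.0 -> ~$400M if the studio gets 50%
print(breakeven_gross(high))       # 625.0 -> roughly the "$600 million mark"
```

This is where the "five times the budget" figure comes from: a spend multiplier of 2 divided by a studio share of 0.4 gives a required gross of 5x the budget.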

Does it have much chance of that? Well, the first Shazam took in $366 million in early 2019. That is, to go by the Consumer Price Index, about equal to $430 million at last (February 2023) check. Should the new movie merely match the original's performance it might make it--but I suspect it would fall short of really being a success.

Meanwhile the actual expectations for the new movie's gross are . . . somewhat lower. The current Boxoffice Pro prediction for the opening weekend is $35 million. Should the film, like many a sequel to such a movie, take in forty percent of its money on opening weekend it would have a hard time grossing $100 million in North America. Should the balance between domestic and international gross approximate that of the original one would see the movie make perhaps $250 million--rather less than its predecessor globally as well as domestically. And it is far from implausible that it could do less business than that.

The result is that, while there was never a very good chance of a Shazam 3 given the big shake-up at DC, I doubt the executives at WBD will be tempted to give the franchise a third shot very soon.

Comparing and Contrasting Box Office Underperformance: Black Panther 2 vs. Ant-Man 3

When looking over Black Panther 2's box office numbers back in the autumn I argued that there was a case for regarding the film as a commercial failure--as it had made, in inflation-adjusted terms, only half what its predecessor had, while given its big budget (reported as $250 million) it may have been short of covering its cost at the end of its theatrical run.

While writing for the Internet tends to have a very short half-life, I have seen some of my posts here go on steadily accumulating views (and even comments) for years, and others get very little attention after their initial appearance. Comments on "this weekend's" box office, to no one's surprise, tend to fall into the latter category. The result was that I was surprised when that particular item had a significant uptick in views months after the fact. Perhaps this was a function of the fuss over the Oscars (which left some feeling that Angela Bassett's performance in that movie was snubbed when Jamie Lee Curtis took home the statue). However, it also seems likely that it is a matter of how the next Marvel movie, Ant-Man 3, became an object of a similar "hit or flop?" argument--a more conspicuous one.

It seems to me that one can say that both movies' performance was not all that was hoped for by their backers and fans given the expectations ordinarily attaching to Marvel releases, the particularly strong reception for the original Black Panther, the resources invested in the movies--and of course, the feeling that Marvel was overdue for a "win." However, people seemed more inclined to say this in the case of Ant-Man 3, with this, the culture wars aside, a matter of the raw numbers. Black Panther 2 took in over $400 million domestically and $800 million globally--numbers not ordinarily associated with the word "flop," with, again, my suggestion that it could be seen as a failure a matter of contrast with the extreme success of the first movie, and the extremely high cost of the second. By contrast Ant-Man 3 (which is not very much less expensive than Black Panther 2, budgeted as it is at $200 million) has, after a month, barely scored $200 million in North America, while its prospects of reaching the half billion dollar mark are in doubt. Again, those numbers are not ordinarily associated with flops, but, helped by a less-strong-than-hoped-for second weekend, more easily let the media commentariat come to that conclusion.

Ant-Man 3's Underperformance and the Trajectory of Marvel

In the view of the analysts I have read the disappointing box office performance of Ant-Man 3 was a measure of the viability not only of a relatively minor superhero movie, but of the bigger Marvel Cinematic Universe of which it is a part--with the disappointment the more significant because of how the franchise has fared in recent years. Once going from strength to strength, all the way up to the two-part confrontation of the MCU's assembled heroes with Thanos (a two-film, $5 billion-grossing event whose second part became the "highest-grossing film in history"), it has been very different with the subsequent "Phase Four." Certainly its problems (and particularly those of its first three movies--Black Widow, The Eternals, and Shang-Chi and the Legend of the Ten Rings) have been partially attributable to the pandemic, and at the international level, also to Phase Four's shutout from China's market. However, it is undeniable that the content itself has been an issue, with anything after Phase Three's build-up to and realization of the battle with Thanos an anticlimax--no "going bigger" really plausible here--while there was a pervasive sense of the pursuit of diminishing returns, be it in the turn toward prequels with Black Widow, the turn to more obscure characters in The Eternals, or the all too characteristic weariness and tonal confusion of a series gone on too long in Thor: Love and Thunder, even before one considers such awkwardness as the excessive eagerness of Disney to tie its feature films to its small-screen streaming series in Dr. Strange 2, or the presentation of a Black Panther sequel without . . . Black Panther. The result was that only the Spider-Man sequel--again a one-of-a-kind event in its drawing together the stuff of three separate big-screen Spider-Man series into one sprawling "multiverse" narrative--really felt like an event.
And unsurprisingly, only that one performed really "above and beyond" at the box office (in fact becoming the movie that, with almost $2 billion banked globally, proved "filmgoing is back").

The result has been that even the claqueurs of the entertainment press no longer applaud Marvel so loudly as they once did, instead admitting that all is not well. One could see this as setting up audiences for a story of fall and redemption ("Hooray for Phase Five! Hooray for the return of Bob Iger! And boo Bob Chapek and Kevin Feige! Boo!"). However, Ant-Man 3 never looked to me like a particularly likely source of redemption (even if it succeeded, it was a long shot for the kind of $2 billion+ success that would really make people say "Marvel is back!"), and, again, it has only contributed to the sense of a franchise in trouble.

Of course, one should also qualify that. Before the (justified) shift from viewing Ant-Man 3 as a hit to viewing it as a flop the film did have that strong opening weekend, testifying to the continued readiness of a significant audience to come out for even a second-string Marvel movie. The drop-off after that first weekend suggests that they were less than impressed with what they saw--and that rather than people giving up on Marvel, Marvel is simply doing a poorer job of satisfying the expectations it raises, such that audiences will in time be less likely to come out that way once more. After all, as we have been reminded time and again, no franchise is bulletproof, with Star Wars now reminding us of the fact--and Marvel, which in its dominance of blockbuster filmmaking for so many years looked like Star Wars in its heyday, perhaps now beginning to remind us of Star Wars in its now conspicuous and advanced decline.

Is Ant-Man 3 a Flop? Crunching the Numbers

When Ant-Man and the Wasp: Quantumania came out on Presidents' Day weekend it looked like a hit, with its $106 million take in the first three days and the $120 million it picked up over the four-day weekend at the North American box office--a feat that seemed the more impressive in that Ant-Man has always been a "second-string" character. (Ant-Man is not Spider-Man, after all. Or Iron Man or Black Panther or Captain America or Captain Marvel or Thor or even Dr. Strange . . .) Still, a couple of weekends later the view of the film as "hit" (commercially, at least; the critics liked it a lot less than its predecessor) gave way to a view of the film as "flop," not only in the eyes of the professional Marvel-bashers (e.g. those who hate Disney's "woke" turn and are eager for evidence of the audience rejecting it), but even quite mainstream observers ordinarily well-disposed to the brand.*

Of course, those discussing these matters for a broad audience rarely crunch the numbers in any rigorous way, at best citing a figure or two, but the truth is that this is easy enough to do on the basis of conveniently available data--like the stats at Box Office Mojo--with an obvious place to start being what prompted that change of opinion, namely the sharp drop in gross between the first weekend and the second. That drop was indeed dramatic, some 70 percent between the first and second Friday-to-Sunday periods. Still, one should put it into perspective. Ant-Man 3 was a "threequel," and a follow-up not to some long-awaited, record-breaking "New Classic," but to the not-so-long-ago last film in one of the weaker component franchises of the exceedingly prolific Marvel Cinematic Universe--with the movie also debuting on a four-day holiday weekend. One expects such a film's take to be very front-loaded indeed--and the gap with the second weekend to show it--as seen with the preceding Ant-Man film, which had a 62 percent drop between the first and second weekends. The 70 percent drop Ant-Man 3 had does not seem very much worse in the circumstances (and even less so when compared with the franchise's other post-pandemic releases).**

Still, if the significance of the drop can be exaggerated one ought not to rush to dismiss it either. The days and weeks that followed bore out the impression of a faster fade for Ant-Man 3, which 28 days on, if running ahead of Ant-Man 2 in current dollar terms ($202 million to $189 million), was about a ninth behind it in inflation-adjusted, real dollar terms (the prior film's $189 million more like $226 million)--the decline the more striking as Ant-Man 2 was, again, not too leggy itself. This is still not a disaster, but no great success either, given what it bodes for the final take. Ant-Man 2 finished with about $217 million, or $259 million adjusted for inflation. Standing at $202 million as of last Thursday, and unlikely to make so much as $5 million over this weekend, Ant-Man 3 is all but certain to finish well short of Ant-Man 2's inflation-adjusted gross, doing well to finish up in the vicinity of $220 million.
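The "about a ninth behind" comparison can be checked in a line or two. The dollar figures below are the ones cited in the text (including its CPI conversion of $189 million of mid-2018 into roughly $226 million of February 2023); this is just a sanity check, not independent data.

```python
# Day-28 domestic grosses, $ millions (figures as cited in the text)
antman3_day28 = 202        # Ant-Man 3, current dollars
antman2_day28_real = 226   # Ant-Man 2's $189M, inflation-adjusted per the text

# Shortfall of the new film against the old one in real terms
shortfall = (antman2_day28_real - antman3_day28) / antman2_day28_real
print(round(shortfall, 3))   # 0.106 -> roughly one-ninth (1/9 ~= 0.111)
```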

All the same, in considering this it is worth remembering that in making and distributing Ant-Man 3 Disney/Marvel is counting not only on the North American market, but on the global market, where Marvel's appeal has long been strong, and which has redeemed many a stateside disappointment. Indeed, it is worth recalling that Ant-Man 1 and Ant-Man 2 each made twice as much overseas as their American take, a fact crucial to the franchise's continuation. Adjusted for inflation Ant-Man 2 added to its rough quarter-billion dollar North American take a half billion more dollars in those international markets (its $406 million 2018 take abroad equal to $484 million in today's terms). Were Ant-Man 3 to manage the same one would not look on it too unfavorably.

So how has it been doing there?

Alas, not nearly so well as that. The movie has, at last check, made just $250 million--and may not have much more left to collect. After all, when Ant-Man 2 came out in the U.S. its release in many important markets (among them Britain, Japan and of course China) came only a month later (indeed, almost two months later in the case of Japan and China). By contrast Ant-Man 3 was released in pretty much every major market at the same time, so that far from waiting to see how it plays overseas, we have already seen it--and the resulting numbers are not encouraging. (Consider the example of China. Where Ant-Man 2 picked up $120 million there in its first four weeks--the equivalent of $140 million today--Ant-Man 3 has thus far collected less than $40 million.)

Globally I simply do not see Ant-Man 3 picking up another $250 million, or even another $150 million, abroad. Some are now suggesting that it will not get much past the half billion dollar mark, if that. This would make it the lowest-grossing of the Ant-Man movies in real and even current-dollar terms--the more problematic given the $200 million production budget. Movie budgets are, again, often a matter of dubious, loss-claiming accounting, and often offset by subsidy and product placement, while movie revenues are over time supplemented by streams from home video, television, merchandising, etc. that enable even flops to get "into the black" after the passage of enough time. Still, let us take the cited figure at face value, and stick with the rule of thumb that a profit on the theatrical run requires a gross of four to five times that figure (the theatrical run anyway being important because a good theatrical run is what makes people buy the merchandise and pay the streaming fees and the rest). By that calculation the movie needed to make something in the $800 million to $1 billion range--about what it would have made had it bettered the inflation-adjusted take of Ant-Man 2 by even ten percent or so, and at least half again what it actually seems likely to collect. Even considering the possible compensations of other funding and revenue streams, the gap between where it stands now and what it might take to "break even" is a yawning one.
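The break-even gap described above can be sketched in the same back-of-the-envelope style. The $200 million budget is the reported figure; the ~$520 million "likely" final global gross is my own assumption, standing in for "not much past the half billion dollar mark."

```python
# Ant-Man 3 break-even sketch using the four-to-five-times rule of thumb.
budget = 200        # reported production budget, $ millions
likely_gross = 520  # assumed final global gross, $ millions (illustrative)

needed_low, needed_high = 4 * budget, 5 * budget
print(needed_low, needed_high)               # 800 1000 -> the $800M-$1B range
print(round(needed_low / likely_gross, 2))   # 1.54 -> "at least half again" the likely take
```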

What does all this add up to in the end? That Ant-Man 3's North American gross, if initially impressive, dropped sharply after its opening weekend, not only showing disappointing legs but making for a disappointing total--but only moderately so. The real problem is the overseas performance of the movie, which positively crashed compared with Ant-Man 2's performance. (Rather than making a sixth less than Ant-Man 2, the movie may make more like two-fifths less elsewhere--falling from nearly $500 million in today's terms to under $300 million.) The latter goes a much longer way toward putting a big dent in the movie's earnings, and Disney's balance sheets--with, interestingly, this more important part of the story (financially, anyway) scarcely being talked about at all.

* The critics gave Ant-Man 2 an 87 percent ("Fresh") Tomatometer rating. By contrast Ant-Man 3 rated just 47 percent (with the "Top Critics" score falling even more sharply, from 83 to 39 percent).
** According to the Bureau of Labor Statistics' Consumer Price Index inflation calculator Ant-Man 2's $75.8 million in July 2018 is roughly equivalent to $90.5 million in February 2023 terms.

Friday, March 17, 2023

Planet of the Chatbots

These days to mention Planet of the Apes is generally to evoke film--with, I suppose, the listener's generation determining the version of which they think. The young, I suppose, may be most likely to think of the currently ongoing franchise (headed into its fourth installment, The Kingdom of the Planet of the Apes, coming to a theater near you in 2024). Still, even they may remember, if only through constant reference and parody (down to a whole invented musical in The Simpsons' glorious "golden age"), the first 1968 film adaptation of Planet of the Apes, and its closing scene with Charlton Heston's Taylor pounding the sand with his fist in rage and grief in the shadow of the Ozymandias-like wreck of the Statue of Liberty, testifying to the destruction of humanity in a nuclear war.

Yet the franchise really began with a novel that had nothing to do with nuclear war, or the latterly more popular theme of genetic engineering--Pierre Boulle's 1963 novel offering rather a different "secret" to the rise of the ape-dominated order, to which the book's narrator-protagonist first catches on during a visit to the apes' stock exchange, and subsequent "remembrance of the world of finance" as he had known it back on Earth. This was that for all its apparent complexity humanity had built a civilization on the basis not of reasoning and original thinking, but of "monkey see, monkey do," such that even "monkeys" could replicate it, and eventually did.*

Considering how chatbots work, and all people seem ready to credit them with being capable of doing, it appears we may be having such a moment. Our chatbots are a very long way from that intelligence that can "parse" language, and process subtle audio and visual data, and reason from fact to conclusion. By design they are just imitators, and often not terribly convincing ones as yet--but whether the task is writing an essay for a student, or being a romantic partner (!), they seem capable of all that is required.

I will not pretend to have all the implications of this worked out. But I think that it says more about the nature of society than of "human nature." Humans have their capacity for intelligence and creativity; but society, intricately and stultifyingly hierarchical and controlling, with relatively few allowed very much in the way of choice and initiative, requires the vast majority of us to simply play a limited and limiting part, even those who perform supposedly "elite" functions. Indeed, much of what we call "education" is about conditioning the individual to accommodate themselves to this limited part (with those acquiescent to the process showered with moral praise and those who are less acquiescent showered with opprobrium as "immature," "irresponsible," "refusing to grow up," etc.). Putting it bluntly, society trains people to be machines, and rewards or punishes them for being more or less serviceable machines. We should therefore not be surprised when we find that a machine is a perfectly adequate replacement for a human trying to be one.

* Indeed, in the book it was the use of trained apes, not machines, that became the basis of labor-saving and "automation," such that one could see it as a variation on the "robot rebellion" theme. (And lest it need be said, the author is fully aware that apes are not monkeys, hence the quotation marks.)

Ezra Klein on AI in the New York Times: A Reaction

Earlier this week journalist Ezra Klein penned an opinion piece on artificial intelligence (AI) in the New York Times.

What the piece offers the reader was predictable enough--a hazy sense that "Big Things" are happening, bolstered by an example or two (DeepMind "building a system" that "predict[ed] the 3-D structure of tens of thousands of proteins," etc.), after which, faced with the "Singularity" we get not the enthusiastic response of the Singularitarian but the more commonplace, negative reaction. This included plenty about our inability to actually understand what's going on with our feeble, merely human minds (citing Meghan O'Gieblyn repeatedly, not least her remark that "If there were gods, they would surely be laughing their heads off at the inconsistency of our logic"), which goes so far as to characterize artificial intelligence research as magic ("an act of summoning. The coders casting these spells have no idea what will stumble through the portal"), and declare that confronted with what he sees as the menace of AI to our sense of self we may feel compelled to retreat into the irrational ("the subjective experience of consciousness"), though we may hardly be safe that way (Mr. Klein citing a survey reporting that AI researchers themselves were fearful of the species being wiped out by a technology they do not control). Indeed, the author concludes that there are only two possible courses, either some kind of "accelerate[d] adaptation to these technologies or a collective, enforceable decision . . . made to slow the development of these technologies," while basically dismissing any skepticism toward his view of present AI research as a meddling with forces we cannot understand out of Lovecraftian horror as an ostrich-like burial of one's head in the sand (closing with Eric Davis' remark that "[i]n the court of the mind, skepticism makes a great grand vizier, but a lousy lord").

And so: epistemological nihilism, elevation of the irrational and anti-rational and even obscurantist (he actually talked about this as magic, folks!), misanthropy, Frankenstein complex-flavored Luddism, despair and the brush-off for any alternative view. In other words, same old same old for this take on the subject, old already in Isaac Asimov's day, so old and so pervasive that he made a career of Robot stories opposing the cliché (to regrettably little effect to go by how so many cite an Arnold Schwarzenegger B-movie from 1984 as some kind of philosophical treatise). Same old same old, too, for the discussion of the products of human reason going back to the dark, dark beginnings of the Counter-Enlightenment (on which those who find themselves "overwhelmed" have for so long been prone to fall back).

Still, it is a sign of the times that the piece ran in this "paper of record," especially at this moment in which so many of us (the recent, perhaps exaggerated, excitement over chatbots notwithstanding) feel that artificial intelligence has massively under-delivered, certainly when one goes by the predictions of a decade ago (at least, in their wildly misunderstood and misrepresented popular form, next to which chatbots slapping together scarcely readable prose does not seem so impressive). There is, too, a somewhat more original comparison Mr. Klein makes on the way to dismissing the "skeptical" view, namely between AI and cryptocurrency, raising the possibility of enthusiasm for the former replacing the enthusiasm for the latter as, in the wake of the criminality and catastrophe at FTX, investors pour their money into artificial intelligence.

Personally I have my doubts that AI will become anywhere near the darling of investors that "crypto" was. However much the press looks full of stories of startups raising capital, the plain and simple truth is that Wall Street's love affair with "tech" has not been anything close to what it was in the '90s since that time. (Indeed, investors have preferred that oldest, least tech-y commodity, real estate--hence the bubble and bust and other turmoil of this whole stagnant yet crisis-ridden century.) I do not see that behavior changing now--the more in as money is getting tighter and the economy slower.

Still, the activity in that area may be something to watch for those curious as to how far things may go, and how fast. Meanwhile I suggest that anyone really concerned for humanity's survival direct their attention not to fantasies of robot rebellion, but to the fact that such scenarios seem a way of overlooking the very numerous, deadly serious and entirely real conflicts among humans of profoundly unequal power and opposed interest (not AI vs. human, but human vs. human with AI in the mix, and the more probable danger being the way some humans may use AI against others). I suggest, too, that those concerned for human survival worry less about artificial intelligences generally than nuclear weapons, fossil fuels, "forever chemicals," etc.--while remembering that besides the dangers caused by the technologies that we have there is the danger of our failing to produce, or properly use, the technologies we need in order to cope with the problems of today.

A Note on Balzac's The Two Brothers

Reading Lucien de Rubempre's story in Balzac's Lost Illusions I felt that, Jack London's Martin Eden apart, it was the truest portrait of what the young artist faces as he embarks upon a career.

If less impressive on that score there is nonetheless a comparable bit of truth in another of Balzac's novels, The Two Brothers. There Joseph Bridau, a young artist of talent and application, is looked down upon as an addle-brained youth of no prospects because of his career choice by virtually everyone, and is a constant worry to his conventionally-minded mother--even as Joseph, because of his work as a painter, and his loyal and responsible personal conduct, is her and the family's sole support (his earnings, not from a "day job" but from his painting itself, being what enable them to survive financially). The lack of appreciation for what he does, and her lack of appreciation for Joseph because he does it, is accentuated by the contrast with the way his mother looks at the favored, elder son, Philippe, a creature of monstrous selfishness, callousness and irresponsibility who, after a superficially "brilliant" career as a soldier in the Napoleonic Wars, brings nothing but calamity to his family, and to anyone else unfortunate enough to be associated with him--in spite of which he ever remains the favorite, the more admired and the one looked to for "great things." (All this is even more the case when they venture out to the mother's provincial home town of Issoudun.)

It is only as the end of the story draws near, and their mother approaches the end of a life constantly darkened and ultimately shortened by Philippe's betrayals, that she is induced--by her confessor--to recognize that if there has been a sin in her life it has been her not loving and appreciating Joseph. In the years that follow Philippe, whose ambitions have ultimately been wrecked by his own indiscretion (and the all too contemporary-seeming financial idiocy-cum-madness which looms so large in his, and others', tales of the Paris of that era), dies in a colonial campaign, leaving what remains of his fortune and title to Joseph, who at the end of the story is not only a great success as an artist, but wealthy and literally ennobled (the painter a Comte, no less).

Considering Joseph's trajectory, which could not be more different from Lucien's, I had an impression of Balzac indulging in a bit of wish-fulfillment here. I suppose it is a reminder that even so cold-eyed a realist as Balzac--so cold-eyed as to have put many a reader off of him from his own time to today (Dickens acknowledging the fact, more recently even David Walsh allowing that "Balzac's relentless, ferocious assault on this environment and its denizens can at times be wearing")--could, occasionally, allow at least one of his characters a good outcome.

Thursday, March 16, 2023

The Academy Goes With the Popular Choice: The 49th Academy Awards, Rocky's Best Picture Win, and the Trajectory of Cinema Ever Since

At the 49th Academy Awards there were five films up for the Best Picture Oscar, namely Alan J. Pakula's All the President's Men; Hal Ashby's Bound for Glory; Martin Scorsese's Taxi Driver; Sidney Lumet's Network; and John G. Avildsen's Rocky.

The first four films were what a certain sort of ideologue would call "elitist"--tonally and technically sophisticated, with a socially critical outlook (be it Pakula's thriller-like treatment of Bob Woodward and Carl Bernstein's investigation of the Watergate scandal, Ashby's biopic about folk singer Woody Guthrie, Scorsese's vision of Times Square in a New York approaching what is still conventionally remembered as its "rock bottom," or Lumet's satire of network television centering on the rise of a right-wing populist "mad prophet of the airwaves"). By contrast Rocky was an old-fashioned sports story about an underdog who gets a shot at the top and makes the most of it, with, some thought, a rather right-wing edge (in its essentially "aspirational," "You only need willpower to be a champ" attitude, and what some saw as its racial subtext), which got it called "populist."

In what was seen, then and now, as a gesture toward the broader moviegoing audience, Rocky was given the prize over those other, more conventionally Oscar-worthy (more technically accomplished, more thematically ambitious, more substantial) films. And if it is less often spoken of than, for example, Jaws or Star Wars by those historians who recount the end of the New Hollywood, it does get mentioned from time to time as a sign of where film was headed, with some justice given the sequels. Looking back at Rocky IV, for example, one sees in the sequel-itis, the Cold War-ism, and the thoroughly "high concept" character of the product's look and feel and tone the "New" New Hollywood clearly arrived.

All the same, there is a surprise for those who look back from the sequel, or from Rocky's reputation as a marker of the decline of the New Hollywood, to the original. If Rocky IV purely belonged to the new era, "Rocky I" did not--with, instead of mansions and press conference halls, grand arenas and training facilities out of the future, the screen saturated with the shabbiness of working-class districts amid which people lead dead-end lives, while the movie actually looks, classically, like a movie. One can even quantify the fact. Where Rocky IV has an Average Shot Length (ASL) of a mere 2.3 seconds--shrinking toward 1 second in the training montages and fight scenes--the ASL was 8.5 seconds in the original, the sequel squeezing almost four times as many shots as the original into any given space of time, and watching it you really feel it.* Where the mid-'80s follow-up was a 90-minute music video intended above all to dazzle with light and sound, with synchronization of image and soundtrack, all very Michael Bay a decade before Bay shot his first feature, the original was a movie telling a story, whatever one makes of it, about people in places that were none too dazzling, in a way much out of fashion in major feature film by the '80s, and even more so now.

* Figures on ASL from the ever-handy and much-recommended Cinemetrics web site.

Of Media Bias and Technological Reportage

When most of us speak of "media bias," I think, we have in mind the "Big" mainstream media--the national-level operations that tend toward a "centrist" perspective.

The "political economy" of that media colors its coverage of every issue, of course.

The particular issue that interests me at the moment is its attitude toward technology.

As it happens this media seems to me utterly technophilic when it comes to pushing the garbage business wants to sell the consumer--a pack of claqueurs when it comes to something like, for instance, "the Internet of Things" they wanted us all to be so excited about a little while ago.

By contrast they turn into a pack of sneering technophobes when it comes to technologies that might actually solve problems--like, for instance, anything at all that might help with a problem such as climate change, from cellular agriculture to renewable energy.

This can seem contradictory. But really it isn't. It is a natural outcome of a business that lives on selling fear, not selling hope; which is deferential to industries and their spokespersons that love to talk about "disruption" when singing the praises of "the market" but do everything possible to resist disruption in real life whenever it threatens their speculation and rent-seeking (sometimes because they actually own said media); and, with all a centrist's fearfulness of calls for change (and perhaps also their equation of pessimism with wisdom), committed to a narrative of apathy-inducing doomism precisely because it is apathy-inducing. Michael Mann rightly speaks of "inactivism" in this area--which seems their default mode toward everything in life, and toward technology ever making any change in it for the better.

Disentangling the Media's Biases: Center, Right and Left

In talking about "media bias" it may be helpful to discuss which part of the media we are talking about, as it is not all one thing.

When generalizing simply among national-level news outlets from the standpoint of the familiar ideological spectrum, one can speak of a "mainstream" media--that media which is most commonplace and visible and generally accepted as the baseline of discussion--of which the New York Times could be called representative. People often call it "liberal," or even "left," but it is far more accurate to call it "centrist."

Meanwhile there is a considerable media to its right, which we could identify with, for example, the Wall Street Journal or the New York Post.

There is also a body of media which can be considered genuinely "left." This media is much smaller and less visible than the others, and generally shut out of a mainstream conversation which ordinarily acts, in part because it prefers to act, as if it did not exist (this is implicitly what happens when people call centrists, whose philosophy is actually deeply conservative and whose positions are ever further to the right, "the left"). As an example of a news site of indisputable left credentials one might name the World Socialist Web Site, which has now and then come to mainstream attention in the wake of the controversies over political censorship-by-search-engine-algorithm and the 1619 Project. (If you've never heard of it, that just goes to prove the point about the left media's marginality.)

We do well to remember which of these we are talking about when we speak of the media and its "bias."

"Thank You For Your Support." Wait, What Support?

When considering the "aspiring" writer's lot I think time and again of Jack London's Martin Eden--the truest depiction of that unhappy state that I have ever encountered in print--and often one particular aspect of that situation, its loneliness. This is not just a function of the fact that writing is a solitary and often isolating activity, but the way that others treat someone going about it when they have not "made it"--specifically their lack of empathy, sympathy and respect, and their sense that they have no obligation to be tactful, or even civil, about what they really think. There is, too, what all of this means if one ever does get anywhere, as Eden does--for he does "make it," becoming rich and famous, such that the multitude of people who, knowing full well who he was, would not have looked twice at the sight of him starving to death in the street now all want to have him over for dinner, or even marry their daughter.

So I suspect does it go online--even at the level of merely giving people an absolutely cost-less "like" or "share" on social media and those other platforms akin to it. A celebrity can get an outpouring of enthusiasm for uttering pure banality--while a nobody can scarcely hope for support no matter what they say, a mere click of the mouse that could mean a great deal in their position begrudged them. It is a testament to the shallowness of our online life--and the extreme stupidity of so many of those who people it. And experiencing it this way takes its toll. In Balzac's Lost Illusions Lucien de Rubempre, callow and ultimately ill-fated as he is, does realize some hard facts of life along the way before his final destruction, among them not only how scarce a sympathetic word or gesture can be for those who have not "succeeded" ("In the home circle, as in the world without, success is a necessity"), but the toll that even those who attain "success" are likely to pay before they get there. ("Perhaps it is impossible to attain to success until the heart is seared and callous in every most sensitive spot.")

In that there may be the tiny grain of truth in those stupid scenes of writers with supercilious looks on their faces as they autograph copies of their latest for fawning idiots--the praises cannot touch them, because if they had not long ago stopped caring what other people think they would have given it all up, perhaps as completely as Lucien does.

Wednesday, March 15, 2023

The Rubble of the '90s

Over the years I have written a lot about the '90s--in part because the period encouraged certain illusions which have proven destructive, and which have themselves increasingly appeared destroyed by contact with reality. There was the way that those singers of the information age, so long trying to persuade Americans that what was obviously deindustrialization was actually progress, almost looked credible amid the tech boom of the latter part of the decade--enough so as to profoundly distort the national dialogue about the economy for an extended and critical period. There was the way that Americans were persuaded to believe the "unipolar moment" of uncontestable full-spectrum American hyperpower might go on forever. There was the way the public was persuaded that the Internet would be an ultra-democratic cyber-utopia, as political correctness triumphed over all--so much so that, as Thomas Frank recounts, right-wing populism seemed to be assuming a new face, not only trying to persuade the people that "the market" was the common man's best friend, but appearing to embrace wokeness, while declaring the market its greatest champion.

What people think of as the elite are, as I have said again and again and will go on saying again and again, careerist mediocrities who are profoundly out of touch with reality, as they show us again and again when relied upon to lead the way. Accordingly many of them may still believe in such drivel--still, to use Paul Krugman's words, seem "stuck in some sort of time warp, in which it's always 1997"--but not all of them do so, while credulity in such silliness is the weaker the further out one gets from that overprivileged little world they inhabit.

C.P. Snow's "The Two Cultures" in 2023

The essential claim made in C.P. Snow's lecture-turned-essay on "The Two Cultures" of science and letters was that the division of modern Western intellectual life between them has gone with mutual ignorance, incomprehension and even dislike between those on either side of the split.

On first encountering C.P. Snow's essay I thought there was a good deal of truth to it and still do. After all, intellectual life is indeed extremely specialized, with the fact underlined by how outside their specialties even exceptionally intelligent and well-educated people often have only the most banal thoughts to offer (as they often show when accorded that kind of Public Intellectual status in which they are given a platform from which to comment on anything and everything). And all of this seems to me to have got worse rather than better with time, as their work has only become more esoteric and minute (how many scientists does it take to publish a single paper these days?), while the pursuit of an academic or research career has become a more harried thing, leaving less time or energy for "extracurriculars," which at any rate are frowned upon. (The "professions," as the root of the word suggests, are indeed priesthood-like in many ways, often living down to the unhappiest and most stultifying connotations of that term, like that disapproval of any outside interest, or indeed any ideas but the orthodoxy imposed by the hierarchs.)

This specialization may have gone so far that those in the sciences and letters alike are doing their jobs less well than they may need to be doing them--with scientists reportedly lacking sufficient grounding in philosophy to understand the scientific method on which their work depends, while, as I discovered struggling with composition textbooks and their usually clumsy explanations of so basic a matter as "What is a thesis?" (and finding reference to the scientific method, and to scientific examples, useful in redressing such weaknesses--in the classroom and in my own book), I have often wondered whether the writers of those textbooks would not have benefited from a bit of scientific study themselves.

Still, salient as the division within intellectual life remains there is also the political framework within which that life goes on. To his credit, Snow was sensitive to that. The cultural orthodoxy is a function of the priorities of those who have power--whose concern is first, last and always the bottom line, to which they recognize science as contributing. That is not to say that they actually respect the scientific method, or the work of scientists, even just enough to care about funding scientific education and research properly, let alone make use of the best available scientific advice when making policy--just that they know there is something here they can use, much more than is the case with letters.

All of this would seem to have become more rather than less consequential in the neoliberal era--amid unending austerity, with the pressure to justify everything in terms of short-term corporate profitability the greater, with a predictable result that where neoliberalism has gone, so too has the imperative to reduce education to occupational-technical training (from the Thatcher-era overhaul of educational funding, which has continued ever since, to the requirements of the "structural reform" programs foisted on developing countries). Meanwhile the politics of interest are ever more marginalized, the politics of status in the ascendant--with the resulting era of culture war seeing the humanities demonized, to the point that disdain for the work of historians and the like has right-wingers openly gloating over the prospect of the demise of humanities programs on campus.

It would be a profound mistake to imagine all this does not have consequences for how those of the sciences and letters see each other--with, perhaps, the young indicative of the trajectory. While I think that there is, again, much ignorance and incomprehension on the part of those in the sciences and letters alike, I never personally noticed any actual disrespect. I am less sure that this is the case today--certainly to go by what we hear of the sneering, mocking STEM major openly disdainful of the humanities, an attitude all too predictable in an era where the war cry of "STEM! STEM! STEM!" is unceasing, and unmatched by anything the defenders of the humanities have to say on their behalf.

Will the New Chatbots Mean the End of the Novelist?

Over the years I have remarked again and again on the reports that writers' wages--and certainly the wages of writers of fiction--have long been collapsing. The result is that, while the chances to "make a living" writing fiction (or anything else) have always been few, the path far from straightforward, the remuneration usually paltry relative to the effort and skill one put in--and one might add, society at large deeply unsympathetic to the travails in question (just ask Balzac, or London)--it does not seem unreasonable to say that matters have just gone on getting worse this way.

Consider, for instance, how even in the middle years of the last century there were writers who scraped by (sometimes did better than scrape by) selling short stories to pulp fiction magazines. Any such thing seems unimaginable today--in an age where writers have it so tough that they are giving away vast amounts of quite readable content, and finding no takers. Meanwhile what could be salable seems an ever smaller category as a good many old reliable genres die (consider action-adventure fiction, for example), and as for what does sell--always reliant on authors' platforms and "brand names" to persuade consumers to take a look (just remember the unpleasant truths spoken by Balzac's vile Dauriat)--the reliance on those supports would seem to have only grown greater and greater, while one is struck by how new platforms and new brand names are not emerging, hinting at the game being one of milking an aging and dwindling consumer base. (Just look at the paperback rack. The thriller novelists you will find there are pretty much the same ones you saw in the '90s--James Patterson, John Grisham and company, with Tom Clancy and Clive Cussler still writing somehow to go by the bylines.)

Indeed, it seems to me safe to say that the key factor here has been the shift in the relation of supply to demand. People spend a lot less time reading fiction than they used to do, their leisure time increasingly consumed by audiovisual media. (I suspect that in this century far more people have played Tom Clancy video games than read Tom Clancy novels.) Meanwhile, due to the widening of educational opportunity, the number of those who endeavor to become published authors has grown, with publishing fiction, while a nearly impossible task for anyone who does not have fame or nepotism on their side, still seeming more plausible than trying to make it as a writer in the more glamorous but more closed-off film and television industry. Still, whatever one makes of the cause, the result is inarguable--and the writers' troubles may well escalate if what we are hearing about chatbots bears out. Whatever writers, critics or readers may think, publishers--who are first, last and always businessmen, moving literature as they would any other good (again, I refer you to Balzac)--prefer to keep putting out the same old stuff that has had takers in the past, and it is inconceivable that they would balk at replacing writers with chatbots. For the moment, of course, they are barred from taking such a course by the limits of those bots, which are much better at producing short work than long (and truth be told, not producing that very well). However, if we see the progress some expect they may get to the point where a human would only be needed to clean up the result--and maybe do a lot less clean-up in the process--within a decade's time, and publishers will not hesitate to exploit the possibility (such that the Jack Ryan and Dirk Pitt novels of ten years hence, and the Alex Cross novels too, may be churned out by some new iteration of GPT). The only question then would be whether there will still be a big enough audience for those books to make the venture profitable.

The New Yorker Notices the Collapse of the Humanities on Campus

Having recently noted for myself the surge in STEM majors as enrollment in the humanities collapses, and wondered why we were not hearing more about either phenomenon, I was quick to take note of Nathan Heller's "The End of the English Major."

Alas, Heller's piece reminded me of much that I had to say about the quality of the coverage rendered by the mainstream media a while back. Here as elsewhere the journalist offers a few factoids, and comment culled from others presumed to have an understanding of the matter, with some interesting bits possibly to be found in the mass, but the whole less than the sum of its parts, many of which were not impressive to begin with. Too much of it consists of the writer recounting his wandering about the Harvard campus getting quotes from students and faculty of whom we are supposed to be in awe because "HARVARD, HARVARD, HARVARD!" ("[G]olden kids from Harvard," he calls the students. "Basic employability is assured by the diploma: even a Harvard graduate who majors in somersaults will be able to find some kind of job to pay the bills," Heller writes. Alas, not how it works, but that he says so is very telling about the level of thought to be found here.) After this he dutifully endeavors to force the bits all into the "brisk, forward-looking, optimistic" view to which our journalism, as so much else, is so unfortunately addicted. The result is that instead of a better sense of "What it all means" there was a rain of details of uncertain meaning and often no meaning at all, a good deal of obnoxious, shamelessly name-dropping elitism in its particularly grating "Cult of the Good School" form (HARVARDHARVARDHARVARDHARVARD!), and ultimate complacency, that left me persuaded that I had wasted my time bothering with the thing.
