Saturday, March 18, 2023

Is Ant-Man 3 a Flop? Crunching the Numbers

When Ant-Man and the Wasp: Quantumania came out over Presidents' Day weekend it looked like a hit, with its $106 million take in the first three days and the $120 million it picked up over the four-day weekend at the North American box office--a feat that seemed the more impressive given that Ant-Man has always been a "second-string" character. (Ant-Man is not Spider-Man, after all. Or Iron Man or Black Panther or Captain America or Captain Marvel or Thor or even Dr. Strange . . .) Still, a couple of weekends later the view of the film as a "hit" (commercially, at least; the critics liked it a lot less than its predecessor) gave way to a view of the film as a "flop," not only in the eyes of the professional Marvel-bashers (e.g. those who hate Disney's "woke" turn and are eager for evidence of the audience rejecting it), but even in those of quite mainstream observers ordinarily well-disposed to the brand.*

Of course, those discussing these matters for a broad audience rarely crunch the numbers in any rigorous way, at best citing a figure or two, but the truth is that this is easy enough to do on the basis of conveniently available data--like the stats at Box Office Mojo--with an obvious place to start being what prompted that change of opinion, namely the sharp drop in gross between the first weekend and the second. That drop was indeed dramatic, some 70 percent between the first and second Friday-to-Sunday periods. Still, one should put it into perspective. Ant-Man 3 was a "threequel," and not a long-awaited follow-up to some record-breaking "New Classic" but the successor to the not-so-long-ago last film in one of the weaker component franchises of the exceedingly prolific Marvel Cinematic Universe--with the movie, moreover, debuting on a four-day holiday weekend. One expects such a film's take to be very front-loaded indeed--and the gap with the second weekend to show it--as seen with the preceding Ant-Man film, which had a 62 percent drop between the first and second weekends. The 70 percent drop Ant-Man 3 had does not seem very much worse under the circumstances (and even less so when compared with the franchise's other post-pandemic releases).**

Still, if the significance of the drop can be exaggerated one ought not to rush to dismiss it either. The days and weeks that followed bore out the impression of a faster fade for Ant-Man 3, with the result that 28 days on the film, if running ahead of Ant-Man 2 in current-dollar terms ($202 million to $189 million), was about a ninth behind it in inflation-adjusted, real-dollar terms (the prior film's $189 million being more like $226 million today), with the decline the more striking as Ant-Man 2 was, again, not too leggy itself. This is still not a disaster, but no great success either, given what it bodes for the final take. Ant-Man 2 finished with about $217 million, or $259 million adjusted for inflation. Standing at $202 million as of last Thursday, and unlikely to make so much as $5 million over this weekend, Ant-Man 3 is all but certain to finish well short of Ant-Man 2's inflation-adjusted gross, doing well to finish up in the vicinity of $220 million.
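For those who like to see the arithmetic laid out, here is a minimal sketch of the comparison above. The grosses are the figures cited in this post, and the CPI multiplier of roughly 1.195 (July 2018 to February 2023) is an assumption consistent with the footnote at the end of this post.

```python
# A back-of-the-envelope comparison of the two films' 28-day domestic grosses,
# using the figures cited above. The CPI multiplier is an assumed value,
# consistent with the BLS-based footnote below.
CPI_JUL_2018_TO_FEB_2023 = 1.195

ant_man_2_28_day = 189e6   # 2018 dollars
ant_man_3_28_day = 202e6   # 2023 dollars

ant_man_2_adjusted = ant_man_2_28_day * CPI_JUL_2018_TO_FEB_2023
shortfall = (ant_man_2_adjusted - ant_man_3_28_day) / ant_man_2_adjusted

print(f"Ant-Man 2, 28 days, in 2023 dollars: ${ant_man_2_adjusted / 1e6:.0f} million")  # ~$226 million
print(f"Ant-Man 3's real-terms shortfall: {shortfall:.0%}")  # ~11 percent, about a ninth
```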

All the same, in considering this it is worth remembering that in making and distributing Ant-Man 3 Disney/Marvel is counting not only on the North American market but on the global market, where Marvel's appeal has long been strong, and which has redeemed many a stateside disappointment. Indeed, it is worth recalling that Ant-Man 1 and Ant-Man 2 each made roughly twice as much overseas as they did in North America, a fact crucial to the franchise's continuation. Adjusted for inflation Ant-Man 2 added to its rough quarter-billion dollar North American take nearly half a billion dollars more in those international markets (its $406 million 2018 take abroad equal to $484 million in today's terms). Were Ant-Man 3 to manage the same one would not look on it too unfavorably.

So how has it been doing there?

Alas, not nearly so well as that. The movie has, at last check, made just $250 million abroad--and may not have much more left to collect. After all, when Ant-Man 2 came out in the U.S. its release in many important markets (among them Britain, Japan and of course China) came only a month later (indeed, almost two months later in the case of Japan and China). By contrast Ant-Man 3 was released in pretty much every major market at the same time, so that, far from having to wait and see how it plays overseas, we have already seen that, and the resulting numbers are not encouraging. (Consider the example of China. Where Ant-Man 2 picked up $120 million there in its first four weeks--the equivalent of $140 million today--Ant-Man 3 has thus far collected less than $40 million.)

Globally I simply do not see Ant-Man 3 picking up another $250 million abroad, or even another $150 million. Some are now suggesting that it will not get much past the half billion dollar mark worldwide, if that. This would make it the lowest-grossing of the Ant-Man movies in real terms, and even in current-dollar terms, which is the more problematic given the $200 million production budget. Movie budgets are, again, often a matter of dubious, loss-claiming accounting, and often offset by subsidies and product placement, while movie revenues are over time supplemented by revenue streams from home video, television, merchandising, etc. that enable even flops to get "into the black" after the passage of enough time. Still, let us take the cited figure at face value, and stick with the rule of thumb that a profit on the theatrical run requires a gross of four to five times that figure (with the theatrical run being important anyway, because a good theatrical run is what makes people buy the merchandise and pay the streaming fees and the rest). According to the resulting calculation the movie needed to make something in the $800 million to $1 billion range--about what it would have made had it bettered the inflation-adjusted take of Ant-Man 2 by even ten percent or so, and at least half again what it actually seems likely to collect. Even allowing for the possible compensations of those other funding and revenue streams, the gap between where it stands now and what it might take to "break even" is a yawning one.
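Again for those who want the arithmetic spelled out, a minimal sketch of that rule-of-thumb calculation follows. The $200 million budget and the four-to-five-times multiplier are the figures discussed above, while the roughly $500 million worldwide total is the projection suggested in this post, not a reported number.

```python
# Rule-of-thumb break-even calculation as discussed above: theatrical profit
# assumed to require a worldwide gross of 4-5 times the production budget.
budget = 200e6                 # cited production budget
break_even_low = 4 * budget    # $800 million
break_even_high = 5 * budget   # $1 billion
projected_gross = 500e6        # the rough worldwide total projected above

print(f"Break-even range: ${break_even_low / 1e6:.0f}-{break_even_high / 1e6:.0f} million")
print(f"Projected worldwide gross: ${projected_gross / 1e6:.0f} million")
print(f"Shortfall against even the low end: ${(break_even_low - projected_gross) / 1e6:.0f} million")
```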

What does all this add up to in the end? That Ant-Man 3's North American gross, if initially impressive, dropped sharply after the opening weekend, not only showing disappointing legs but making for a disappointing total--though only moderately so. The real problem is the overseas performance of the movie, which positively crashed compared with Ant-Man 2's. (Rather than making a sixth less than Ant-Man 2, as it looks set to do domestically, the movie may make more like two-fifths less elsewhere--falling from nearly $500 million in today's terms to under $300 million.) The latter goes a much longer way toward putting a big dent in the movie's earnings, and Disney's balance sheets--with, interestingly, this more important part of the story (financially, anyway) scarcely being talked about at all.

* The critics gave Ant-Man 2 an 87 percent ("Fresh") Tomatometer rating. By contrast Ant-Man 3 rated just 47 percent (with the "Top Critics" score falling even more sharply, from 83 to 39 percent).
** According to the Bureau of Labor Statistics' Consumer Price Index inflation calculator Ant-Man 2's $75.8 million in July 2018 is roughly equivalent to $90.5 million in February 2023 terms.

Friday, March 17, 2023

Planet of the Chatbots

These days to mention Planet of the Apes is generally to evoke film--with the listener's generation determining the version of which they think. The young, I suppose, may be most likely to think of the currently ongoing franchise (headed into its fourth installment, with Kingdom of the Planet of the Apes coming to a theater near you in 2024). Still, even they may remember, if only through constant reference and parody (down to a whole invented musical in The Simpsons' glorious "golden age"), the original 1968 film adaptation of Planet of the Apes, and its closing scene with Charlton Heston's Taylor pounding the sand with his fist in rage and grief in the shadow of the Ozymandias-like wreck of the Statue of Liberty, testifying to the destruction of humanity in a nuclear war.

Yet the franchise really began with a novel that had nothing to do with nuclear war, or the latterly more popular theme of genetic engineering--Pierre Boulle's 1963 novel offering rather a different "secret" to the rise of the ape-dominated order, to which the book's narrator-protagonist first catches on during a visit to the apes' stock exchange, and his subsequent "remembrance of the world of finance" as he had known it back on Earth. This was that for all its apparent complexity humanity had built a civilization on the basis not of reasoning and original thinking, but of "monkey see, monkey do," such that even "monkeys" could replicate it, and eventually did.*

Considering how chatbots work, and all that people seem ready to credit them with being capable of doing, it appears we may be having such a moment. Our chatbots are a very long way from the kind of intelligence that can truly "parse" language, process subtle audio and visual data, and reason from fact to conclusion. By design they are just imitators, and often not terribly convincing ones as yet--but whether the task is writing an essay for a student or being a romantic partner (!), they seem capable of all that is required.

I will not pretend to have all the implications of this worked out. But I think that it says more about the nature of society than of "human nature." Humans have their capacity for intelligence and creativity; but society, intricately and stultifyingly hierarchical and controlling, with relatively few allowed very much in the way of choice and initiative, requires the vast majority of us to simply play a limited and limiting part, even those who perform supposedly "elite" functions. Indeed, much of what we call "education" is about conditioning the individual to accommodate themselves to this limited part (with those acquiescent to the process showered with moral praise and those who are less acquiescent showered with opprobrium as "immature," "irresponsible," "refusing to grow up," etc.). Putting it bluntly, society trains people to be machines, and rewards or punishes them for being more or less serviceable machines. We should therefore not be surprised when we find that a machine is a perfectly adequate replacement for a human trying to be one.

* Indeed, in the book it was the use of trained apes, not machines, that became the basis of labor-saving and "automation," such that one could see it as a variation on the "robot rebellion" theme. (And lest it need be said, the author is fully aware that apes are not monkeys, hence the quotation marks.)

Ezra Klein on AI in the New York Times: A Reaction

Earlier this week journalist Ezra Klein penned an opinion piece on artificial intelligence (AI) in the New York Times.

What the piece offers the reader is predictable enough--a hazy sense that "Big Things" are happening, bolstered by an example or two (DeepMind "building a system" that "predict[ed] the 3-D structure of tens of thousands of proteins," etc.), after which, faced with the "Singularity," we get not the enthusiastic response of the Singularitarian but the more commonplace, negative reaction. This includes plenty about our inability to actually understand what is going on with our feeble, merely human minds (citing Meghan O'Gieblyn repeatedly, not least her remark that "If there were gods, they would surely be laughing their heads off at the inconsistency of our logic"), goes so far as to characterize artificial intelligence research as magic ("an act of summoning. The coders casting these spells have no idea what will stumble through the portal"), and declares that, confronted with what he sees as the menace of AI to our sense of self, we may feel compelled to retreat into the irrational ("the subjective experience of consciousness"), though we may hardly be safe that way (Mr. Klein citing a survey reporting that AI researchers themselves are fearful of the species being wiped out by a technology they do not control). Indeed, the author concludes that there are only two possible courses, either some kind of "accelerate[d] adaptation to these technologies or a collective, enforceable decision . . . made to slow the development of these technologies," while basically dismissing as an ostrich-like burial of one's head in the sand any skepticism toward his view of present AI research as a meddling with forces we cannot understand straight out of Lovecraftian horror (closing with Erik Davis' remark that "[i]n the court of the mind, skepticism makes a great grand vizier, but a lousy lord").

And so: epistemological nihilism, elevation of the irrational and anti-rational and even obscurantist (he actually talked about this as magic, folks!), misanthropy, Frankenstein complex-flavored Luddism, despair and the brush-off for any alternative view. In other words, same old same old for this take on the subject, old already in Isaac Asimov's day, so old and so pervasive that he made a career of Robot stories opposing the cliché (to regrettably little effect to go by how so many cite an Arnold Schwarzenegger B-movie from 1984 as some kind of philosophical treatise). Same old same old, too, for the discussion of the products of human reason going back to the dark, dark beginnings of the Counter-Enlightenment (on which those who find themselves "overwhelmed" have for so long been prone to fall back).

Still, it is a sign of the times that the piece ran in this "paper of record," especially at this moment in which so many of us (the recent, perhaps exaggerated, excitement over chatbots notwithstanding) feel that artificial intelligence has massively under-delivered, certainly when one goes by the predictions of a decade ago (at least, in their wildly misunderstood and misrepresented popular form, next to which chatbots slapping together scarcely readable prose does not seem so impressive). There is, too, a somewhat more original comparison Mr. Klein makes on the way to dismissing the "skeptical" view, namely between AI and cryptocurrency, raising the possibility of enthusiasm for the former replacing the enthusiasm for the latter as, in the wake of the criminality and catastrophe at FTX, investors pour their money into artificial intelligence.

Personally I have my doubts that AI will become anywhere near the darling of investors that "crypto" was. However much the press looks full of stories of startups raising capital, the plain and simple truth is that Wall Street's love affair with "tech" has not been anything close to what it was in the '90s since that time. (Indeed, investors have preferred that oldest, least tech-y commodity, real estate--hence the bubble and bust and other turmoil of this whole stagnant yet crisis-ridden century.) I do not see that behavior changing now--the more so as money is getting tighter and the economy slower.

Still, the activity in that area may be something to watch for those curious as to how far things may go, and how fast. Meanwhile I suggest that anyone really concerned for humanity's survival direct their attention not to fantasies of robot rebellion, but to the fact that such scenarios seem a way of overlooking the very numerous, deadly serious and entirely real conflicts among humans of profoundly unequal power and opposed interest (not AI vs. human, but human vs. human with AI in the mix, with the more probable danger being the way some humans may use AI against others). I suggest, too, that those concerned for human survival worry less about artificial intelligences generally than about nuclear weapons, fossil fuels, "forever chemicals," etc.--while remembering that besides the dangers caused by the technologies that we have there is the danger of our failing to produce, or properly use, the technologies we need in order to cope with the problems of today.

A Note on Balzac's The Two Brothers

Reading Lucien de Rubempre's story in Balzac's Lost Illusions I felt that, Jack London's Martin Eden apart, it was the truest portrait of what the young artist faces as he embarks upon a career.

If less impressive on that score there is nonetheless a comparable bit of that truth in Balzac's other novel, The Two Brothers. There Joseph Bridau, a young artist of talent and application, is looked down upon by virtually everyone as an addle-brained youth of no prospects because of his career choice, and is a constant worry to his conventionally-minded mother--even as Joseph, because of his work as a painter and his loyal and responsible personal conduct, is her and the family's sole support (his earnings, not from a "day job" but from his painting itself, being what enable them to survive financially). The lack of appreciation for what he does, and her lack of appreciation for Joseph because he does it, is accentuated by the contrast between the way his mother looks at him and at the favored elder son, Philippe, a creature of monstrous selfishness, callousness and irresponsibility who, after a superficially "brilliant" career as a soldier in the Napoleonic Wars, brings nothing but calamity to his family, and to anyone else unfortunate enough to be associated with him--in spite of which he remains the favorite, ever the more admired and looked to for "great things." (All this is even more the case when they venture out to the mother's provincial home town of Issoudun.)

It is only as the end of the story draws near, and their mother approaches the end of a life constantly darkened and ultimately shortened by Philippe's betrayals, that she is induced--by her confessor--to recognize that if there has been a sin in her life it has been her not loving and appreciating Joseph. In the years that follow Philippe, whose ambitions have ultimately been wrecked by his own indiscretion (and the all too contemporary-seeming financial idiocy-cum-madness which looms so large in his, and others', tales of the Paris of that era), dies in a colonial campaign, leaving what remains of his fortune and title to Joseph, who at the end of the story is not only a great success as an artist, but wealthy and literally ennobled (the painter a Comte, no less).

Considering Joseph's trajectory, which could not be more different from Lucien's, I had an impression of Balzac indulging in a bit of wish-fulfillment here. I suppose it is a reminder that even so cold-eyed a realist as Balzac--so cold-eyed as to have put many a reader off him from his own time to today (Dickens acknowledged as much, and more recently even David Walsh allowed that "Balzac's relentless, ferocious assault on this environment and its denizens can at times be wearing")--could, occasionally, allow at least one of his characters a good outcome.

Thursday, March 16, 2023

The Academy Goes With the Popular Choice: The 49th Academy Awards, Rocky's Best Picture Win, and the Trajectory of Cinema Ever Since

At the 49th Academy Awards there were five films up for the Best Picture Oscar, namely Alan J. Pakula's All the President's Men; Hal Ashby's Bound for Glory; Martin Scorsese's Taxi Driver; Sidney Lumet's Network; and John G. Avildsen's Rocky.

The first four films were what a certain sort of ideologue would call "elitist"--tonally and technically sophisticated, with a socially critical outlook (be it Pakula's thriller-like treatment of Bob Woodward and Carl Bernstein's investigation of the Watergate scandal, Ashby's biopic about folk singer Woody Guthrie, Scorsese's vision of Times Square in a New York approaching what is still conventionally remembered as its "rock bottom," or Lumet's satire of network television centering on the rise of a right-wing populist "mad prophet of the airwaves"). By contrast Rocky was an old-fashioned sports story about an underdog who gets a shot at the top and makes the most of it, with, some thought, a rather right-wing edge (in its essentially "aspirational," "You only need willpower to be a champ" attitude, and what some saw as its racial subtext), which got it called "populist."

In what was then and is now seen as a gesture toward the broader moviegoing audience Rocky was given the prize over those other, more conventionally Oscar-worthy (more technically accomplished, more thematically ambitious, more substantial) films. And if it is less often spoken of than, for example, Jaws or Star Wars by those historians who recount the end of the New Hollywood, it does get mentioned from time to time as a sign of where film was headed, with some justice given the sequels. Looking back at Rocky IV, for example, one sees in the sequel-itis, the Cold War-ism, and the thoroughly "high concept" character of the product's look and feel and tone the "New" New Hollywood clearly arrived.

All the same, there is a surprise for those who look back from the sequel, or from Rocky's reputation as a marker of the decline of the New Hollywood, to the original. If Rocky IV purely belonged to the new era, "Rocky I" did not--with, instead of mansions and press conference halls, grand arenas and training facilities out of the future, the screen saturated with the shabbiness of working-class districts amid which people lead dead-end lives, while the movie actually looks, classically, like a movie. One can even quantify the fact. Where Rocky IV has an Average Shot Length (ASL) of a mere 2.3 seconds--shrinking toward 1 second in the training montages and fight scenes--the ASL was 8.5 seconds in the original, the sequel squeezing almost four times as many shots as the original into any given space of time, and watching it you really feel it.* Where the mid-'80s follow-up is a 90-minute music video intended above all to dazzle with light and sound, with synchronization of image and soundtrack, all very Michael Bay a decade before Bay shot his first feature, the original was a movie telling a story, whatever one makes of it, about people in places that were none too dazzling, in a way much out of fashion in major feature film by the '80s, and even more so now.

* Figures on ASL from the ever-handy and much-recommended Cinemetrics web site.

Of Media Bias and Technological Reportage

When most of us speak of "media bias," I think, we have in mind the "Big," mainstream media--the national-level operations that tend toward a "centrist" perspective.

The "political economy" of that media colors its coverage of every issue, of course.

The particular issue that interests me at the moment is its attitude toward technology.

As it happens this media seems to me utterly technophiliac when it comes to pushing the garbage business wants to sell the consumer--a pack of claqueurs when it comes to something like, for instance, "the Internet of Things" they wanted us all to be so excited about a little while ago.

By contrast they turn into a pack of sneering technophobes when it comes to technologies that might actually solve problems--like, for instance, anything at all that might help with a problem such as climate change, from cellular agriculture to renewable energy.

This can seem contradictory. But really it isn't. It is a natural outcome of a business that lives on selling fear, not hope; which is deferential to industries and their spokespersons that love to talk about "disruption" when singing the praises of "the market" but do everything possible to resist disruption in real life whenever it threatens their speculation and rent-seeking (sometimes because they actually own said media); and, with all a centrist's fearfulness of calls for change (and perhaps also their equation of pessimism with wisdom), committed to a narrative of apathy-inducing doomism precisely because it is apathy-inducing. Michael Mann rightly speaks of "inactivism" in this area--which seems their default mode toward everything in life, and toward technology ever making any change in it for the better.

Disentangling the Media's Biases: Center, Right and Left

In talking about "media bias" it may be helpful to discuss which part of the media we are talking about, as it is not all one thing.

Generalizing simply among national-level news outlets from the standpoint of the familiar ideological spectrum, one can speak of a "mainstream" media--that media which is most commonplace and visible and generally accepted as the baseline of discussion--of which the New York Times could be called representative. People often call it "liberal," or even "left," but it is far more accurate to call it "centrist."

Meanwhile there is a considerable media to its right, which we could identify with, for example, the Wall Street Journal or the New York Post.

There is also a body of media which can be considered genuinely "left." It is much smaller and less visible than the others, and generally shut out of a mainstream conversation which ordinarily acts, in part because it prefers to act, as if it did not exist (this is implicitly what happens when people call centrists, whose philosophy is actually deeply conservative and whose positions are ever further to the right, "the left"). As an example of a news site of indisputable left credentials, one which has now and then come to mainstream attention in the wake of the controversies over political-censorship-by-search-engine-algorithm and the 1619 Project, one might name the World Socialist Web Site. (If you've never heard of it, that just goes to prove the point about the left media's marginality.)

We do well to remember which of these we are talking about when we speak of the media and its "bias."

"Thank You For Your Support." Wait, What Support?

When considering the "aspiring" writer's lot I think time and again of Jack London's Martin Eden--the truest depiction of that unhappy state that I have ever encountered in print--and often of one particular aspect of that situation, its loneliness. This is not just a function of the fact that writing is a solitary and often isolating activity, but of the way that others treat someone going about it when they have not "made it"--specifically their lack of empathy, sympathy and respect, and their sense that they have no obligation to be tactful, or even civil, about what they really think. There is, too, what all of this means if one ever does get anywhere, as Eden does--for he does "make it," becoming rich and famous, such that the multitude of people who, knowing full well who he was, would not have looked twice at the sight of him starving to death in the street now all want to have him over for dinner, or even have him marry their daughter.

So I suspect does it go online--even at the level of merely giving people an absolutely cost-less "like" or "share" on social media and those other platforms akin to it. A celebrity can get an outpouring of enthusiasm for uttering pure banality--while a nobody can scarcely hope for support no matter what they say, a mere click of the mouse that could mean a great deal in their position begrudged them. It is a testament to the shallowness of our online life--and the extreme stupidity of so many of those who people it. And experiencing it this way takes its toll. In Balzac's Lost Illusions Lucien de Rubempre, callow and ultimately ill-fated as he is, does realize some hard facts of life along the way before his final destruction, among them not only how scarce a sympathetic word or gesture can be for those who have not "succeeded" ("In the home circle, as in the world without, success is a necessity"), but the toll that even those who attain "success" are likely to pay before they get there. ("Perhaps it is impossible to attain to success until the heart is seared and callous in every most sensitive spot.")

In that there may be a tiny grain of truth in those stupid scenes of writers with supercilious looks on their faces as they autograph copies of their latest for fawning idiots--the praise cannot touch them, because if they had not long ago stopped caring what other people think they would have given it all up, perhaps as completely as Lucien does.

Wednesday, March 15, 2023

The Rubble of the '90s

Over the years I have written a lot about the '90s--in part because the period encouraged certain illusions which have proven destructive, and increasingly appeared destroyed themselves by contact with reality. There was the way that those singers of the information age, so long trying to persuade Americans that what was obviously deindustrialization was actually progress, almost looked credible amid the tech boom of the latter part of the decade--enough so as to profoundly distort the national dialogue about the economy for an extended and critical period. There was the way that Americans were persuaded to believe the "unipolar moment" of uncontestable full-spectrum American hyperpower might go on forever. There was the way the public was persuaded that the Internet would be an ultra-democratic cyber-utopia, as political correctness triumphed over all--so much so that, as Thomas Frank recounts, right-wing populism seemed to be assuming a new face, not only trying to persuade the people that "the market" was the common man's best friend, but appearing to embrace wokeness, while declaring the market its greatest champion.

What people think of as the elite are, as I have said again and again and will go on saying again and again, careerist mediocrities who are profoundly out of touch with reality, as they show us again and again when relied upon to lead the way. Accordingly many of them may still believe in such drivel--still, to use Paul Krugman's words, seem "stuck in some sort of time warp, in which it's always 1997"--but not all of them do so, while credulity in such silliness is the weaker the further out one gets from that overprivileged little world they inhabit.

C.P. Snow's "The Two Cultures" in 2023

The essential claim made in C.P. Snow's lecture-turned-essay on "The Two Cultures" of science and letters was that this division of intellectual life in the modern West has gone with mutual ignorance, incomprehension and even dislike between those on either side of the split.

On first encountering C.P. Snow's essay I thought there was a good deal of truth to it and still do. After all, intellectual life is indeed extremely specialized, with the fact underlined by how outside their specialties even exceptionally intelligent and well-educated people often have only the most banal thoughts to offer (as they often show when accorded that kind of Public Intellectual status in which they are given a platform from which to comment on anything and everything). And all of this seems to me to have got worse rather than better with time, as their work has only become more esoteric and minute (how many scientists does it take to publish a single paper these days?), while the pursuit of an academic or research career has become a more harried thing, leaving less time or energy for "extracurriculars," which at any rate are frowned upon. (The "professions," as the root of the word suggests, are indeed priesthood-like in many ways, often living down to the unhappiest and most stultifying connotations of that term, like that disapproval of any outside interest, or indeed any ideas but the orthodoxy imposed by the hierarchs.)

This specialization may have gone so far that those in the sciences and letters alike are doing their jobs less well than they may need to be doing them--with scientists reportedly lacking sufficient grounding in philosophy to understand the scientific method on which their work depends, while, as I discovered struggling with composition textbooks and their usually clumsy explanations of so basic a matter as "What is a thesis?" (finding reference to the scientific method, and to scientific examples, useful in redressing such weaknesses, in the classroom and in my own book), I have often wondered if the writers would not have benefited from a bit of scientific study themselves.

Still, salient as the division within intellectual life remains there is also the political framework within which that life goes on. To his credit, Snow was sensitive to that. The cultural orthodoxy is a function of the priorities of those who have power--whose concern is first, last and always the bottom line, to which they recognize science as contributing. That is not to say that they actually respect the scientific method, or the work of scientists, even just enough to care about funding scientific education and research properly, let alone make use of the best available scientific advice when making policy--just that they know there is something here they can use, much more than is the case with letters.

All of this would seem to have become more rather than less consequential in the neoliberal era--amid unending austerity, with the pressure to justify everything in terms of short-term corporate profitability the greater, with a predictable result that where neoliberalism has gone, so too has the imperative to reduce education to occupational-technical training (from the Thatcher-era overhaul of educational funding, which has continued ever since, to the requirements of the "structural reform" programs foisted on developing countries). Meanwhile the politics of interest are ever more marginalized, the politics of status in the ascendant--with the resulting era of culture war seeing the humanities demonized, to the point that disdain for the work of historians and the like has right-wingers openly gloating over the prospect of the demise of humanities programs on campus.

It would be a profound mistake to imagine all this does not have consequences for how those of the sciences and letters see each other--with, perhaps, the young indicative of the trajectory. While I think that there was, again, much ignorance and incomprehension on the part of those in the sciences and letters alike, I never noticed any actual disrespect. I am less sure that this is the case today--certainly to go by what we hear of the sneering, mocking STEM major openly disdainful of the humanities, an attitude all too predictable in an era where the war cry of "STEM! STEM! STEM!" is unceasing, and unmatched by anything the defenders of the humanities have to say on their behalf.

Will the New Chatbots Mean the End of the Novelist?

Over the years I have remarked again and again on the reports that writers' wages--and certainly the wages of writers of fiction--have long been collapsing. The result is that, while the chances to "make a living" writing fiction (or anything else) have always been few, the path far from straightforward, the remuneration usually paltry relative to the effort and skill one puts in--and, one might add, society at large deeply unsympathetic to the travails in question (just ask Balzac, or London)--it does not seem unreasonable to say that matters have just gone on getting worse this way.

Consider, for instance, how even in the middle years of the last century there were writers who scraped by (sometimes did better than scrape by) selling short stories to pulp fiction magazines. Any such thing seems unimaginable today--in an age where writers have it so tough that they are giving away vast amounts of quite readable content, and finding no takers. Meanwhile what can be sold seems an ever smaller category as a good many old reliable genres die (consider action-adventure fiction, for example), and what does sell, if it has always relied on authors' platforms and "brand names" to persuade consumers to take a look (just remember the unpleasant truths spoken by Balzac's vile Dauriat), would seem to rely on those supports more and more heavily, while one is struck by how new platforms and new brand names are not emerging--hinting at the game being one of milking an aging and dwindling consumer base. (Just look at the paperback rack. The thriller novelists you will find there are pretty much the same ones you saw in the '90s--James Patterson, John Grisham and company, with Tom Clancy and Clive Cussler still writing, somehow, to go by the bylines.)

Indeed, it seems to me safe to say that the key factor here has been the shift in the relation of supply to demand. People spend a lot less time reading fiction than they used to do, their leisure time increasingly consumed by audiovisual media. (I suspect that in this century far more people have played Tom Clancy video games than read Tom Clancy novels.) Meanwhile, due to the widening of educational opportunity, the number of those who endeavor to become published authors has grown, with publishing fiction, while a nearly impossible task for anyone who does not have fame or nepotism on their side, still seeming more plausible than trying to make it as a writer in the more glamorous but more closed-off film and television industry. Still, whatever one makes of the cause, the result is inarguable--and the writers' troubles may well escalate if what we are hearing about chatbots bears out. Whatever writers, critics or readers may think, publishers--who are first, last and always businessmen, moving literature as they would any other good (again, I refer you to Balzac)--prefer to keep putting out the same old stuff that has had takers in the past, and it is inconceivable that they would balk at replacing writers with chatbots. For the moment, of course, they are barred from taking such a course by the limits of those bots, which are much better at producing short work than long (and, truth be told, do not produce even that very well). However, if we see the progress some expect, the bots may get to the point where a human would only be needed to clean up the result--and maybe do less and less clean-up over time--within a decade, and publishers will not hesitate to exploit the possibility (such that the Jack Ryan and Dirk Pitt novels of ten years hence, and the Alex Cross novels too, may be churned out by some new iteration of GPT). The only question then would be whether there will still be a big enough audience for those books to make the venture profitable.

The New Yorker Notices the Collapse of the Humanities on Campus

Having recently noted for myself the surge in STEM majors as enrollment in the humanities collapses, and wondered why we were not hearing more about either phenomenon, Nathan Heller's "The End of the English Major" caught my eye.

Alas, Heller's piece reminded me of much that I had to say about the quality of the coverage rendered by the mainstream media a while back. Here as elsewhere the journalist offers a few factoids, and comment culled from others presumed to have an understanding of the matter, with some interesting bits possibly to be found in the mass, but the whole less than the sum of its parts, many of which were not impressive to begin with. Too much of it consists of the writer recounting his wandering about the Harvard campus getting quotes from students and faculty of whom we are supposed to be in awe because "HARVARD, HARVARD, HARVARD!" ("[G]olden kids from Harvard," he calls the students. "Basic employability is assured by the diploma: even a Harvard graduate who majors in somersaults will be able to find some kind of job to pay the bills," Heller writes. Alas, not how it works, but that he says so is very telling about the level of thought to be found here.) Then after this he dutifully endeavors to force the bits all into the "brisk, forward-looking, optimistic" view to which our journalism, as so much else, is so unfortunately addicted. The result is that instead of a better sense of "What it all means" there was a rain of details of uncertain meaning and often no meaning at all, a good deal of obnoxious, shamelessly name-dropping elitism in its particularly grating "Cult of the Good School" form (HARVARDHARVARDHARVARDHARVARD!), and an ultimate complacency, all of which left me persuaded that I had wasted my time bothering with the thing.

The "Enshittification" of Innovation

Cory Doctorow recently wrote of "enshittification," a term that seems to be catching on. (I have since seen it used in such publications as the Financial Times.) It denotes a cycle of business activity in which businesses are "good to their users" at the outset, gain a consumer base which cannot easily depart because of "network effects" and "switching costs," and then exploit that "captive" consumer base--successfully for at least a time, though with (at least if the market works) the abuse of the consumer eventually catching up with and costing said businesses.

It is, of course, to be expected that with business following such a trajectory technological "innovation" will reflect the imperative--that companies will mainly think about how to more fully profit from abusing the consumer rather than set about the more difficult task of inducing them to use their service by producing things they would actually want to buy. And it would seem that the great wave of techno-hype of the '10s was exemplary of that. Consider what a buzzword "Big Data" became. Of course, one could envision the collection and analysis of data on a revolutionary scale and with revolutionary thoroughness yielding something significant. However, I remember that the analysts of these matters often talked about Big Data as some great boon to advertising specifically--talked about it breathlessly, and not only in the pages of some trade journal but in more general publications, as if people generally were supposed to be excited about advertising efforts being refined, and individually targeted to them.

Of course the great majority of people would not be excited about such a prospect. They do not want more collection and sifting of their personal data for the sake of more individualized advertising online. What they want is for Big Business to leave their data alone, and a better Ad-blocker so that ads will never be forced on them again. But one would never have guessed that from the coverage of these matters. (Consider, for instance, this New York Times piece, with its few cursory references to "privacy advocates" in an overwhelmingly enthusiastic item.)

The enshittification imperative, in confirmation of the view of the press as the "stenographers of power," was absolutely taken for granted here--and ultimately realized, advertising today indeed being individually targeted in this manner. (Thus when you look at your browser's privacy and security settings you are likely to see the higher setting come with a "warning" that the advertising you see will be less individualized for you--as if this were a bad thing!)

Thinking over this fact I find myself again recalling the disappointing record of "innovation" in recent decades. Along with the way that, for example, investors' love affair with tech was never the same after the "dot-com" crash, and the prevalence of short-termism in company decision-making and the preference of business for "sustaining" innovation over the "disruptive" kinds that people selling fantasies about gales of creative destruction love talking about, the fact that enshittification may absorb so much R & D effort may be a reason why technological progress has been so grindingly slow.

Tuesday, March 14, 2023

Notes on the 95th Academy Awards

I will be blunt--I haven't bothered to even try to watch the Oscar ceremony in more years than I will bother to specify (given that it would, of course, indicate just how old I am getting to be). Why bother, after all, when no real-life Oscar ceremony is anywhere near as entertaining as, for instance, the last act of the last of the Brooks-ZAZ wave of gag comedies that was actually funny, The Naked Gun 33 1/3rd?

Still, I do tend to check out the list of wins and losses the next day--and of course, check out film critic David Walsh's annual reportage on the event, the principal theme of which is usually the pretension, backwardness, hypocrisy and sheer stupidity of the Academy, Hollywood and the cultural elite of which they are a part (again, with this criticism usually more illuminating and entertaining than anything the ceremony offers--with Mr. Walsh not disappointing this year).

This year such thoughts as I have had about the ceremony have to do mainly with the movie that swept the awards, Everything Everywhere All at Once (EEAAO from here on out, so I don't have to keep writing out the full title)--which picked up Best Picture, Best Director, Best Original Screenplay and three of the four acting prizes (a near-perfect sweep save for Brendan Fraser's win for The Whale). At a glance EEAAO seems an unlikely contender for such an award, given that it not only has a sci-fi premise, and that of the more head-spinning variety, but, in even more pointed contrast with such prior sci-fi-tinged Oscar darlings as Michel Gondry's Eternal Sunshine of the Spotless Mind (2004), Spike Jonze's Her (2013) or Guillermo del Toro's The Shape of Water (2017), it has a conventional action-adventure premise (however unconventionally realized in some respects).

For all that I still suspected it had the best shot at the key prizes. After all, it was the most popular of the nominees apart from those included as sops to the broader public which everyone knows cannot win much outside the technical categories (Top Gun 2, Avatar 2)--and this in a moment in which popularity counts for even more than usual, with the film industry more than usually anxious about its survival after three financially catastrophic years, with the franchises so long carrying it looking ever more creaky, and with the Academy in particular fearful of a declining relevance that has only got worse since I took up the theme a decade ago. The movie also ticked off a fair number of the "wokeness" boxes. (Not every one of them, of course--the "status politics" of identity are by definition very zero-sum, and sheer mathematics makes it impossible to please everyone all the time--but all the same, it scored a lot of points this way.) The same factor made it unlikely that a movie with a less than wholeheartedly cheerleading attitude toward #MeToo would prevail (hobbling at least one competitor). And there is the other kind of politics, where, for all of the culture war hyperbole, a Hollywood that has pretty much always tended to be "safe" and conservative on most of the issues that mean something in actual people's lives is ever more "safe" and conservative--such that a foreign-language adaptation of an "anti-war" classic depicting the horrors of trench-and-artillery barrage warfare in the fields of northern Europe during a world war seems a very unlikely Best Picture winner these days (ruling out what many seem to have regarded as its worthiest rival, relegated instead to the far less prestigious Best International Feature category).

There also seems to have been a certain genuine affection and respect for the movie from even those who are not martial arts-loving "lowbrows," something a Best Picture winner ought to have, of course--though this kind of thing usually seems to come in last for consideration. Alas, such is Hollywood life in March 2023.

"The Irrelevance of Oscar Night?" Revisited

This year's Oscars had me looking back at a piece I wrote in 2012 about the ceremony's declining relevancy in the wake of that year's particular ceremony. I have since had occasion to question that critical comment. (David Walsh, who was himself no great fan of the 2012 Oscars, made an interesting case about the overstatement, and dubious motives, behind much of the criticism in the media after the March 2012 awards show.) Still, I do stand by what I had to say about the ceremony generally--its overlong and (as Ludlum might have had it) "robotic pavane"-like character, its being only one of an ever-growing host of awards shows, the gap between the kinds of movies that principally win Oscars and the ones "people actually see," etc.--as well as those more specific matters I discussed at length, namely the waning of the romance of cinema, the "cult" of the film star, and the centrality of Hollywood within the entertainment-media world, all of which seems to me even truer of the present moment than of a decade ago.

Certainly the decline of the theatrical experience's importance for the film-viewer has continued unabated. People still went to the movies about four times a year on average just before the pandemic, and I think that moviegoing is well along the course of recovering toward that level. Still, more than ever what people see when they go to the theater are the giant, flashy blockbusters--the movies that, as John Milius put it, offer a "cheap amusement park ride." The experience of character and narrative, their particular pleasures--the emotionally resonant experiences that become the stuff of romance and nostalgia--has ever less of a locus here. Instead, when people do experience them it is ever more the case that they do so while looking at a small screen, which has ever more become the place to go for "serious drama," and just about anything else but the superhero-type fare.

So has it also gone with the cult of the film star, whose decline I thought was an important factor here. Back in 2012 people were just noticing the failure of new stars to emerge. In 2023 the fact has long been taken for granted. Indeed, considering the list of most searched-for celebrities I am struck by how the figures of an earlier era retain a fascination for the Internet-using public completely overshadowing that of any newer celebrity. For example, Megan Fox's "it girl" status seemed unremarkable in the late '00s--indeed, the phenomenon was not only familiar and expected, but her time as such relatively short, due to a very hard fall as Hollywood and its courtiers and claqueurs in the entertainment press retaliated for her daring to speak ill of a major director when the "sacred" publicity rituals in which she was a participant called for flattering banalities--but there has been nothing like that run since. And indeed, if she would be very unlikely to make any list of the biggest stars in the world today, "Megan Fox" remains in the top 200 keyword searches at last check--with Megan Fox, in the ranking of searched-for celebrities, behind only Kim Kardashian, Carmen Electra, Jessica Alba, Jennifer Aniston, Lindsay Lohan, Jessica Simpson and Scarlett Johansson, among whom only Johansson would seem to owe much of the attention she gets to any films she has appeared in lately (and those, largely an extension of her having established herself in the Marvel Cinematic Universe as Black Widow way back in 2010). Meanwhile the absence of any newer personality in "the club" is conspicuous--very much a sign of the times.

And of course, it is ever less the case that anyone can pretend that Hollywood is the center of the entertainment universe--with China overtaking the U.S. as the world's biggest film market, and that market's ticket sales largely reaped by China's own productions (as the country's own industry finds itself in a position to back full-blown blockbuster-type fare); with the "Korean wave" going from strength to strength (Korean films routinely beating out Hollywood's biggest in its own considerable home market and scoring overseas); with the Indian film industry booming (as "Tollywood" joins Bollywood, the country big enough for two Hollywoods!); and with, even in the U.S., films based on popular anime series being reliable enough earners to get significant releases (Demon Slayer taking in $50 million in the U.S. itself amidst the pandemic as homegrown American blockbusters struggled to make as much!); among much, much else going on in just the cinematic world. Meanwhile in the streaming age the number of channels explodes toward infinity, with those international presences counting for just as much here--Korea's Squid Game becoming Netflix's biggest hit ever, as much of the world binges on Turkish soap opera--all as streaming companies unconnected to the old Hollywood studios, like Netflix and Apple, play an ever-bigger role in financing TV and movies at home and abroad. The result is that, even more than a decade ago, Hollywood seems an ever-smaller part of the total scene, and, if every now and then it recognizes that other countries make movies too (as when it awarded Parasite Best Picture in 2020), ever more provincial in its attitude--a principally national, traditional industry awards show in an ever-more globalized and variegated scene.

Still, it seems to me that this does not by any means exhaust the issue, with the politics of our time possibly mattering more now than they did in 2012. Again, popular culture has always been politicized--and in spite of the peddling of a nostalgia that, in contrast with that of cherished memories of movie magic in the theater, is undeniably pernicious, there has never been a time, at least in the memory of anyone living, when this or any other major society on the surface of the planet was not polarized. However, those conflicts would seem to have gained a sharper edge, with the U.S. no exception, and in such a way as to alienate many from the movie industry. Given the media's ever-attentiveness to the lamentations of the right what we are most likely to hear of is right-wing disgust with Hollywood "wokeness," but it is the case that other elements are at work, like a broader reaction against an elite--certainly to go by the reaction to the "discovery," or rather the umpteenth rediscovery, of Hollywood nepotism by the apparently profoundly clueless users of social media. Indeed, shocked by their shock I found myself remembering David Graeber's suggestion in his book of a few years ago that a good deal of the hostility toward the "coastal elite" from working people was, apart from the way in which they looked down on them "as knuckle-dragging cavemen," the way in which they monopolized any opportunity to make a living while "pursuing any other sort of value" besides pure money-making for themselves and their offspring, with Graeber singling out Hollywood for particular criticism. As he remarked, one is unlikely to find among "the lead actors of a major motion picture nowadays" anyone who "can't boast at least two generations of Hollywood actors, writers, producers, and directors in their family tree." Indeed, "the film industry has come to be dominated by an in-marrying caste." The results of that--which I dare say extend to the artistic output of an American film and television industry testifying to its being profoundly out of touch with, and even disdainful of, the broader population ("I can't pretend to be somebody who makes $25,000 a year," said an Oscar winner whose job is literally all about pretending)--should not be lightly discounted.
