Thursday, March 23, 2023

What are the Prospects for Controlling the Pace of Artificial Intelligence Research Through International Agreement?

A Follow-Up to "Ezra Klein on AI in the New York Times: A Reaction."

In making his case regarding the rapid pace and potentially dire consequences of artificial intelligence research, not only in the long run but in the fairly near term, Ezra Klein called for an international regime for controlling research in such technology, to the end of slowing it down--this in the absence of some acceleration of effective human adaptation to its results (or even alongside such adaptation).

While I do not share Mr. Klein's view on this particular matter, his premise that something he sees as a problem may be redressed is certainly healthier than the apathy-encouraging doomism that an exceedingly crass and corrupt media delights in inflicting on the public every minute of the day. Still, I have to confess that the record of international cooperation in recent decades on any issue of importance has not been . . . inspiring; the trend in political life for the moment seems to be away from the circumstances conducive to such cooperation, rather than toward it; and the nature of this issue may make it particularly tough to address in this manner.

Consider, for example, those problems that have most seemed to demand global cooperation--where a potentially existential danger to human civilization was pressing, and nothing less than the whole world working together would really do in meeting that danger. The two most obvious are nuclear weapons and climate change--with the failure the more pointed here because of what states officially committed themselves to in the most public way. Consider the text of the 1968 Treaty on the Non-Proliferation of Nuclear Weapons. This has the undersigned declaring "their intention to achieve at the earliest possible date the cessation of the nuclear arms race and to undertake effective measures in the direction of nuclear disarmament," with this explicitly including agreeing to work toward
the cessation of the manufacture of nuclear weapons, the liquidation of all their existing stockpiles, and the elimination from national arsenals of nuclear weapons and the means of their delivery pursuant to a Treaty on general and complete disarmament under strict and effective international control.
Over a half century on, just how seriously would you say "the undersigned" took all this? How seriously would you say they are taking it now--as the bilateral arms control agreements between the U.S. and Soviet Union of the Cold War era, at best steps in the right direction, fall by the wayside with the most uncertain consequences, and acknowledged nuclear powers like Britain and China expand their arsenals, racing to field a new generation of super-weapons straight out of a comic book super-villain's laboratory? Meanwhile, the latest United Nations report on climate change making the rounds of the headlines (among much else) makes all too clear how little governments have even attempted to do relative to the agreements they have actually signed, never mind what the scale of the problem would seem to demand.

By contrast, where artificial intelligence is concerned we would seem a much longer way from even such gestures in the direction of action as the 2015 Paris Climate Agreement--comparatively few being persuaded that any threat it poses is nearly so dire, with, not insignificantly, broad attitudes toward the issue differing greatly across cultures and nations. Significantly, it has often been remarked that in East Asia--a particularly critical region for the development of the technology--attitudes toward AI are more positive than in the West, with a Pew Research Center survey from December 2020 providing some recent statistical confirmation. (In Japan, Taiwan and South Korea, 65 percent or more of those polled said the development of AI was a "good thing" for society, and no more than 22 percent said it was a bad one--with, one might add, the Pew Center's polling in India producing similar numbers. By contrast in the U.S. 47 percent identified it as a "good thing" and 44 percent as a bad, the responses in fellow G-7 members Britain, Germany and Canada were similar, and respondents in France were even more negative. Even those Westerners most positively disposed toward AI, the Swedes, among whom the positive/negative split was 60-24, still fell short of the poll numbers of all of the Asian countries named here.)

Were major Asian countries such as Japan, India--or a China that plausibly sees the matter similarly--to refuse to sign such an agreement, it would be effectively a dead letter. However, I suspect that we would be unlikely to get anywhere near the point where there would be such a consensus even in the more AI-phobic West. It matters, for example, that positive views on the matter are correlated with education--suggesting that elites, with all this implies for policy, are more open to the idea of AI than the population at large. (In the U.S., for instance, 50 percent of those with "more education" who were polled thought AI a good thing, as against 41 percent of those with "less education.") It matters, too, that the young are more inclined to positive views, suggesting such attitudes will become more rather than less common in the years ahead. (Those below median age were more likely than those above it to think AI a good thing, by a margin of 49 to 44 percent.) The result is that increasing openness to AI, rather than the opposite, seems to be the trend in the West. (Meanwhile, if the trajectory in Asia parallels that in the West--as it actually seems to do by at least as large a margin, with, in Japan, the more educated 10 percent more likely than the less educated, and those below median age 17 percent more likely than their elders, to think AI a good thing--we would see an even more positive attitude there, with all that implies for Asian attitudes toward the technology going forward.)

I suspect it would take a great deal to change attitudes here--especially among, again, the more favorably inclined policymaking groups, the more in as governments and business presently seem to hope for so much from artificial intelligence (which, if experience in areas like climate change is anything to go by, will very easily outweigh even very strong reservations among the broad public). Certainly in the United States a Wall Street embrace of AI such as some now speculate about would make any action seriously restricting it all but unimaginable, to go by recent political history, while the deference of officials to such interests aside, concern for plain and simple state power will also be operative around the world. After all, we live in an age in which realpolitik is not declining but intensifying, with one dimension of this being intensifying military competition--and susceptibility to technological hype in the military sphere, at least one world leader of note having publicly declared that the nation leading in artificial intelligence will be "ruler of the world." Such a statement is not to be taken lightly, the more in as he has just said what others are doubtless thinking, not least because countries facing a situation of geopolitical disadvantage they regard as intolerable (and many do) may see in AI their best hope of "leveling the field." Meanwhile, there is the humbler matter of economic productivity, perhaps the more urgent at a time of economic stagnation at best (or, depending on how one does the math, collapsing output) and mounting economic "headwinds"--with, again, East Asia worth mentioning here given its stagnant, aging, even shrinking populations. To cite simply the most dramatic example, Japan has seen its working-age population collapse over this past generation, falling from 87 million in 1994-1995 to 73 million in 2021, with no end in sight. For a country in such straits, the prospect of AI-powered productivity coming to its economic rescue cannot be overlooked.

All that, too, will factor into the likelihood of an international AI control regime--or the opposite--and the likelihood of states abiding by it. Moreover, even were it possible to institute an agreement in relative good faith, enforcement would be extremely challenging given the character of AI research. While CNBC remarked that the technology's development was "expensive," the figure cited for developing and training a program like GPT-3 may have been in the vicinity of $4 million.

That's it. Just $4 million. This may mean that the average person cannot afford to finance a "startup" such as this out of pocket, however much they would like to "get in on the ground floor" here. But by the standards of high-technology corporate R & D this is nothing. Repeat, nothing. (According to a report from a few months ago some $75 billion--enough to fund nearly 20,000 GPT-3s--has been spent on self-driving technology--likely with less result in the way of a useful product.) Still more is it nothing next to the resources governments have, and are willing to apply, to those ends they prioritize. (The Pentagon is requesting $130 billion for a single year's R & D.)

When the "barrier to entry" is that low it is that much more difficult for an international regime to effectively enforce the rules. Indeed, any number of wealthy individuals, who think little of spending nine figures on grand homes and gigayachts; and many of whom are famed for their transhumanist inclinations; could easily afford to fund such research on a personal basis even where they could not do so via their command of their business empires. Moreover, a program of this kind would be relatively easy to keep secret, given the nature of computer research--in contrast with the physical facilities required for the development, production and deployment of, for example, an arsenal of nuclear missiles, with all the temptation that the possibility of discretion, as well as cheapness, affords.

Of course, it may be that no obstacle is wholly insuperable. However, separately, and still more together, these obstacles are extremely formidable--enough so to have consistently defeated aspirations to meaningful cooperation in the past. Moreover, if Mr. Klein is right, the proverbial clock is ticking--with perhaps not very much time left before the question of the "opening of the portal" becomes mooted by the actual event.

If one takes that prospect seriously, of course.

Wednesday, March 22, 2023

If the Cinematic Blockbuster Goes, Then What?

For decades now film--especially to the extent that we identify film specifically with what people see in a theater--has seemed to be ever-more dominated by the cinematic "blockbuster" spectacles of essentially one of two types: either "family"-oriented animated features tending toward comedy, and especially musical comedy; or action-adventure that has tended to be science-fiction or fantasy-themed, exemplified by the superhero movie. If anything, the pandemic has sharpened the tendency in this direction.

Of course, that raises the question of what will happen if these blockbuster genres go the way of, for instance, the "serious drama" or the romantic comedy--a prospect that seems far from implausible. After all, those narratively limited forms of filmmaking are fairly well-worn now, while the particular franchises that Hollywood so favors are ever more exhausted (witness Disney's troubles with Star Wars and Marvel, and, in the wake of Warner Bros.' failures with the DC Universe, the desperation surrounding the talk of "James Gunn to the rescue!"). Meanwhile the big studios that finance such projects are in a financially battered state (with things perhaps getting worse before they get better, assuming they ever do get better). Between falling returns and the shakiness of the investors' portfolios, something could give, with this including the readiness to put up $300 million or more for the production, promotion and distribution of a big movie.

In such an eventuality the most obvious outcome would, in spite of the desire of the film studios to double down on theaters (and the income from $20-a-ticket sales), seem to me to be the continued decline of the theaters--with such venues becoming less numerous, and, to go by what we are hearing of their recent business practice in these very lean years for them (in 2022 the box office gross, if recovering, was, even after Top Gun 2, Marvel's diminished but still not insignificant earnings from its movies, and the release of Avatar 2, still just half what it had been pre-pandemic), becoming more "special events" venues or general entertainment centers than houses principally oriented to screening new releases for a steady flow of filmgoers.* Meanwhile, with theatrical income ever less to be relied upon--and theatrical release and the publicity it justifies financially less reliable as a way of competing in the "attention economy"--one could expect the downward pressure on production budgets seen in non-blockbusters these days to continue. Just what that will mean for the content of film, however, will depend on the progress of cinematic technology--and perhaps in particular the possibility of making a TV show look like an "A" picture. The more enthusiastic assessments of the small-screen Star Wars incarnations of recent years generally credit those series with delivering that, and ultimately it may be here that action-adventure of that type will live on; while, if seemingly a more remote prospect, the quality of animation on the small screen may begin to approximate that on the large.

* Together Dr. Strange 2, Thor 4 and Black Panther 2 made $1.2 billion in North America--perhaps not all that had been hoped for, but far from trivial, and indeed the kind of money that for most of 2020 and 2021 producers could only fantasize about (as when Black Widow, The Eternals and Shang-Chi, coming out in the five-month July-November 2021 period, failed to crack $600 million between the three of them). And of course, 2022 also saw Marvel take in another $200 million via Spider-Man's ticket sales from New Year's Day on.

Headwinds for the Cinematic Blockbuster?

When considering the way the movie business has become more blockbuster-centered it seems one should not only be attentive to the way in which blockbusters crowd other kinds of fare out of the market (driving it to streaming, and, usually, to obscurity on streaming), but also to the frailty of the blockbuster model itself, and what that frailty means for the industry now so reliant on it. In particular I see that frailty as leaving it susceptible to at least four "headwinds" worth discussing:

1. The Continued Decline of Theater-Going.
As of 2023 people are returning to their pre-pandemic theater-going habits. However, it is worth remembering that these habits were already under strain before the pandemic, which seems to have encouraged the tendency to not come out for anything but the biggest "tentpoles." One may add that the pandemic has meant that when people are willing to "come out" there are fewer places for them to go. Actual theater closures have been less numerous than I once thought they might be given the way ticket sales crashed during the pandemic and have since stayed depressed--the total number of theaters shrinking by less than a tenth. Still, the closures have likely taken a certain amount of custom with them (as some people have lost a favorite theater, now have to travel a longer way to get to any theater, etc., making them less easily persuaded to go), while one cannot be sure the closure wave has run its course, especially given the broader economic situation. Besides the obvious implications of downturn and rising interest rates for theaters, as for all other business, there is what these mean for the moviegoing public. In brief, between inflation and recession those who are seeing movies again may be more careful with their money, while, one should not forget, for many the trip to the theater (the pricey tickets, gas and parking, the cost of obscenely marked-up concessions, etc., as against staying home and watching streaming as they eat what they have in the refrigerator) is a luxury, with all that means for the viability of the remaining theaters. All of this may portend further disruption of an ever-more fragile habit of theater-going--and of the revenue stream from $20 theater tickets that Hollywood cannot get along without in anything like its current form.

2. The Accumulating Holes in the Movie Industry's Balance Sheets.
In considering the matter of interest rates it is worth acknowledging that if this has been a problem for theaters it has probably been a bigger one for the studios. During the period of ultra-loose monetary policy central bankers made giving away money to investors a cornerstone of the growth model--and business gorged itself on the effectively free money, the would-be streaming giants no exception as they funded vast amounts of "content" for the sake of building up a subscriber base, with relatively little thought for the limits to its profitability. Add to that the revenue collapse associated with the depression of theater-going--three years old as of this month, and counting--with its pressure to borrow that much more cash. Of course, the surge of inflation reduced the "real" weight of studio debts--but the more recent rise in interest rates meant the cheap borrowing was at an end. Between that debt accumulation and the higher cost of borrowing something has to give--and indeed it has. Certainly we see that in the greater reserve toward streaming at Netflix, WBD and Disney, among others. But, even as the studios affirm their commitment to theatrical release in the wake of their unhappy experiments with sending big-budget movies to streaming on their opening weekend in theaters, it seems far from implausible that all this is factoring into studio thinking about the money they put up for those theatrically released features. This may not make them more willing to invest in small movies--but they are still likely to be more reserved toward big ones than they were before, with that tendency reinforced by the fact that their prospects with audiences and investors alike may be shrinking.

3. The Fragmentation of the Global Film Market.
Amid the wretched, crisis-ridden performance of the neoliberal economic vision, worse than ever this past decade; the resurgence of dissent against it that has substantially taken, or been channeled into, nationalistic forms; and the deterioration of great power relations, exemplified by the fighting in Ukraine and the looming danger of war in the Far East; the world economy is "deglobalizing." It is unclear how far this process will be allowed to go in this deeply interconnected world (even now the level of global economic integration represented by trade and investment flows has no precedent from before the twenty-first century), but cinema (first and foremost an economic enterprise to those in charge of the industry) has certainly not been untouched by that process. The result has been diminished access for Hollywood to critical international markets, with all that spells for its earnings (Marvel's Phase Four being shut out of China, for instance, likely cost it its cut of hundreds of millions of dollars' worth of ticket sales and much else in that country). It has also reduced access to sources of financing for such movies. (Who would have thought that in making even Top Gun 2 Hollywood would have got financing from Chinese and Russian investors?) The result is that the Hollywood studios are not only cut off from ticket-buyers on whom they depend, but from important sources of capital, just as, again, rising interest rates have put the squeeze on domestic funding--and the number of safe investments for financiers to make dwindles.

4. The Wearing Out of the Old Franchises--and the Lack of New Ones.
In our time it is a truism that businesses all across the media spectrum have in recent decades preferred to mine old successes rather than prospect for new riches. (Consider just how old just about all those comic book superheroes are. Happy eighty-fifth birthday, Superman!) These days the tendency seems to be catching up with them as just about every one of those franchises looks old and tired--and indeed, even the genres to which they belong look old and tired. After all, if the current superhero boom can be traced back to 2000's X-Men, then we have had nearly a quarter of a century of big-budget A-list superhero features as a standard and increasingly large part of our cinematic diet--an extremely long run for any cinematic fashion, with this exemplified by how studios have fallen into the habit of telling and retelling and re-retelling the same stories about the same characters over and over and over again. (Thus it is not just the case that we have more X-Men movies, but we have seen remakes of the origin stories of both Batman and Spider-Man, while the X-Men's Dark Phoenix saga is now a twice-told tale.) Even the claqueurs of the entertainment press can no longer pretend otherwise--while, as Star Wars and Marvel and DC look ever-creakier, it is far from clear that anything else can serve in their places amid all those long years of no one even trying to make anything new.

I suspect that the irony goes right over the heads of the vulgarians calling the shots. Still, limited beings that they are, they are not--in spite of the PR efforts that only the stupidest person would believe, and the suck-up writing of courtiers who would tell us these people actually care about movies, or anything else but their salaries--wholly oblivious to the fact that they have been foisting stale garbage on the public over and over and over again. Certainly, to go by the revelations of the Sony hack of a decade ago, the executives were aware of what they had on their hands with the Bond film Spectre long before disappointing reviews and disappointing box office revenue confirmed it for everyone. Some of them may well be feeling the same way about their handling of more recent high concept action-adventure franchises--may have caught on to the fact of their being run into the ground the same way they ran "the one that started it all" into the ground. But even if this is so I have no expectation whatsoever of their changing tack anytime soon--just of their getting tackier, as they ever more shamelessly serve up the same refuse to ever-diminishing returns, financially as well as creatively.

Tuesday, March 21, 2023

Is an Artificial Intelligence Bubble Emerging?

Back at the turn of the century I was intrigued by the flurry of predictions regarding the progress of artificial intelligence most popularly identified with Ray Kurzweil. (Indeed, all this was a significant source of inspiration for Surviving the Spike.) Of course, all of this looked wildly off the mark a decade on, and I declared AI a bust--thinking in particular that Kurzweil had simply expected too much of the neural net-based pattern recognition on which so many of his other predictions hinged (while also being overoptimistic about the progress toward other relevant technologies like carbon nanotube-based 3-D chips potentially permitting computers three orders of magnitude faster).

Of course, in the mid-'10s there was a new burst of excitement based on progress in exactly the key area, neural nets and their training to recognize patterns. At my most optimistic I admitted to
find[ing] myself increasingly suspecting that Kurzweil was accurate enough in guessing what would happen, and how, but, due to overoptimism about the rate of improvement in the underlying technology, was off the mark in regard to when by a rough decade.
Suspecting it enough, at least, to be more watchful of a scene to which I had paid little heed for many years--with, I thought, how self-driving cars fared a particularly good indicator of just how substantial, or insubstantial, the progress was. Whether or not an artificial intelligence could autonomously, reliably, safely navigate a large vehicle through a complex, variable, unpredictable urban environment seemed to me a meaningful test of progress in this area--and its passing the test reason to take the discussion seriously.

Alas, here we are in 2023 with, even after years of extended time on the test, such vehicles decidedly not passing it, and not showing much sign of doing so very soon--the hype, in fact, backfiring here. Meanwhile other technologies that might have afforded the foundations of later advance have similarly disappointed (the carbon nanotube-based computer chip remaining elusive, and nothing else any more likely to provide a near-term successor to silicon). Indeed, the pandemic drove the fact home. In a moment when business strongly desired automation there was just not much help on offer from that direction, the event in fact a reminder of how little replacement there has yet been for human beings physically showing up to their workplace to perform innumerable tasks with their own hands.

The resulting sense of very strong, very recent bust made it very surprising when these past few months the media started buzzing again about AI--and a tribute to the unsophistication of our journalists given the cause of the fuss. Rather than some breakthrough in computer chip design promising much more powerful computers with all they might do, or in the fundamental design of neural nets such as might make them radically more efficient learners (enabling them to overcome such critical "bottlenecks" to automation as "perception and manipulation" or "social intelligence"), the excitement is about "chatbots," and in particular a chatbot based less on fundamental design innovation in the underlying neural network than on simply making the network bigger for the sake of "language prediction." Indeed, those less than impressed with the chatbots term them, unflatteringly but so far as I understand it not unfairly, glorified autocomplete functions.

Of course, it may well be that a better autocomplete function will surprise us with just what it can do. We are, for example, being told that it can learn to code. Others go further and say that such chatbots may be so versatile as to "destroy" the white collar salariat in a decade's time--or even warn that artificial intelligence research has reached the point of the Lovecraftian horror story's "meddling with forces we cannot possibly comprehend."

In that there seems to be the kind of hype that creates bubbles, though admittedly many other factors have to be considered, not all of which fit the bubble-making pattern. After all, this has been a period of rising interest rates and tightening monetary policy and falling asset values relative to liabilities that have come to be piled very high indeed amid cheap borrowing and economic catastrophe--hardly the sort of thing that encourages speculative euphoria. Still, as banks start failing it remains to be seen just how stalwart the central bankers will prove in their current course (a course on which, to judge by their "stress tests" for the banks, they do not seem to have ever planned to go, and which they doubtless follow only with great reluctance). Moreover, there is, just as before, a very great deal of money out there that has to be parked somewhere, and still very few attractive places to put it (this is, for instance, why so much money was poured into those crappy mortgage-backed securities!), while, even as the press remains addicted to calling anyone with a large sum of money or high corporate office a "genius" (Ken Lay was a "genius," Jeffrey Epstein was a "genius," Elizabeth Holmes was a "genius," Sam Bankman-Fried was a "genius," on and on, ad nauseam), they consistently prove to be the extreme opposite, with that going the more for those who so eagerly jump on the bandwagon every time. It matters here that said persons tend to be perfectly okay with the prioritization of producing high share values over the creation of an actual product long central to such scenes ("The object of Fisker, Montague, and Montague was not to make a railway to Vera Cruz, but to float a company"), expect that if worse comes to worst there will always be a greater fool, and believe that if they run into any real trouble they will have the benefit of a bailout one way or the other, usually rightly.

The result is that, even if the money reportedly beginning to pour into artificial intelligence really is a case of people getting excited over an autocomplete function--and, I will add, even with very little prospect of anything here to really compare with the insanity of the '90s dot-com bubble for the time being--I still wouldn't advise betting against there being a good deal of action at this particular table in the global casino in the coming months and years.

Sunday, March 19, 2023

Shazam! Fury of the Gods' Opening Weekend: How Did it Do?

The numbers are in for Shazam! Fury of the Gods' opening weekend--and as it happens the film's box office gross is below even the subdued earlier expectations. The movie has taken in a mere $30.5 million domestically, with the global take $65 million, and the reaction suggesting that (in contrast with, for example, Avatar 2) a not-quite-what-was-hoped-for opening weekend will not be redeemed by strong legs over the coming weeks.

In delivering the news Variety reports that the film's production budget was $110 million+, and the promotional budget (not very surprisingly, given the tendency of the latter to match the former) another $100 million+, indicating over $210 million has been sunk into the movie. The result is that to make that back in its theatrical run the movie might have to take in well over $400 million, and likely more than $500 million--considerably more than the original Shazam grossed.

By contrast the sequel seems likely to make a good deal less in the wake of the opening weekend's numbers--in the absence of better legs, or a stronger international response relative to its domestic take than its predecessor enjoyed, finishing up under $80 million domestically, and perhaps not even making $200 million globally.

Even allowing for the studio's having laid out less than the full sum that went into making and marketing the picture, and there being revenue besides the box office gross, Shazam 2 looks a long way from breaking even, never mind success by any other measure. Indeed, its numbers can make it seem as if Ant-Man 3 did not do so badly by comparison--though in considering any such possibility one has to remember that movie may have cost twice as much to make and market.* Additionally, where Shazam 2 can seem a matter of Warner Bros. (to use the terminology of television rather than film) "burning off" the last "episodes" of its DC Universe "show" prior to the much-ballyhooed James Gunn-led overhaul of its management of the franchise, Ant-Man 3 was to be the start of Marvel's Phase Five, which Disney/Marvel desperately need to go very well after the disasters of the last three years (the pandemic's battering of the movie business generally, Marvel Phase Four's underperformance, the losses amassed in the foolish splurging on streaming, etc., etc.).

The result is that these two releases, which came just a month apart, testify to crisis in both of the great comic book film adaptation universes--and in the larger industry that has become so dependent on them.

* This weekend Ant-Man 3 took in about $4.1 million (another drop of more than 40 percent) and seems to me still on track to finish up with about $220 million or less.

Saturday, March 18, 2023

Just How Well Would Shazam 2 Have to Do to Be a Winner?

What would it take for Shazam 2 (Shazam! Fury of the Gods) to be a hit?

None but the insiders can say that about this or any other film with any great precision given the opacity of film budgeting and revenue. Yes, we are often given figures for production budgets--but these both overstate and understate the expenditure. Dubious accounting can be used to either end--overstating a budget to make sure "there is no net," understating it to make sure there is net enough for bonuses to the principals. Also on the side of overstatement one may add that as a practical matter government subsidies and product placement often cover part of the cost; and on the side of understatement the bill for publicity--which in this utterly insane attention economy matches the cost of production.

My response to this is to play it safe and to assume that whatever the production budget was, twice as much was ultimately spent on the movie by somebody.

Meanwhile there is the matter of the revenue. We hear that a movie "grossed" so much. But that is the total in ticket sales, much of which may not go to the studio. Theaters keep their share of the revenue (typically getting a higher cut the longer they run a movie in their theater), while where overseas earnings are concerned there are foreign distributors to think of--all as the details of particular deals may vary. (For instance, a studio with what everyone thinks is a sure-fire winner may demand an exceptionally large share of the opening weekend sales.) No one knows how the staggering number of not-so-little deals entailed in getting a movie screened all over the world will wind up working out until they actually do work out. Still, the studio usually gets the equivalent of 40 to 50 percent of the gross.

Of course, there are other, non-theatrical revenue streams. Like merchandising. And TV rights. But the value of these is often related to the expectations, and actuality, of a movie's theatrical run--the merchandise for a movie no one liked not likely to sell very well. Moreover, if those revenues are often hugely important, as the straight-to-streaming experiments of the last few years demonstrated, there is little that can compete with a $20 ticket for cash-in-hand. The result is that a profitable theatrical run generally remains the standard of judgment, the movie that (these days, at least) only covers its cost after years of money trickling in from other sources understandably deemed a failure financially.

So, back to the 40-50 percent figure. According to my rule of thumb, that 40 percent should equal twice the production budget--which is to say that the global gross should come to five times that budget.

With Shazam we have a movie with a budget reported as $100-$125 million. That could see the movie be profitable with $400 million taken in (or even less, depending on how it was financed). However, it would seem safer to call the movie profitable if it crosses the $600 million mark.
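For what it is worth, the arithmetic behind those figures is simple enough to set down in a few lines of Python--a minimal sketch of my own rule of thumb as stated above, nothing more, with the function name and the arrangement of the figures mine:

def break_even_gross(production_budget, studio_share):
    # Rule of thumb: total spend (production plus promotion) runs about
    # twice the production budget, and the studio keeps only
    # studio_share of the global box office gross.
    total_spend = 2 * production_budget
    return total_spend / studio_share

# Shazam! Fury of the Gods: budget reported as $100-$125 million
for budget in (100, 125):          # millions of dollars
    for share in (0.50, 0.40):     # the studio's cut of the gross
        print(f"${budget}M budget, {share:.0%} share: "
              f"break-even at ~${break_even_gross(budget, share):.0f}M gross")

The $100 million budget with a 50 percent share gives the $400 million lower bound; the $125 million budget with a 40 percent share gives the roughly $600 million safer mark.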

Does it have much chance of that? Well, Shazam took in $366 million in early 2019. This is, to go by the Consumer Price Index, about equal to $430 million at last (February 2023) check. Should the new movie merely match the original's performance it might clear the lower bound--but I suspect it would still fall short of really being a success.

Meanwhile the actual expectations for the new movie's gross are . . . somewhat lower. The current Boxoffice Pro prediction for the opening weekend is $35 million. Should the film, like many a sequel to such a movie, take in forty percent of its money on opening weekend, it would have a hard time grossing $100 million in North America. Should the balance between domestic and international gross approximate that of the original, one would see the movie make perhaps $250 million--rather less than its predecessor globally as well as domestically. And it is far from implausible that it could do less business than that.
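Again the guesswork reduces to arithmetic--a sketch only, on the assumptions just stated, with the original's roughly $140 million domestic take (my addition, from the usual box office trackers) supplying the domestic/global split:

# Back-of-the-envelope projection: the $35M opening is 40 percent of
# the domestic total, and the domestic/global split matches the first
# film's (roughly $140M of its $366M global gross--the domestic
# figure is my addition, from the usual box office trackers).
opening = 35.0                    # millions, Boxoffice Pro's prediction
domestic_total = opening / 0.40   # ~$87.5M: "a hard time grossing $100 million"
domestic_share = 140.0 / 366.0    # ~38 percent of the original's gross earned at home
global_total = domestic_total / domestic_share
print(f"projected domestic ~${domestic_total:.0f}M, global ~${global_total:.0f}M")

The projection comes out to roughly $230 million globally--in the vicinity of the "perhaps $250 million" above.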

The result is that, while there was never a very good chance of a Shazam 3 given the big shake-up at DC, I doubt the executives at WBD will be tempted to give the franchise a third shot very soon.

Comparing and Contrasting Box Office Underperformance: Black Panther 2 vs. Ant-Man 3

When looking over Black Panther 2's box office numbers back in the autumn I argued that there was a case for regarding the film as a commercial failure--as it had made, in inflation-adjusted terms, only half what its predecessor had, while given its big budget (reported as $250 million) it may have fallen short of covering its cost by the end of its theatrical run.

While writing for the Internet tends to have a very short half-life, I have seen some of my posts here go on steadily accumulating views (and even comments) for years, and others get very little attention after their initial appearance. Comments on "this weekend's" box office, to no one's surprise, tend to go in the latter category. The result was that I was surprised when that particular item had a significant uptick in views months after the fact. Perhaps this was a function of the fuss over the Oscars (which left some feeling that Angela Bassett's performance in that movie was snubbed when Jamie Lee Curtis took home the statue). However, it also seems likely that this is a matter of how the next Marvel movie, Ant-Man 3, became an object of a similar "hit or flop?" argument--a more conspicuous one.

It seems to me that one can say that both movies' performance was not all that was hoped for by their backers and fans given the expectations ordinarily attaching to Marvel releases, the particularly strong reception for the original Black Panther, the resources invested in the movies--and of course, the feeling that Marvel was overdue for a "win." However, people seemed more inclined to say this in the case of Ant-Man 3, with this, the culture wars aside, a matter of the raw numbers. Black Panther 2 took in over $400 million domestically and $800 million globally--numbers not ordinarily associated with the word "flop," with, again, my suggestion that it could be seen as a failure a matter of contrast with the extreme success of the first movie, and the extremely high cost of the second. By contrast Ant-Man 3 (which is not very much less expensive than Black Panther 2, budgeted as it is at $200 million) has, after a month, barely scored $200 million in North America, while its prospects of reaching the half billion dollar mark are in doubt. Again, those numbers are not ordinarily associated with flops, but, helped by a less-strong-than-hoped-for second weekend, more easily let the media commentariat come to that conclusion.

Ant-Man 3's Underperformance and the Trajectory of Marvel

In the view of the analysts I have read the disappointing box office performance of Ant-Man 3 was a measure of the viability not only of a relatively minor superhero movie, but of the bigger Marvel Cinematic Universe of which it is a part--with the disappointment the more significant because of how the franchise has fared in recent years. Where the franchise once went from strength to strength, all the way up to the two-part confrontation of the MCU's assembled heroes with Thanos (a two-film, $5 billion-grossing event whose second part became the "highest-grossing film in history"), it has been very different with the subsequent "Phase Four." Certainly its problems (and particularly those of its first three movies--Black Widow, The Eternals, and Shang-Chi and the Legend of the Ten Rings) have been partially attributable to the pandemic, and at the international level, also to Phase Four's shutout from China's market. However, it is undeniable that the content itself has been an issue, with anything after Phase Three's build-up to and realization of the battle with Thanos an anticlimax--no "going bigger" really plausible here--while there was a pervasive sense of the pursuit of diminishing returns, be it in the turn toward prequels with Black Widow, to more obscure characters in The Eternals, or the all too characteristic weariness and tonal confusion of a series gone on too long in Thor: Love and Thunder, even before one considers such awkwardness as the excessive eagerness of Disney to tie its feature films to its small-screen streaming series in Dr. Strange 2 and the presentation of a Black Panther sequel without . . . Black Panther. The result was that only the Spider-Man sequel, which was again a one-of-a-kind event in its drawing together the stuff of three separate big-screen Spider-Man series into one sprawling "multiverse" narrative, really felt like an event. And unsurprisingly, only that one performed really "above and beyond" at the box office (in fact becoming the movie that, with almost $2 billion banked globally, proved "filmgoing is back").

The result has been that even the claqueurs of the entertainment press no longer applaud Marvel so loudly as they once did, instead admitting that all is not well. One could see this as setting up audiences for a story of fall and redemption ("Hooray for Phase Five! Hooray for the return of Bob Iger! And boo Bob Chapek and Kevin Feige! Boo!"). However, Ant-Man 3 never looked to me a particularly likely source of redemption (even if it succeeded it was a long shot for the kind of $2 billion+ success that would really make people say "Marvel is back!"), and, again, it has only contributed to the sense of a franchise in trouble.

Of course, one should also qualify that. Before the (justified) turn of the view of Ant-Man 3 as hit into a view of it as flop the film did have that strong opening weekend, testifying to the continued readiness of a significant audience to come out for even a second-string Marvel movie. The drop-off after that first weekend suggests that they were less than impressed with what they saw--and that rather than people giving up on Marvel, Marvel is simply doing a poorer job of satisfying the expectations it raises, such that they will in time be less likely to come out that way once more. After all, as we have been reminded time and again, no franchise is bulletproof, with Star Wars now reminding us of the fact--and Marvel, which in its dominance of blockbuster filmmaking for so many years looked like Star Wars in its heyday, perhaps beginning to remind us of Star Wars in its conspicuous and advanced decline.

Is Ant-Man 3 a Flop? Crunching the Numbers

When Ant-Man and the Wasp: Quantumania came out on President's Day it looked like a hit, with its $106 million take in the first three days and the $120 million it picked up over the four-day weekend at the North American box office--a feat that seemed the more impressive in as Ant-Man has always been a "second-string" character. (Ant-Man is not Spider-Man, after all. Or Iron Man or Black Panther or Captain America or Captain Marvel or Thor or even Dr. Strange . . .) Still, a couple of weekends later the view of the film as "hit" (commercially, at least; the critics liked it a lot less than its predecessor) gave way to a view of the film as "flop," not only in the eyes of the professional Marvel-bashers (e.g. those who hate Disney's "woke" turn and are eager for evidence of the audience rejecting it), but even quite mainstream observers ordinarily well-disposed to the brand.*

Of course, those discussing these matters for a broad audience rarely crunch the numbers in any rigorous way, at best citing a figure or two, but the truth is that this is easy enough to do on the basis of conveniently available data--like the stats at Box Office Mojo--with an obvious place to start being what prompted that change of opinion, namely the sharp drop in gross between the first weekend and the second. That drop was indeed dramatic, some 70 percent between the first and second Friday-to-Sunday periods. Still, one should put it into perspective. Ant-Man 3 was a "threequel"--and not the long-awaited follow-up to some record-breaking "New Classic," but the successor to the not-so-long-ago last film in one of the weaker component franchises of the exceedingly prolific Marvel Cinematic Universe--with that movie also debuting on a four-day holiday weekend. One expects such a film's take to be very front-loaded indeed--and the gap with the second weekend to show it--as seen with the preceding Ant-Man film, which had a 62 percent drop between the first and second weekends. The 70 percent drop Ant-Man 3 had does not seem very much worse in the circumstances (and even less so when compared with the franchise's other post-pandemic releases).**

Still, if the significance of the drop can be exaggerated, one ought not to rush to dismiss it either. The days and weeks that followed bore out the impression of a faster fade for Ant-Man 3, with the result that 28 days on it was, if running ahead of Ant-Man 2 in current dollar terms ($202 million to $189 million), about a ninth behind it in inflation-adjusted, real dollar terms (the prior film's $189 million more like $226 million today)--the decline the more striking as Ant-Man 2 was, again, not too leggy itself. This is still not a disaster, but no great success either, given what it bodes for the final take. Ant-Man 2 finished with about $217 million, or $259 million adjusted for inflation. Standing at $202 million as of last Thursday, with the movie unlikely to make so much as $5 million over this weekend, it is all but certain to finish well short of Ant-Man 2's inflation-adjusted gross, doing well to finish up in the vicinity of $220 million.
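For transparency's sake, the inflation adjustments in this and the neighboring posts are nothing more exotic than Consumer Price Index ratios--a minimal sketch, with approximate CPI-U index values supplied by me (the Bureau of Labor Statistics' online calculator remains the source of record):

# Inflation adjustment as used above: scale an old figure by the ratio
# of the price index now to the price index then. The index values are
# approximate CPI-U numbers (my addition, not from the posts themselves).
CPI = {"2018-07": 252.0, "2023-02": 300.8}

def to_feb_2023_dollars(amount, from_month):
    return amount * CPI["2023-02"] / CPI[from_month]

print(f"Ant-Man 2's $189M (July 2018) ~ ${to_feb_2023_dollars(189, '2018-07'):.0f}M")
print(f"Ant-Man 2's $75.8M (July 2018) ~ ${to_feb_2023_dollars(75.8, '2018-07'):.1f}M")

The first line reproduces the $226 million figure above, the second the $90.5 million conversion in the second footnote below.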

All the same, in considering this it is worth remembering that in making and distributing Ant-Man 3 Disney/Marvel is counting not only on the North American market, but on the global market, where Marvel's appeal has long been strong, and which has redeemed many a stateside disappointment. Indeed, it is worth recalling that Ant-Man 1 and Ant-Man 2 each made twice as much overseas as their American take, a fact crucial to the franchise's continuation. Adjusted for inflation Ant-Man 2 added to its rough quarter-billion dollar North American take a half billion more dollars in those international markets (its $406 million 2018 take abroad equal to $484 million in today's terms). Were Ant-Man 3 to manage the same, one would not look on it too unfavorably.

So how has it been doing there?

Alas, not nearly so well as that. The movie has, at last check, made just $250 million--and may not have much more left to collect. After all, when Ant-Man 2 came out in the U.S. its release in many important markets (among them Britain, Japan and of course China) came only a month later--indeed, almost two months later in the case of Japan and China. By contrast Ant-Man 3 was released in pretty much every major market at the same time, so that, far from waiting to see how it plays overseas, we have already seen that--and the resulting numbers are not encouraging. (Consider the example of China. Where Ant-Man 2 picked up $120 million there in its first four weeks--the equivalent of $140 million today--Ant-Man 3 has thus far collected less than $40 million.)

Globally I simply do not see Ant-Man 3 picking up another $250 million, or even another $150 million abroad. Some are now suggesting that it will not get much past the half billion dollar mark, if that. This would make it the lowest-grossing of the Ant-Man movies in real terms, and even in current-dollar terms, with this the more problematic given the $200 million production budget. While movie budgets are, again, often a matter of dubious, loss-claiming accounting, and often defrayed by subsidy and product placement, and while movie revenues are over time supplemented by revenue streams from home video, television, merchandising, etc. that enable even flops to get "into the black" after the passage of enough time, let us take the cited figure at face value, and stick with the rule of thumb that a profit on the theatrical run requires a gross of four to five times that figure (the theatrical run being important anyway because a good theatrical run is what makes people buy your merchandise and want to pay streaming fees and the rest). According to the resulting calculation the movie needed to make something in the $800 million to $1 billion range--about what it would have made had it bettered the inflation-adjusted take of Ant-Man 2 by even ten percent or so, and at least half again what it actually seems likely to collect. Even considering the possible compensations of other funding and revenue streams, the gap between where it stands now and what it might take to "break even" yawns wide.
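The calculation itself, for those who want it spelled out (the same rule of thumb sketched in the Shazam 2 post above, applied to the cited budget):

# The rule of thumb applied to Ant-Man 3's reported figures: a
# profitable theatrical run needs a gross of four to five times the
# $200M production budget cited above.
budget = 200                          # millions, the cited figure
low, high = 4 * budget, 5 * budget
print(f"needed gross: ${low}M to ${high}M")   # -> $800M to $1000M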

What does all this add up to in the end? That Ant-Man 3's North American gross, if initially impressive, dropped sharply after its opening weekend, not only showing disappointing legs but making for a disappointing total--but only moderately so. The real problem is the overseas performance of the movie, which positively crashed compared with Ant-Man 2's performance. (Rather than making a sixth less than Ant-Man 2, the movie may make more like two-fifths less elsewhere--falling from nearly $500 million in today's terms to under $300 million.) The latter goes a much longer way to putting a big dent in the movie's earnings, and Disney's balance sheets--with, interestingly, this more important part of the story (financially, anyway) scarcely being talked about at all.

* The critics gave Ant-Man 2 an 87 percent ("Fresh") Tomatometer rating. By contrast Ant-Man 3 rated just 47 percent (with the "Top Critics" score falling even more sharply, from 83 to 39 percent).
** According to the Bureau of Labor Statistics' Consumer Price Index inflation calculator Ant-Man 2's $75.8 million in July 2018 is roughly equivalent to $90.5 million in February 2023 terms.

Friday, March 17, 2023

Planet of the Chatbots

These days to mention Planet of the Apes is generally to evoke film--with, I suppose, the listener's generation determining the version of which they think. The young may be most likely to think of the currently ongoing franchise (headed into its fourth installment with The Kingdom of the Planet of the Apes, due in a theater near you in 2024). Still, even they may remember, if only through constant reference and parody (down to a whole invented musical in The Simpsons' glorious "golden age"), the first 1968 film adaptation of Planet of the Apes, and its closing scene with Charlton Heston's Taylor pounding the sand with his fist in rage and grief in the shadow of the Ozymandias-like wreck of the Statue of Liberty, testifying to the destruction of humanity in a nuclear war.

Yet the franchise really began with a novel that had nothing to do with nuclear war, or the latterly more popular theme of genetic engineering--Pierre Boulle's 1963 novel offering rather a different "secret" to the rise of the ape-dominated order, to which the book's narrator-protagonist first catches on during a visit to the apes' stock exchange, and subsequent "remembrance of the world of finance" as he had known it back on Earth. This was that for all its apparent complexity humanity had built a civilization on the basis not of reasoning and original thinking, but of "monkey see, monkey do," such that even "monkeys" could replicate it, and eventually did.*

Considering how chatbots work, and all people seem ready to credit them with being capable of doing, it appears we may be having such a moment. Our chatbots are a very long way from that intelligence that can "parse" language, and process subtle audio and visual data, and reason from fact to conclusion. By design they are just imitators, and often not terribly convincing ones as yet--but whether the task is writing an essay for a student, or being a romantic partner (!), they seem capable of all that is required.

I will not pretend to have all the implications of this worked out. But I think that it says more about the nature of society than of "human nature." Humans have their capacity for intelligence and creativity; but society, intricately and stultifyingly hierarchical and controlling, with relatively few allowed very much in the way of choice and initiative, requires the vast majority of us to simply play a limited and limiting part, even those who perform supposedly "elite" functions. Indeed, much of what we call "education" is about conditioning the individual to accommodate themselves to this limited part (with those acquiescent to the process showered with moral praise and those less acquiescent showered with opprobrium as "immature," "irresponsible," "refusing to grow up," etc.). Putting it bluntly, society trains people to be machines, and rewards or punishes them for being more or less serviceable machines. We should therefore not be surprised when we find that a machine is a perfectly adequate replacement for a human trying to be one.

* Indeed, in the book it was the use of trained apes, not machines, that became the basis of labor-saving and "automation," such that one could see it as a variation on the "robot rebellion" theme. (And lest it need be said, the author is fully aware that apes are not monkeys, hence the quotation marks.)

Ezra Klein on AI in the New York Times: A Reaction

Earlier this week journalist Ezra Klein penned an opinion piece on artificial intelligence (AI) in the New York Times.

What the piece offers the reader was predictable enough--a hazy sense that "Big Things" are happening, bolstered by an example or two (DeepMind "building a system" that "predict[ed] the 3-D structure of tens of thousands of proteins," etc.), after which, faced with the "Singularity," we get not the enthusiastic response of the Singularitarian but the more commonplace, negative reaction. This included plenty about our inability to actually understand what's going on with our feeble, merely human minds (citing Meghan O'Gieblyn repeatedly, not least her remark that "If there were gods, they would surely be laughing their heads off at the inconsistency of our logic"), going so far as to characterize artificial intelligence research as magic ("an act of summoning. The coders casting these spells have no idea what will stumble through the portal"), and to declare that, confronted with what he sees as the menace of AI to our sense of self, we may feel compelled to retreat into the irrational ("the subjective experience of consciousness"), though we may hardly be safe that way (Mr. Klein citing a survey reporting that AI researchers themselves were fearful of the species being wiped out by a technology they do not control). Indeed, the author concludes that there are only two possible courses, either some kind of "accelerate[d] adaptation to these technologies or a collective, enforceable decision . . . made to slow the development of these technologies," while basically dismissing, as an ostrich-like burial of one's head in the sand, any skepticism toward his view of present AI research as a Lovecraftian meddling with forces we cannot understand (closing with Eric Davis' remark that "[i]n the court of the mind, skepticism makes a great grand vizier, but a lousy lord").

And so: epistemological nihilism, elevation of the irrational and anti-rational and even obscurantist (he actually talked about this as magic, folks!), misanthropy, Frankenstein complex-flavored Luddism, despair and the brush-off for any alternative view. In other words, same old same old for this take on the subject, old already in Isaac Asimov's day, so old and so pervasive that he made a career of Robot stories opposing the cliché (to regrettably little effect to go by how so many cite an Arnold Schwarzenegger B-movie from 1984 as some kind of philosophical treatise). Same old same old, too, for the discussion of the products of human reason going back to the dark, dark beginnings of the Counter-Enlightenment (on which those who find themselves "overwhelmed" have for so long been prone to fall back).

Still, it is a sign of the times that the piece ran in this "paper of record," especially at this moment in which so many of us (the recent, perhaps exaggerated, excitement over chatbots notwithstanding) feel that artificial intelligence has massively under-delivered, certainly when one goes by the predictions of a decade ago (at least, in their wildly misunderstood and misrepresented popular form, next to which chatbots slapping together scarcely readable prose does not seem so impressive). There is, too, a somewhat more original comparison Mr. Klein makes on the way to dismissing the "skeptical" view, namely between AI and cryptocurrency, raising the possibility of enthusiasm for the former replacing the enthusiasm for the latter as, in the wake of the criminality and catastrophe at FTX, investors pour their money into artificial intelligence.

Personally I have my doubts that AI will become anywhere near the darling of investors that "crypto" was. However much the press looks full of stories of startups raising capital, the plain and simple truth is that Wall Street's love affair with "tech" has not been anything close to what it was in the '90s since that time. (Indeed, investors have preferred that oldest, least tech-y commodity, real estate--hence the bubble and bust and other turmoil of this whole stagnant yet crisis-ridden century.) I do not see that behavior changing now--the more in as money is getting tighter and the economy slower.

Still, the activity in that area may be something to watch for those curious as to how far things may go, and how fast. Meanwhile I suggest that anyone really concerned for humanity's survival direct their attention not to fantasies of robot rebellion, but to the fact that such scenarios are a way of overlooking the very numerous, deadly serious and entirely real conflicts among humans of profoundly unequal power and opposed interests (the danger being not AI versus human, but human versus human with AI in the mix--above all, the way some humans may use AI against others). I suggest, too, that those concerned for human survival worry less about artificial intelligence generally than about nuclear weapons, fossil fuels, "forever chemicals," etc.--while remembering that besides the dangers posed by the technologies we have there is the danger of our failing to produce, or properly use, the technologies we need in order to cope with the problems of today.

A Note on Balzac's The Two Brothers

Reading Lucien de Rubempre's story in Balzac's Lost Illusions I felt that, Jack London's Martin Eden apart, it was the truest portrait of what the young artist faces as he embarks upon a career.

If less impressive on that score, there is nonetheless a comparable bit of truth in Balzac's other novel, The Two Brothers. There Joseph Bridau, a young artist of talent and application, is looked down upon as an addle-brained youth of no prospects because of his career choice by virtually everyone, and is a constant worry to his conventionally-minded mother--even as Joseph, because of his work as a painter and his loyal and responsible personal conduct, is her and the family's sole support (his earnings, not from a "day job" but from his painting itself, being what enable them to survive financially). The lack of appreciation for what he does, and her lack of appreciation for Joseph because he does it, is accentuated by the contrast between the way his mother looks at him and at the favored elder son, Philippe, a creature of monstrous selfishness, callousness and irresponsibility who, after a superficially "brilliant" career as a soldier in the Napoleonic Wars, brings nothing but calamity to his family, and to anyone else unfortunate enough to be associated with him--in spite of which he ever remains the favorite, the more admired, the one looked to for "great things." (All this is even more the case when they venture out to the mother's provincial home town of Issoudun.)

It is only as the end of the story draws near, and their mother approaches the end of a life constantly darkened and ultimately shortened by Philippe's betrayals, that she is induced--by her confessor--to recognize that if there has been a sin in her life it has been her not loving and appreciating Joseph. In the years after, Philippe, whose ambitions have ultimately been wrecked by his own indiscretion (and the all too contemporary-seeming financial idiocy-cum-madness which looms so large in his, and others', tales of the Paris of that era), dies in a colonial campaign and leaves what remains of his fortune and title to Joseph, who at the end of the story is not only a great success as an artist, but wealthy and literally ennobled (the painter a Comte, no less).

Considering Joseph's trajectory, which could not be more different from Lucien's, I had the impression of Balzac indulging in a bit of wish-fulfillment here. I suppose it is a reminder that even so cold-eyed a realist as Balzac--so cold-eyed as to have put many a reader off him from his own time to today (Dickens acknowledged the fact, and more recently even David Walsh allowed that "Balzac's relentless, ferocious assault on this environment and its denizens can at times be wearing")--could, occasionally, allow at least one of his characters a good outcome.

Thursday, March 16, 2023

The Academy Goes With the Popular Choice: The 49th Academy Awards, Rocky's Best Picture Win, and the Trajectory of Cinema Ever Since

At the 49th Academy Awards there were five films up for the Best Picture Oscar, namely Alan J. Pakula's All the President's Men; Hal Ashby's Bound for Glory; Martin Scorsese's Taxi Driver; Sidney Lumet's Network; and John G. Avildsen's Rocky.

The first four films were what a certain sort of ideologue would call "elitist"--tonally and technically sophisticated, with a socially critical outlook (be it Pakula's thriller-like treatment of Bob Woodward and Carl Bernstein's investigation of the Watergate scandal, Ashby's biopic about folk singer Woody Guthrie, Scorsese's vision of Times Square in a New York approaching what is still conventionally remembered as its "rock bottom," or Lumet's satire of network television centering on the rise of a right-wing populist "mad prophet of the airwaves"). By contrast Rocky was an old-fashioned sports story about an underdog who gets a shot at the top and makes the most of it, with, some thought, a rather right-wing edge (in its essentially "aspirational," "You only need willpower to be a champ" attitude, and what some saw as its racial subtext), which got it called "populist."

In what was seen then, and is seen now, as a gesture toward the broader moviegoing audience, Rocky was given the prize over those other, more conventionally Oscar-worthy (more technically accomplished, more thematically ambitious, more substantial) films. And if it is less often spoken of than, for example, Jaws or Star Wars by those historians who recount the end of the New Hollywood, it does get mentioned from time to time as a sign of where film was headed--with some justice, given the sequels. Looking back at Rocky IV, for example, one sees--in the sequel-itis, the Cold War-ism, the thoroughly "high concept" character of the product's look and feel and tone--the "New" New Hollywood clearly arrived.

All the same, there is a surprise for those who look back from the sequel, or from Rocky's reputation as a marker of the decline of the New Hollywood, to the original. If Rocky IV purely belonged to the new era, "Rocky I" did not--with, instead of mansions and press conference halls, grand arenas and training facilities out of the future, the screen saturated with the shabbiness of working-class districts amid which people lead dead-end lives, while the movie actually looks, classically, like a movie. One can even quantify the fact. Where Rocky IV has an Average Shot Length (ASL) of a mere 2.3 seconds--shrinking toward 1 second in the training montages and fight scenes--the ASL was 8.5 seconds in the original, the sequel squeezing almost four times as many shots as the original into any given space of time, and watching it you really feel it.* Where the mid-'80s follow-up was a 90-minute music video intended above all to dazzle with light and sound, with synchronization of image and soundtrack--all very Michael Bay a decade before Bay shot his first feature--the original was a movie telling a story, whatever one makes of it, about people in places that were none too dazzling, in a way much out of fashion in major feature filmmaking by the '80s, and even more so now.

* Figures on ASL from the ever-handy and much-recommended Cinemetrics web site.
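For those who like to see the arithmetic behind the "almost four times" figure, shot density is simply the reciprocal of the ASL. Here is a minimal sketch in Python, using only the Cinemetrics ASL figures cited in the note above (the function name is merely illustrative):

# Shot density is the reciprocal of the Average Shot Length (ASL).
def shots_per_minute(asl_seconds: float) -> float:
    return 60.0 / asl_seconds

original_asl = 8.5  # Rocky (1976), per Cinemetrics
sequel_asl = 2.3    # Rocky IV (1985), per Cinemetrics

print(round(shots_per_minute(original_asl), 1))  # 7.1 shots per minute
print(round(shots_per_minute(sequel_asl), 1))    # 26.1 shots per minute
print(round(original_asl / sequel_asl, 1))       # 3.7 -- "almost four times" as many shots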

Of Media Bias and Technological Reportage

When most of us speak of "media bias," I think, we have in mind the "Big" mainstream media--the national-level operations that tend toward a "centrist" perspective.

The "political economy" of that media colors its coverage of every issue, of course.

The particular issue that interests me at the moment is its attitude toward technology.

As it happens, this media seems to me utterly technophiliac when it comes to pushing the garbage that business wants to sell the consumer--a pack of claqueurs when it comes to something like, for instance, "the Internet of Things" they wanted us all to be so excited about a little while ago.

By contrast they turn into a pack of sneering technophobes when it comes to technologies that might actually solve problems--like, for instance, anything at all that might help with a problem such as climate change, from cellular agriculture to renewable energy.

This can seem contradictory, but really it isn't. It is the natural outcome of a business that lives on selling fear, not hope; that is deferential to industries and spokespersons who love to talk about "disruption" when singing the praises of "the market" but do everything possible to resist disruption in real life whenever it threatens their speculation and rent-seeking (sometimes because they actually own said media); and that, with all a centrist's fearfulness of calls for change (and perhaps also the centrist's equation of pessimism with wisdom), is committed to a narrative of apathy-inducing doomism precisely because it is apathy-inducing. Michael Mann rightly speaks of "inactivism" in this area--which seems their default mode toward everything in life, and toward technology ever making any change in it for the better.

Disentangling the Media's Biases: Center, Right and Left

In talking about "media bias" it may be helpful to discuss which part of the media we are talking about, as it is not all one thing.

Generalizing simply among national-level news outlets from the standpoint of the familiar ideological spectrum, one can speak of a "mainstream" media--that media which is most commonplace and visible and generally accepted as the baseline of discussion--of which the New York Times could be called representative. People often call it "liberal," or even "left," but it is far more accurate to call it "centrist."

Meanwhile there is a considerable media to its right, which we could identify with, for example, the Wall Street Journal or the New York Post.

There is also a body of media which can be considered genuinely "left." It is much smaller and less visible than the others, and generally shut out of a mainstream conversation that ordinarily acts--in part because it prefers to act--as if this media did not exist. (This is implicitly what happens when people call centrists, whose philosophy is actually deeply conservative and whose positions drift ever further to the right, "the left.") As an example of a news site of indisputable left credentials one might name the World Socialist Web Site, which has now and then come to mainstream attention in the wake of the controversies over political censorship-by-search-engine-algorithm and the 1619 Project. (If you've never heard of it, that just goes to prove the point about the left media's marginality.)

We do well to remember which of these we are talking about when we speak of the media and its "bias."

"Thank You For Your Support." Wait, What Support?

When considering the "aspiring" writer's lot I think time and again of Jack London's Martin Eden--the truest depiction of that unhappy state that I have ever encountered in print--and often of one particular aspect of that situation, its loneliness. This is not just a function of the fact that writing is a solitary and often isolating activity, but of the way that others treat someone going about it when they have not "made it"--specifically their lack of empathy, sympathy and respect, and their sense that they have no obligation to be tactful, or even civil, about what they really think. There is, too, what all of this means if one ever does get anywhere, as Eden does--for he does "make it," becoming rich and famous, such that the multitude of people who, knowing full well who he was, would not have looked twice at the sight of him starving to death in the street now all want to have him over for dinner, or even see him marry their daughter.

So, I suspect, does it go online--even at the level of merely giving people an absolutely cost-less "like" or "share" on social media and the platforms akin to it. A celebrity can get an outpouring of enthusiasm for uttering pure banality--while a nobody can scarcely hope for support no matter what they say, a mere click of the mouse that could mean a great deal in their position begrudged them. It is a testament to the shallowness of our online life--and the extreme stupidity of so many of those who people it. And experiencing it this way takes its toll. In Balzac's Lost Illusions Lucien de Rubempre, callow and ultimately ill-fated as he is, does realize some hard facts of life along the way to his final destruction, among them not only how scarce a sympathetic word or gesture can be for those who have not "succeeded" ("In the home circle, as in the world without, success is a necessity"), but the toll that even those who attain "success" are likely to pay before they get there ("Perhaps it is impossible to attain to success until the heart is seared and callous in every most sensitive spot").

In that there may be a tiny grain of truth in those stupid scenes of writers with supercilious looks on their faces as they autograph copies of their latest for fawning idiots--the praise cannot touch them, because if they had not long ago stopped caring what other people think they would have given it all up, perhaps as completely as Lucien does.

Wednesday, March 15, 2023

The Rubble of the '90s

Over the years I have written a lot about the '90s--in part because the period encouraged certain illusions which have proven destructive, and which have themselves increasingly appeared destroyed by contact with reality. There was the way that those singers of the information age, so long trying to persuade Americans that what was obviously deindustrialization was actually progress, almost looked credible amid the tech boom of the latter part of the decade--enough so as to profoundly distort the national dialogue about the economy for an extended and critical period. There was the way that Americans were persuaded to believe the "unipolar moment" of uncontestable full-spectrum American hyperpower might go on forever. There was the way the public was persuaded that the Internet would be an ultra-democratic cyber-utopia, as political correctness triumphed over all--so much so that, as Thomas Frank recounts, right-wing populism seemed to be assuming a new face, not only trying to persuade the people that "the market" was the common man's best friend, but appearing to embrace wokeness while declaring the market its greatest champion.

What people think of as the elite are, as I have said again and again and will go on saying again and again, careerist mediocrities who are profoundly out of touch with reality, as they show us again and again when relied upon to lead the way. Accordingly many of them may still believe in such drivel--may still, to use Paul Krugman's words, seem "stuck in some sort of time warp, in which it's always 1997"--but not all of them do, while credulity in such silliness is the weaker the further out one gets from the overprivileged little world they inhabit.

C.P. Snow's "The Two Cultures" in 2023

The essential claim of C.P. Snow's lecture-turned-essay on "The Two Cultures" of science and letters was that the division of intellectual life in the modern West between them has gone along with mutual ignorance, incomprehension and even dislike between those on either side of the split.

On first encountering C.P. Snow's essay I thought there was a good deal of truth to it, and I still do. After all, intellectual life is indeed extremely specialized, a fact underlined by how, outside their specialties, even exceptionally intelligent and well-educated people often have only the most banal thoughts to offer (as they often show when accorded that kind of Public Intellectual status in which they are given a platform from which to comment on anything and everything). And all of this seems to me to have got worse rather than better with time, as specialists' work has only become more esoteric and minute (how many scientists does it take to publish a single paper these days?), while the pursuit of an academic or research career has become a more harried thing, leaving less time or energy for "extracurriculars," which at any rate are frowned upon. (The "professions," as the root of the word suggests, are indeed priesthood-like in many ways, often living down to the unhappiest and most stultifying connotations of that term, like that disapproval of any outside interest, or indeed of any ideas but the orthodoxy imposed by the hierarchs.)

This specialization may have gone so far that those in the sciences and letters alike are doing their jobs less well than they need to--with scientists reportedly lacking sufficient grounding in philosophy to understand the scientific method on which their work depends, while, as I discovered struggling with composition textbooks and their usually clumsy explanations of so basic a matter as "What is a thesis?" (and finding reference to the scientific method and to scientific examples useful in redressing such weaknesses, in the classroom and in my own book), I have often wondered whether the writers would not have benefited from a bit of scientific study themselves.

Still, salient as the division within intellectual life remains there is also the political framework within which that life goes on. To his credit, Snow was sensitive to that. The cultural orthodoxy is a function of the priorities of those who have power--whose concern is first, last and always the bottom line, to which they recognize science as contributing. That is not to say that they actually respect the scientific method, or the work of scientists, even just enough to care about funding scientific education and research properly, let alone make use of the best available scientific advice when making policy--just that they know there is something here they can use, much more than is the case with letters.

All of this would seem to have become more rather than less consequential in the neoliberal era--amid unending austerity, with the pressure to justify everything in terms of short-term corporate profitability the greater, with a predictable result that where neoliberalism has gone, so too has the imperative to reduce education to occupational-technical training (from the Thatcher-era overhaul of educational funding, which has continued ever since, to the requirements of the "structural reform" programs foisted on developing countries). Meanwhile the politics of interest are ever more marginalized, the politics of status in the ascendant--with the resulting era of culture war seeing the humanities demonized, to the point that disdain for the work of historians and the like has right-wingers openly gloating over the prospect of the demise of humanities programs on campus.

It would be a profound mistake to imagine all this does not have consequences for how those of the sciences and letters see each other--with, perhaps, the young indicative of the trajectory. While there has, again, long been much ignorance and incomprehension on the part of those in the sciences and letters alike, I never used to notice much actual disrespect. I am less sure that this is the case today--certainly to go by what we hear of the sneering, mocking STEM major openly disdainful of the humanities, an attitude all too predictable in an era in which the war cry of "STEM! STEM! STEM!" is unceasing, and unmatched by anything the defenders of the humanities have to say on their behalf.

Will the New Chatbots Mean the End of the Novelist?

Over the years I have remarked again and again on the reports that writers' wages--and certainly the wages of writers of fiction--have long been collapsing. The upshot is that, while the chances to "make a living" writing fiction (or anything else) have always been few, the path far from straightforward, the remuneration usually paltry relative to the effort and skill one puts in--and society at large, one might add, deeply unsympathetic to the travails in question (just ask Balzac, or London)--it does not seem unreasonable to say that matters have just gone on getting worse this way.

Consider, for instance, how even in the middle years of the last century there were writers who scraped by (sometimes did better than scrape by) selling short stories to pulp fiction magazines. Any such thing seems unimaginable today--in an age where writers have it so tough that they are giving away vast amounts of quite readable content, and finding no takers. Meanwhile what could be salable seems an ever smaller category as a good many old reliable genres die (consider action-adventure fiction, for example), and while what does sell has always relied on authors' platforms and "brand names" to persuade consumers to take a look (just remember the unpleasant truths spoken by Balzac's vile Dauriat), the reliance on those supports would seem to have grown only greater and greater--while one is struck by how new platforms and new brand names are not emerging, hinting that the game is one of milking an aging and dwindling consumer base. (Just look at the paperback rack. The thriller novelists you will find there are pretty much the same ones you saw in the '90s--James Patterson, John Grisham and company, with Tom Clancy and Clive Cussler somehow still writing, to go by the bylines.)

Indeed, it seems to me safe to say that the key factor here has been the shift in the relation of supply to demand. People spend a lot less time reading fiction than they used to do, their leisure time increasingly consumed by audiovisual media. (I suspect that in this century far more people have played Tom Clancy video games than read Tom Clancy novels.) Meanwhile, with the widening of educational opportunity, the number of those who endeavor to become published authors has grown--publishing fiction, while a nearly impossible task for anyone who does not have fame or nepotism on their side, still seeming more plausible than trying to make it as a writer in the more glamorous but more closed-off film and television industry. Still, whatever one makes of the cause, the result is inarguable--and writers' troubles may well escalate if what we are hearing about chatbots bears out. Whatever writers, critics or readers may think, publishers--who are first, last and always businessmen, moving literature as they would any other good (again, I refer you to Balzac)--prefer to keep putting out the same old stuff that has had takers in the past, and it is inconceivable that they would balk at replacing writers with chatbots. For the moment, of course, they are barred from taking such a course by the limits of those bots, which are much better at producing short work than long (and, truth be told, do not produce even that very well). However, if we see the progress some expect, the bots may within a decade's time get to the point where a human would be needed only to clean up the result--and perhaps less and less clean-up at that--and publishers will not hesitate to exploit the possibility (such that the Jack Ryan and Dirk Pitt novels of ten years hence, and the Alex Cross novels too, may be churned out by some new iteration of GPT). The only question then would be whether there will still be a big enough audience for those books to make the venture profitable.
