I was actually more favorably disposed toward Indiana Jones and the Kingdom of the Crystal Skull than most--quite willing to go along with how insanely over the top it went, the bit with the refrigerator included (after all, the budgets, effects and spectacle action movies offered had been scaled up over the two additional decades since the last Indiana Jones film, and the movies had never taken themselves too seriously anyway), and to enjoy the transfer of the character to the '50s period setting, and the turn from the old religio-magical themes to the Roswell mythos. In the second hour it admittedly became formulaic, but on the whole it was competently executed--and still more fun than most big summer movies, at any rate. Moreover, in the story's taking place on the eve of a space age next to which the old world of his adventures could seem relegated to the past, and in Jones' marrying Marion, becoming a father to Mutt, and maybe even settling into a new "day job," there was some sense of the saga drawing to an end (even if he was holding onto the hat).
Still, if it was a fun enough film that does not mean it was all a fan might have hoped for after two decades--or for that matter, "necessary" somehow, the more so as the title of the third film (The Last Crusade) had previously implied the rounding off of a trilogy, with the heroes (literally) riding off into the sunset and everything best left lying as it was. And if that was the case then (even as seen by, again, someone more favorable to the film than most), there would seem reason for doubt about the prospects of the fifth film in the series, coming your way this June--especially to go by what we have seen happen with old franchises like this one, and what we have heard about this particular film. After all, while we take Indiana Jones' blockbuster status for granted there really aren't a lot of similar movies out there, are there? (Once more: period pieces are a tough sell to American audiences, with this going no less for anachronistic sci-fi stuff like steampunk, "weird Westerns," and the like.) When Raiders of the Lost Ark hit theaters it had the advantage of being a comparative novelty in a less competitive market where really big-budget action movies were fewer, while benefiting from its association with Star Wars at the peak of its cachet (George Lucas had writer and producer credits, and Harrison Ford's Han Solo was barely frozen in carbonite and hauled off by Boba Fett to Jabba's palace)--and, one might add, from a moment when audiences were relatively easier sells on period adventure than they are now.
(If the fact seems forgotten now, the '70s were a boom period for World War II stuff of this kind, with Ford himself, before fighting the Nazis as Jones, having done so as Lieutenant Colonel Mike Barnsby in the Guns of Navarone sequel Force 10 From Navarone, and as Army Air Force Lieutenant David Halloran in Hanover Street--neither a great hit, admittedly, but both testifying to that fashion, and perhaps to audiences having been prepared to see Ford in a period piece--just as, one might add, the disappointment that was his later Cowboys & Aliens can seem to suggest the opposite.)
Of course, having been such a big success back then gives the series a boost such as a freshly launched franchise would not enjoy--but Hollywood has time and again run the biggest of '80s successes into the ground, as the Rambo, Die Hard and Terminator franchises demonstrate, with Disney-era Star Wars showing that no franchise is immune. If it can happen to Star Wars, it can certainly happen to that other Lucas-Ford franchise, Indiana Jones, whose grosses were already trending downward in the '80s, with the fourth film no break from the pattern. While at the North American box office it made the most in current dollars--over $317 million, as against Raiders of the Lost Ark's $212 million, Indiana Jones and the Last Crusade's $197 million, and Indiana Jones and the Temple of Doom's $179 million--adjusting the figures for inflation (certainly if one uses the Consumer Price Index to get the values in 2022 dollars) produces a very different result*:
Raiders of the Lost Ark--$687 million
Indiana Jones and the Temple of Doom--$510 million
Indiana Jones and the Last Crusade--$468 million
Indiana Jones and the Kingdom of the Crystal Skull--$434 million
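For what it is worth, the adjustment behind these figures can be sketched in a few lines of Python. The CPI values below are approximate annual averages of the BLS CPI-U index, included purely for illustration; slightly different index choices would shift the results by a percent or so relative to the figures above.

```python
# Approximate CPI-U annual averages for the relevant release years and 2022.
CPI = {1981: 90.9, 1984: 103.9, 1989: 124.0, 2008: 215.3, 2022: 292.7}

def to_2022_dollars(nominal_millions, release_year):
    """Scale a nominal figure by the ratio of 2022 CPI to release-year CPI."""
    return nominal_millions * CPI[2022] / CPI[release_year]

domestic = {  # nominal North American grosses, millions of dollars
    "Raiders of the Lost Ark": (212, 1981),
    "Indiana Jones and the Temple of Doom": (179, 1984),
    "Indiana Jones and the Last Crusade": (197, 1989),
    "Indiana Jones and the Kingdom of the Crystal Skull": (317, 2008),
}
for title, (gross, year) in domestic.items():
    print(f"{title}: ~${to_2022_dollars(gross, year):.0f} million in 2022 dollars")
```

The same ratio method applies equally to the worldwide grosses and the budgets discussed further on.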
One may add that the pattern has held, more or less, at the international level--a fact especially worth remembering when one considers how much the opportunity to multiply the domestic gross has expanded.
Raiders of the Lost Ark--$390 million ($1.263 billion in 2022 dollars)
Indiana Jones and the Temple of Doom--$333 million ($944 million in 2022 dollars)
Indiana Jones and the Last Crusade--$474 million ($1.123 billion in 2022 dollars)
Indiana Jones and the Kingdom of the Crystal Skull--$791 million ($1.08 billion in 2022 dollars)
Only Indiana Jones and the Temple of Doom was outdone by its successor (that film's particularly dark tone may well have been a factor). It is worth considering, too, that as the grosses went down the budgets went up, in nominal, to say nothing of "real," prices.
Raiders of the Lost Ark--$18 million ($58 million in 2022 dollars)
Indiana Jones and the Temple of Doom--$28 million ($79 million in 2022 dollars)
Indiana Jones and the Last Crusade--$48 million ($114 million in 2022 dollars)
Indiana Jones and the Kingdom of the Crystal Skull--$185 million ($253 million in 2022 dollars)
If one goes by the rule of thumb that a movie released today may need a gross of four to five times its production budget in order to turn a profit in its theatrical run then, frankly, Indy 4 looks pretty marginal compared to the far cheaper Raiders--a factor that may well have played into the very long wait for the fifth film, a wait that in turn likely affected its prospects negatively. When it has been fifteen years since the last Indiana Jones film--and that one, not too well-loved by audiences, came nineteen years after the preceding film (such that its predecessor's original appearance is unremembered by anyone under forty)--one may have doubts about just how much the "nostalgia" factor so critical to making a movie like this one a hit will work in its favor. This may be more problematic still in the foreign markets where Indiana Jones films are moderately good but not great earners (internationally the last two films made about one-and-a-half times their domestic gross, rather than two or more times as much the way Marvel and Avatar movies do), while the new film will have a handicap at the outset in Indy 5's reportedly not playing in China, to absolutely no one's surprise. (Ford has publicly supported Tibetan independence, which, of course, has not pleased Beijing.)
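The rule of thumb is simple enough to spell out. A minimal sketch, using the reported budgets and worldwide grosses from above (all in millions of nominal dollars; the 4x-5x multiplier is only a rough industry heuristic, not a studio accounting method):

```python
def breakeven_range(budget_millions):
    """Rough theatrical gross needed to turn a profit: 4x to 5x the budget."""
    return 4 * budget_millions, 5 * budget_millions

films = [  # (title, reported budget, approximate worldwide gross), nominal $M
    ("Raiders of the Lost Ark", 18, 390),
    ("Kingdom of the Crystal Skull", 185, 791),
]
for title, budget, gross in films:
    low, high = breakeven_range(budget)
    print(f"{title}: needed ${low}-{high}M, took in about ${gross}M worldwide")
```

By this yardstick Raiders cleared its threshold many times over, while Crystal Skull's gross fell inside its $740-925 million range--"marginal," as the text puts it.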
So in short: the movie's being a tougher product to sell than it looks given its period character, the long downward trend in the grosses (as the budgets keep going up), and the likely waning of the nostalgia that might help it (and, internationally, the possibility of its exclusion from China) will cut against any new Indiana Jones film at the box office, even before we get to the specific movie at hand. We have here hints that James Mangold of Logan fame may do for Indiana Jones with this one what he did for the X-Men--a gamble which, divisive if ultimately successful there, seems considerably less likely to pay off here (while Mangold's assurances that Indy 5 will not play like his Wolverine movie may not be persuasive to everyone). The reports of disastrous test screenings, denied but so far as I can tell not debunked, do not augur well for it. And some may well see other creative decisions as . . . unpromising. (Few figures in the world of entertainment have had the benefit of so much claquing-until-you-are-sick-of-hearing-about-them this past decade as Phoebe Waller-Bridge, and while I do not doubt that those who, for whatever reason, responded to Killing Eve and Fleabag with unhinged, critical-reaction-to-The-Sopranos-like paroxysms of praise are delighted to see her prominently cast in the movie, those not sharing their opinions about life, the universe and everything will . . . feel differently. Moreover, while the professional "anti-woke" brigade have, as usual, made their position very clear, they are unlikely to be wholly alone in their reservations about this decision--all as one may remember that the last time Waller-Bridge was in a Lucasfilm project it was . . . Solo.)
In light of all that I am not bullish on this one. Indeed, I think Indy 5 will be the movie that reduces Indiana Jones from the tiny and ultra-exclusive club of "absolute, inarguable, classic, totally failure-proof franchise" to "just another movie franchise" (as Star Wars had become in 2017-2019), and maybe even "franchise old people see mainly out of nostalgia" (as the James Bond films have seemed to be, especially after the Waller-Bridge-co-scripted No Time to Die--yes, there she is again!), because, in contrast with all those years in which it finished at or near the top (Raiders was #1 in the U.S. in its year, Temple of Doom #3, Last Crusade and even Crystal Skull both #2, with their global standing usually at least as good), others will have clearly outdone it globally and even domestically. This will be all the more the case insofar as, given the colossal budget behind the film (we are hearing figures in the range of $300 million tossed about), even a billion dollars grossed could still leave the backers in the red at the end of the theatrical run, anxiously awaiting accompanying and subsequent revenues' adding up simply to the break-even point. (And, again, nobody goes into business to "break even!") And even if I do not expect to see it fall so low as Solo did (in today's terms, a catastrophically low $450 million), I would not be shocked by a mere $800 million or even $700 million taken in globally at the end of the run--which would put another crater-sized hole in Disney's already hole-riddled books.
* Save for the data on Indiana Jones and the Last Crusade's global box office gross (gleaned from The Hollywood Reporter), and the information on Dial of Destiny's budget, all budgetary and box office figures were derived from the Boxofficemojo.com web site.
Sunday, April 2, 2023
The Last of Us: An Unconventional Take
As the amount of media content on offer has burgeoned, so has the quantity and intensity of the claquing intended to convince us that each and every last banal product of the colossal Masscult-generating machinery that is Big Media is a work of staggering, heart-breaking genius offering an experience we will never forgive ourselves for missing out on the least little bit.
I have long since come to find this wearying--so much so that even an occasional glance at the entertainment news often proves tiring--with all this going especially for the "prestige TV" churned out by the Midcult division of the Masscult machinery, the claquing on behalf of which has made the unhinged levels of critical acclaim with which The Sopranos was presented to the world increasingly the norm. Recently this claquing has extended to The Last of Us, which seems to be the successor to The Walking Dead as a prestige TV zombie show craze (because, apparently, someone decided that this trend simply cannot be allowed to end).
Naturally I have had an eye open for critical takes (especially ones that did not fall into the standard, tiresome, "culture warriors vs. SJWs" pattern), with Ed Hightower supplying just that. As one might guess from his colleague Carlos Delgado's review of the game on which the show was based--"a thoroughly miserable experience; nasty, gratuitously violent and misanthropic . . . more of a symptom of a coarse and violent culture than a commentary upon it"--for Hightower the show "marks something of a new high-water mark for savage violence and self-assured misanthropy in the entertainment industry," a drama in which "unspeakable cruelty" of various kinds is "most of the plot," and in between acts of which cruelty the "dialogue barely rises above the level of grunts and barks." And, contrary to what those of fashionable, dark-and-gritty-loving taste may imagine, this was not meant as a compliment, the work being not only repellent but lazy, shallow and, rather than trite-but-true, trite-but-false throughout.
Sounds about right for this kind of thing.
Thursday, March 23, 2023
What are the Prospects for Controlling the Pace of Artificial Intelligence Research Through International Agreement?
A Follow-Up to "Ezra Klein on AI in the New York Times: A Reaction."
Making his case regarding the rapid pace of artificial intelligence research and its potentially dire consequences--not only in the long run but, at the current pace of such research, in the fairly near term--Ezra Klein called for an international regime for controlling research in such technology, to the end of slowing it down, in the absence of some acceleration of effective human adaptation to its results (or even alongside such adaptation).
While I do not share Mr. Klein's view on this particular matter, his premise that something he sees as a problem may be redressed is certainly healthier than the apathy-encouraging doomism that an exceedingly crass and corrupt media delights in inflicting on the public every minute of the day. Still, I have to confess that the record of international cooperation in recent decades on any issue of importance has not been . . . inspiring; that the trend in political life for the moment seems to be away from the circumstances conducive to such cooperation, rather than toward them; and that the nature of this issue may make it particularly tough to address in this manner.
Consider, for example, those problems that have most seemed to demand global cooperation--where a potentially existential danger to human civilization was pressing, and nothing less than the whole world working together would really do in meeting that danger. The two most obvious are nuclear weapons and climate change--with the failure the more pointed because of what states officially committed themselves to in the most public way. Consider the text of the 1968 Treaty on the Non-Proliferation of Nuclear Weapons. This has the undersigned declaring "their intention to achieve at the earliest possible date the cessation of the nuclear arms race and to undertake effective measures in the direction of nuclear disarmament," with this explicitly including agreeing to work toward
the cessation of the manufacture of nuclear weapons, the liquidation of all their existing stockpiles, and the elimination from national arsenals of nuclear weapons and the means of their delivery pursuant to a Treaty on general and complete disarmament under strict and effective international control.

Over a half century on, just how seriously would you say "the undersigned" took all this? How seriously would you say they are taking it now--as the bilateral arms control agreements between the U.S. and Soviet Union of the Cold War era, at best steps in the right direction, fall by the wayside with the most uncertain consequences, and acknowledged nuclear powers like Britain and China expand their arsenals, racing to field a new generation of super-weapons straight out of a comic book super-villain's laboratory? Meanwhile, the latest United Nations report on climate change making the rounds of the headlines (among much else) makes all too clear how little governments have even attempted to do relative to even the agreements they have signed, never mind what the scale of the problem would seem to demand.
Where artificial intelligence is concerned, by contrast, we would seem a much longer way from even such gestures in the direction of action as the 2015 Paris Climate Agreement--comparatively few being persuaded that any threat it poses is nearly so dire, with, not insignificantly, broad attitudes toward the issue differing greatly across cultures and nations. Significantly, it has often been remarked that in East Asia--a particularly critical region for the development of the technology--attitudes toward AI are more positive than in the West, with a Pew Research Center survey from December 2020 providing some recent statistical confirmation. (In Japan, Taiwan and South Korea, 65 percent or more of those polled said the development of AI was a "good thing" for society, and no more than 22 percent said it was a bad one--with, one might add, the Pew Center's polling in India producing similar numbers. By contrast, in the U.S. 47 percent identified it as a "good thing" and 44 percent as a bad one, with the responses in fellow G-7 members Britain, Germany and Canada similar, and respondents in France even more negative. Even those Westerners most positively disposed toward AI, the Swedes, among whom the positive/negative split was 60-24, still fell short of the poll numbers of all of the Asian countries named here.)
Were major Asian countries such as Japan and India--or a China that plausibly sees the matter similarly--to refuse to sign such an agreement, it would be effectively a dead letter. However, I suspect that we would be unlikely to get anywhere near the point of such a consensus even in the more AI-phobic West. It matters, for example, that positive views on the matter are correlated with education--suggesting that elites, with all this implies for policy, are more open to the idea of AI than the population at large. (In the U.S., for instance, 50 percent of those polled with "more education" thought AI a good thing, as against 41 percent of those with "less education.") It matters, too, that the young are more inclined to positive views, suggesting such attitudes will become more rather than less common in the years ahead. (Those below median age were more likely than those above it to think AI a good thing, by a margin of 49 to 44 percent.) The result is that increasing openness to AI, rather than the opposite, seems to be the trend in the West. (Meanwhile, the trajectory in Asia seems to parallel that in the West by at least as large a margin--in Japan the more educated were 10 points more likely than the less educated, and those below median age 17 points more likely than their elders, to think AI a good thing--suggesting an even more positive attitude there in the years ahead, with all that implies.)
I suspect it would take a great deal to change attitudes here--especially among, again, the more favorably inclined policymaking groups, the more so as governments and business presently seem to hope for so much from artificial intelligence (which hope, if experience in areas like climate change is anything to go by, will very easily outweigh very strong reservations among the broad public). Certainly in the United States a Wall Street embrace of AI such as some now speculate about would make any action seriously restricting it all but unimaginable, to go by recent political history, while, the deference of officials to such interests aside, concern for plain and simple state power will also be operative around the world. After all, we live in an age in which realpolitik is not declining but intensifying, with one dimension of this the intensifying military competition--and susceptibility to technological hype in the military sphere, with at least one world leader of note having publicly declared that the leader in artificial intelligence will be "ruler of the world." Such a statement is not to be taken lightly, the more so as he has merely said what others are doubtless thinking, not least because countries facing a situation of geopolitical disadvantage they regard as intolerable (and many do) may see in AI their best hope of "leveling the field." Meanwhile, there is the humbler matter of economic productivity, perhaps the more urgent at a time of economic stagnation at best (or, depending on who does the math, collapsing output) and mounting economic "headwinds"--with, again, East Asia worth mentioning here given its stagnant, aging, even shrinking populations. To cite simply the most dramatic example, Japan has seen its working-age population collapse over this past generation, falling from 87 million in 1994-1995 to 73 million in 2021, with no end in sight.
For a country in such straits, the prospect of AI-powered productivity coming to its economic rescue cannot be overlooked.
All that, too, will factor into the likelihood of an international AI control regime--or the opposite--and the likelihood of states abiding by it. Moreover, even were it possible to institute an agreement in relative good faith, enforcement would be extremely challenging given the character of AI research. While CNBC remarked that the technology's development was "expensive," the figure for developing and training a program like GPT-3 may have been in the vicinity of $4 million.
That's it. Just $4 million. This may mean that the average person cannot afford to finance a "startup" such as this out of pocket, however much they would like to "get in on the ground floor" here. But by the standards of high-technology corporate R & D this is nothing. Repeat, nothing. (According to a report from a few months ago some $75 billion--enough to fund nearly 20,000 GPT-3s--has been spent on self-driving technology, likely with less to show for it in the way of a useful product.) Still more is it nothing next to the resources governments have, and are willing to apply, to those ends they prioritize. (The Pentagon is requesting $130 billion for a single year's R & D.)
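The scale comparison is easy enough to verify with back-of-envelope arithmetic (all figures approximate, as reported in the sources cited above):

```python
# Back-of-envelope comparison of reported R & D spending scales.
gpt3_cost = 4e6            # ~$4 million to develop and train a GPT-3-class model
self_driving_spend = 75e9  # reported cumulative spending on self-driving tech
pentagon_rd = 130e9        # one year's reported Pentagon R & D request

print(f"Self-driving spending could fund ~{self_driving_spend / gpt3_cost:,.0f} GPT-3s")
print(f"One year of Pentagon R & D could fund ~{pentagon_rd / gpt3_cost:,.0f} GPT-3s")
```

The exact quotient for the self-driving figure is 18,750, which the text rounds to "nearly 20,000."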
When the "barrier to entry" is that low it is that much more difficult for an international regime to effectively enforce the rules. Indeed, any number of wealthy individuals--who think little of spending nine figures on grand homes and gigayachts, and many of whom are famed for their transhumanist inclinations--could easily afford to fund such research on a personal basis even where they could not do so via their command of their business empires. Moreover, a program of this kind would be relatively easy to keep secret, given the nature of computer research--in contrast with the physical facilities required for the development, production and deployment of, for example, an arsenal of nuclear missiles--with all the temptation that the possibility of discretion, as well as cheapness, affords.
Of course, it may be that no obstacle is wholly insuperable. However, separately, and still more together, these obstacles are extremely formidable--enough so to have consistently defeated aspirations to meaningful cooperation in the past. Moreover, if Mr. Klein is right the proverbial clock is ticking--with perhaps not very much time left before the question of the "opening of the portal" is mooted by the actual event.
If one takes that prospect seriously, of course.
Wednesday, March 22, 2023
If the Cinematic Blockbuster Goes, Then What?
For decades now film--especially to the extent that we identify film specifically with what people see in a theater--has seemed ever-more dominated by cinematic "blockbuster" spectacles of essentially one of two types: either "family"-oriented animated features tending toward comedy, and especially musical comedy; or action-adventure that has tended to be science fiction- or fantasy-themed, exemplified by the superhero movie. If anything, the pandemic has sharpened the tendency in this direction.
Of course, that raises the question of what will happen if these blockbuster genres go the way of, for instance, the "serious drama" or the romantic comedy--a prospect that may not be far from implausible. After all, those narratively limited forms of filmmaking are fairly well-worn now, while the particular franchises that Hollywood so favors are ever more exhausted (witness Disney's troubles with Star Wars and Marvel, and, in the wake of Warner Bros.' failures with the DC Universe, the desperation surrounding the talk of "James Gunn to the rescue!"). Meanwhile the big studios that finance such projects are in a financially battered state (with things perhaps getting worse before they get better, assuming they ever do get better). Between falling returns and the shakiness of investors' portfolios, something could give--including the readiness to put up $300 million or more for the production, promotion and distribution of a big movie.
In such an eventuality the most obvious outcome, in spite of the film studios' desire to double down on theaters (and the income from $20-a-ticket sales), would seem to me to be the continued decline of the theaters--with such venues becoming less numerous and, to go by what we are hearing of their recent business practice in these very lean years (in 2022 the box office gross, if recovering, was, even after Top Gun 2, Marvel's diminished but still not insignificant earnings from its movies, and the release of Avatar 2, still just half what it had been pre-pandemic), becoming more and more "special events" venues or general entertainment centers rather than houses principally oriented to screening new releases for a steady flow of filmgoers.* Meanwhile, with theatrical income ever less to be relied upon--and theatrical release and the publicity it justifies financially less reliable as a way of competing in the "attention economy"--one could expect the downward pressure on production budgets already seen in non-blockbusters to continue. Just what that will mean for the content of film, however, will depend on the progress of cinematic technology--and perhaps in particular the possibility of making a TV show look like an "A" picture. The more enthusiastic assessments of the small-screen Star Wars incarnations of recent years generally credit those series with delivering that, and ultimately it may be here that action-adventure of that type lives on; while, if seemingly a more remote prospect, the quality of animation on the small screen may begin to approximate that on the large.
* Together Dr. Strange 2, Thor 4 and Black Panther 2 made $1.2 billion in North America--perhaps not all that had been hoped for, but far from trivial, and indeed the kind of money that for most of 2020 and 2021 producers could only fantasize about (as when Black Widow, The Eternals and Shang-Chi, coming out in the five-month July-November 2021 period, failed to crack $600 million between the three of them). And of course, 2022 also saw Marvel take in another $200 million via Spider-Man's ticket sales from New Year's Day on.
Headwinds for the Cinematic Blockbuster?
When considering the way the movie business has become more blockbuster-centered one should be attentive not only to the way in which blockbusters crowd other kinds of fare out of the market (driving it to streaming, and usually, to obscurity on streaming), but also to the frailty of the blockbuster model itself, and what that frailty means for the industry now so reliant on it. In particular I see that frailty as leaving it susceptible to at least four "headwinds" worth discussing:
1. The Continued Decline of Theater-Going.
As of 2023 people are returning to their pre-pandemic theater-going habits. However, it is worth remembering that these habits were already under strain before the pandemic, which seems to have encouraged audiences in the tendency to not come out for anything but the biggest "tentpoles." One may add that the pandemic has meant that when they are willing to "come out" there are fewer places for them to go. Actual theater closures have been less numerous than I once thought they might be given the way ticket sales crashed during the pandemic and have since stayed depressed--the total number of theaters shrinking by less than a tenth. Still, those closures have likely taken a certain amount of custom with them (as some people have lost a favorite theater, now have to travel a longer way to get to any theater, etc., making them less easily persuaded to go), while one cannot be sure the closure wave has run its course, especially given the broader economic situation. Besides the obvious implications of downturn and rising interest rates for theaters as for all other business, there is what the situation means for the moviegoing public. In brief, between inflation and recession those who are seeing movies again may be more careful with their money, while, one should not forget, for many the trip to the theater (the pricey tickets, gas and parking, the cost of obscenely marked-up concessions, etc., as against staying home and watching streaming while eating what they have in the refrigerator) is a luxury--with all that means for the viability of the remaining theaters. All of this may portend further disruption of an ever-more fragile habit of theater-going--and of the revenue stream from $20 theater tickets that Hollywood cannot get along without in anything like its current form.
2. The Accumulating Holes in the Movie Industry's Balance Sheets.
In considering the matter of interest rates it is worth acknowledging that if this has been a problem for theaters it has probably been a bigger one for the studios. During the period of ultra-loose monetary policy central bankers made giving away money to investors a cornerstone of the growth model--and business gorged itself on the effectively free money, with the would-be streaming giants no exception as they funded vast amounts of "content" for the sake of building up a subscriber base, with relatively little thought for the limits to its profitability. Add to that the revenue collapse associated with the depression of theater-going for three years as of this month, and counting--with its pressure to borrow that much more cash. Of course, the surge of inflation reduced the "real" weight of studio debts--but the more recent rise in interest rates meant the cheap borrowing was at an end. Between that debt accumulation and the higher cost of borrowing something has to give--and indeed it has. Certainly we see that in the greater reserve toward streaming at Netflix, WBD and Disney, among others. But even as the studios, in the wake of their unhappy experiments with releasing big-budget movies to streaming on their opening weekend in theaters, affirm their commitment to theatrical release, it seems far from implausible that all this is factoring into studio thinking about the money they put up for those theatrically released features. This may not make them more willing to invest in small movies--but they are still likely to be more reserved toward big ones than they were before, with that tendency reinforced by the fact that their prospects with both audiences and investors may be shrinking.
3. The Fragmentation of the Global Film Market.
Amid the wretched, crisis-ridden performance of the neoliberal economic vision, worse than ever this past decade; the resurgence of dissent against it that has substantially taken, or been channeled into, nationalistic forms; and the deterioration of great power relations, exemplified by the fighting in Ukraine and the looming danger of war in the Far East; the world economy is "deglobalizing." It is unclear how far this process will be allowed to go in so deeply interconnected a world (even now the level of global economic integration represented by trade and investment flows has no precedent from before the twenty-first century), but cinema (first and foremost an economic enterprise to those in charge of the industry) has certainly not been untouched by it. The result has been diminished access for Hollywood to critical international markets, with all that spells for its earnings (Marvel's Phase Four being shut out of China, for instance, likely cost it its cut of hundreds of millions of dollars' worth of ticket sales and much else in that country). It has also reduced access to sources of financing for such movies. (Who would have thought that in making even Top Gun 2 Hollywood would get financing from Chinese and Russian investors?) Thus the Hollywood studios are cut off not only from ticket-buyers on whom they depend but also from important sources of capital, just as, again, rising interest rates have put the squeeze on domestic funding--and the number of safe investments for them to make dwindles.
4. The Wearing Out of the Old Franchises--and the Lack of New Ones.
In our time it is a truism that businesses all across the media spectrum have in recent decades preferred to mine old successes rather than prospect for new riches. (Consider just how old just about all those comic book superheroes are. Happy eighty-fifth birthday, Superman!) These days the tendency seems to be catching up with them as just about every one of those franchises looks old and tired--and indeed, even the genres to which they belong look old and tired. After all, if the current superhero boom can be traced back to 2000's X-Men, then we have had nearly a quarter of a century of big-budget A-list superhero features as a standard and increasingly large part of our cinematic diet--an extremely long run for any cinematic fashion, exemplified by how studios have fallen into the habit of telling and retelling and re-retelling the same stories about the same characters over and over and over again. (Thus it is not just the case that we have more X-Men movies; we have seen remakes of the origin stories of both Batman and Spider-Man, while the X-Men's Dark Phoenix saga is now a twice-told tale.) Even the claqueurs of the entertainment press can no longer pretend otherwise--while as Star Wars and Marvel and DC look ever-creakier it is far from clear that anything else can serve in their place amid all those long years of no one even trying to make anything new.
I suspect that the irony goes right over the heads of the vulgarians calling the shots. Still, limited beings that they are, they are not--in spite of the PR efforts that only the stupidest person would believe, and the suck-up writing of courtiers who would tell us these people actually care about movies, or anything else but their salaries--wholly oblivious to the fact that they have been foisting stale garbage on the public over and over and over again. Certainly to go by the revelations of the Sony hack of a decade ago the executives were well aware of what they had on their hands with the Bond film Spectre long before disappointing reviews and disappointing box office revenue confirmed it for everyone. Some of them may well be feeling the same way about their handling of more recent high concept action-adventure franchises--having caught on to the fact that these are being run into the ground the same way they ran "the one that started it all" into the ground. But even if this is so I have no expectation whatsoever of their changing tack anytime soon--just their getting tackier, as they ever more shamelessly serve up the same refuse to ever-diminishing returns, financially as well as creatively.
Tuesday, March 21, 2023
Is an Artificial Intelligence Bubble Emerging?
Back at the turn of the century I was intrigued by the flurry of predictions regarding the progress of artificial intelligence most popularly identified with Ray Kurzweil. (Indeed, all this was a significant source of inspiration for Surviving the Spike.) Of course, all of this looked wildly off the mark a decade on, and I declared AI a bust--thinking in particular that Kurzweil had simply expected too much of the neural net-based pattern recognition on which so many of his other predictions hinged (while also being overoptimistic about the progress of other relevant technologies, like the carbon nanotube-based 3-D chips that might have permitted computers three orders of magnitude faster).
Of course, in the mid-'10s there was a new burst of excitement based on progress in exactly that key area, the training of neural nets to recognize patterns. At my most optimistic I admitted to

find[ing] myself increasingly suspecting that Kurzweil was accurate enough in guessing what would happen, and how, but, due to overoptimism about the rate of improvement in the underlying technology, was off the mark in regard to when by a rough decade.

Suspecting it enough, at least, to be more watchful of a scene to which I had paid little heed for many years--with, I thought, the performance of self-driving cars a particularly good indicator of just how substantial, or insubstantial, the progress was. Whether or not an artificial intelligence could autonomously, reliably, safely navigate a large vehicle through a complex, variable, unpredictable urban environment seemed to me a meaningful test of progress in this area--and its passing the test reason to take the discussion seriously.
Alas, here we are in 2023 with, even after years of extended time on the road, such vehicles decidedly not passing that test, and not showing much sign of doing so very soon--the hype, in fact, having backfired here. Meanwhile other technologies that might have afforded the foundations of later advance have similarly disappointed (the carbon nanotube-based computer chip remaining elusive, and nothing else any more likely to provide a near-term successor to silicon). Indeed, the pandemic drove the fact home. At a moment when business strongly desired automation there was just not much help on offer from that direction, with the event in fact a reminder of how little replacement there has yet been for human beings physically showing up to their workplaces to perform innumerable tasks with their own hands.
The resulting sense of a very strong, very recent bust made it very surprising when these past few months the media started buzzing again about AI--and, given the cause of the fuss, a tribute to the unsophistication of our journalists. Rather than some breakthrough in computer chip design promising much more powerful computers with all they might do, or in the fundamental design of neural nets such as might make them radically more efficient learners (enabling them to overcome such critical "bottlenecks" to automation as "perception and manipulation" or "social intelligence"), the excitement is about "chatbots," and in particular a chatbot based less on fundamental design innovation in the underlying neural network than on simply making the network bigger for the sake of "language prediction." Indeed, those less than impressed with the chatbots term them, unflatteringly but so far as I understand it not unfairly, glorified autocomplete functions.
Of course, it may well be that a better autocomplete function will surprise us with just what it can do. We are, for example, being told that it can learn to code. Others go further and say that such chatbots may be so versatile as to "destroy" the white collar salariat in a decade's time--or even warn that artificial intelligence research has reached the Lovecraftian-horror-story point of "meddling with forces we cannot possibly comprehend."
In that there seems the kind of hype that creates bubbles, though admittedly many other factors have to be considered, not all of which fit the bubble-making pattern. After all, this has been a period of rising interest rates, tightening monetary policy and falling asset values relative to liabilities that were piled very high indeed amid cheap borrowing and economic catastrophe--hardly the sort of thing that encourages speculative euphoria. Still, as banks start failing it remains to be seen just how stalwart the central bankers will prove in their current course (one they do not seem to have ever planned, to go by their "stress tests" for the banks, and doubtless follow only with great reluctance). Moreover, there is, just as before, a very great deal of money out there that has to be parked somewhere, and still very few attractive places to put it (this is, for instance, why so much money was poured into those crappy mortgage-backed securities!). Meanwhile, even as the press remains addicted to calling anyone with a large sum of money or high corporate office a "genius" (Ken Lay was a "genius," Jeffrey Epstein was a "genius," Elizabeth Holmes was a "genius," Sam Bankman-Fried was a "genius," on and on, ad nauseam), such persons consistently prove to be the extreme opposite, with that going the more for those who so eagerly jump on the bandwagon every time. It matters here that said persons tend to be perfectly okay with the prioritization of producing high share values over the creation of an actual product long central to such scenes ("The object of Fisker, Montague, and Montague was not to make a railway to Vera Cruz, but to float a company"), expect that if worse comes to worst there will always be a greater fool, and believe that if they run into any real trouble they will have the benefit of a bailout one way or the other--usually rightly.
The result is that, even if the money reportedly beginning to pour into artificial intelligence really is a case of people getting excited over an autocomplete function--and there is, I will add, very little prospect of anything here to really compare with the insanity of the '90s dot-com bubble for the time being--I still wouldn't advise betting against there being a good deal of action at this particular table in the global casino in the coming months and years.
Of course, in the mid-'10s there was a new burst of excitement based on progress in exactly the key area, neural nets and their training to recognize patterns. At my most optimistic I admitted to
find[ing] myself increasingly suspecting that Kurzweil was accurate enough in guessing what would happen, and how, but, due to overoptimism about the rate of improvement in the underlying technology, was off the mark in regard to when by a rough decade.Suspecting it enough, at least, to be more watchful of a scene to which I had paid little heed for many years--with, I thought, how self-driving cars do a particularly good indicator of just how substantial, or insubstantial, the progress was. Whether or not an artificial intelligence could autonomously, reliably, safely navigate a large vehicle through a complex, variable, unpredictable urban environment seemed to me a meaningful test of progress in this area--and its passing the test reason to take the discussion seriously.
Alas, here we are in 2023 with, even after years of extended time on the test such vehicles decidedly not passing it, and not showing much sign of doing so very soon, with the hype, in fact, backfiring here. Meanwhile other technologies that might have afforded the foundations of later advance have similarly disappointed (the carbon nanotube-based computer chip remaining elusive, and nothing else any more likely to provide a near-term successor to silicon). Indeed, the pandemic drove the fact home. In a moment when business strongly desired automation there was just not much help on offer from that direction, with the event in fact a reminder of how little replacement there has yet been for human beings physically showing up to their workplace to perform innumerable tasks with their own hands.
The resulting sense of very strong, very recent, bust, made it very surprising when these past few months the media started buzzing again about AI--and a tribute to unsophistication on the part of our journalists given the cause of the fuss. Rather than some breakthrough in computer chip design promising much more powerful computers with all they might do, or the fundamental design of neural nets such as might make them radically more efficient learners (enabling them to overcome such critical "bottlenecks" to automation as "perception and manipulation" or "social intelligence"), the excitement is about "chatbots," and in particular a chatbot based less on fundamental design innovation in the underlying neural network than simply making the network bigger for the sake of "language prediction." Indeed, those less than impressed with the chatbots term them, unflatteringly but so far as I understand it not unfairly, glorified autocomplete functions.
Of course, it may well be that a better autocomplete function will surprise us with just what it can do. We are, for example, being told that it can learn to code. Others go further and say that such chatbots may be so versatile as to "destroy" the white collar salariat in a decade's time--or even warn that artificial intelligence research has reached the point of Lovecraftian horror story "meddling with forces we cannot possibly comprehend."
In that there seems to be the kind of hype that creates bubbles, though admittedly many other factors have to be considered, not all of which fit the bubble-making pattern. After all, this has been a period of rising interest rates, tightening monetary policy and falling asset values relative to liabilities that have come to be piled very high indeed amid cheap borrowing and economic catastrophe--hardly the sort of thing that encourages speculative euphoria. Still, as banks start failing it remains to be seen just how stalwart the central bankers will prove in their current course (on which, to go by their "stress tests" for the banks, they do not seem to have ever planned to go, and doubtless follow only with great reluctance). Moreover, there is, just as before, a very great deal of money out there that has to be parked somewhere, and still very few attractive places to put it (this is, for instance, why so much money was poured into those crappy mortgage-backed securities!), while, even as the press remains addicted to calling anyone with a large sum of money or high corporate office a "genius" (Ken Lay was a "genius," Jeffrey Epstein was a "genius," Elizabeth Holmes was a "genius," Sam Bankman-Fried was a "genius," on and on, ad nauseam), such persons consistently prove to be the extreme opposite, with that going the more for those who so eagerly jump on the bandwagon every time. It matters here that said persons tend to be perfectly okay with the prioritization of producing high share values over the creation of an actual product long central to such scenes ("The object of Fisker, Montague, and Montague was not to make a railway to Vera Cruz, but to float a company"), expect that if worst comes to worst there will always be a greater fool, and believe that if they run into any real trouble they will have the benefit of a bailout one way or the other--usually rightly.
The result is that, even if the money reportedly beginning to pour into artificial intelligence really is a case of people getting excited over an autocomplete function--and I will add, very little prospect of anything here to really compare with the insanity of the '90s dot-com bubble for the time being--I still wouldn't advise betting against there being a good deal of action at this particular table in the global casino in the coming months and years.
Sunday, March 19, 2023
Shazam! Fury of the Gods' Opening Weekend: How Did it Do?
The numbers are in for Shazam! Fury of the Gods' opening weekend--and as it happens the film's box office gross is below even the subdued earlier expectations. The movie has taken in a mere $30.5 million domestically, with the global take $65 million, and the reaction suggesting that (in contrast with, for example, Avatar 2) a not-quite-what-was-hoped-for opening weekend will not be redeemed by strong legs over the coming weeks.
In delivering the news Variety reports that the film's production budget was $110 million+, and the promotional budget (not very surprisingly given the tendency of the latter to match the former) another $100 million+, indicating over $210 million has been sunk into the movie. The result is that to make that back in its theatrical run the movie might have to take in well over $400 million, and likely more than $500 million--considerably more than the original Shazam.
By contrast the sequel seems likely to make a good deal less in the wake of the opening weekend's numbers--in the absence of better legs or international responsiveness relative to what it got at home than its predecessor, finishing up under $80 million domestically, and perhaps not even making $200 million globally.
Even allowing for the studio's having laid out less than the full sum that went into making and marketing the picture, and there being revenue besides the box office gross, Shazam 2 looks a long way from breaking even, never mind success by any other measure. Indeed, its numbers can make it seem as if Ant-Man 3 did not do so badly by comparison--though in considering any such possibility one has to remember that movie may have cost twice as much to make and market.* Additionally, where Shazam 2 can seem a matter of Warner Bros. (to use the terminology of television rather than film) "burning off" the last "episodes" of its DC Universe "show" prior to the much-ballyhooed James Gunn-led overhaul of its management of the franchise Ant-Man 3 was to be the start of Marvel's Phase Five, which Disney/Marvel desperately need to go very well after the disasters of the last three years (the pandemic's battering of the movie business generally, Marvel Phase Four's underperformance, the losses amassed in the foolish splurging on streaming, etc., etc.).
The result is that these two releases, which came just a month apart, testify to crisis in both of the great comic book film adaptation universes--and in the larger industry that has become so dependent on them.
* This weekend Ant-Man 3 took in about $4.1 million (another more than 40 percent drop) and seems to me still on track to finish up with about $220 million or less.
Saturday, March 18, 2023
Just How Well Would Shazam 2 Have to Do to Be a Winner?
What would it take for Shazam 2 (Shazam! Fury of the Gods) to be a hit?
None but the insiders can say that about this or any other film with any great precision given the opacity of film budgeting and revenue. Yes, we are often given figures for production budgets--but these both overstate and understate the expenditure. Dubious accounting can be used to either end--overstating a budget to make sure "there is no net," understating it to make sure there is net enough for bonuses to the principals. Also on the side of overstatement one may add that as a practical matter government subsidies and product placement often cover part of the cost; and on the side of understatement the bill for publicity--which in this utterly insane attention economy matches the cost of production.
My response to this is to play it safe and assume that, whatever the production budget was, twice as much was spent by somebody on making and marketing the picture.
Meanwhile there is that matter of the revenue. We hear that a movie "grossed" so much. But that is the total in ticket sales, much of which may not go to the studio. Theaters keep their share of their revenue (typically getting a higher cut the longer they run a movie in their theater), while where overseas earnings are concerned there are foreign distributors to think of--all as the details of particular deals may vary. (For instance, a studio with what everyone thinks is a sure-fire winner may demand an exceptionally large share of the opening weekend sales.) No one knows how the staggering number of not-so-little deals entailed in getting a movie screened all over the world will wind up working out until they actually do work out. Still, the studio usually gets the equivalent of 40 to 50 percent of the gross.
Of course, there are other, non-theatrical revenue streams. Like merchandising. And TV rights. But the value of these is often related to the expectations, and actuality, of its theatrical run--the merchandise for a movie no one liked not likely to sell very well. Moreover, if those revenues are often hugely important, as the straight-to-streaming experiments of the last few years demonstrated, there is little that can compete with a $20 ticket for cash-in-hand. The result is that a profitable theatrical run generally remains the standard of judgment, the movie that (these days, at least) only covers its cost after years of money trickling in from other sources understandably deemed a failure financially.
So, back to the 40-50 percent figure. According to my rule of thumb that 40 percent should equal twice the production budget--so that one can say that the global gross should be five times that budget.
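The rule of thumb above can be sketched as a one-line calculation. A minimal illustration of the post's arithmetic; the function name and the default assumptions (a 40 percent studio share, total outlay of twice the production budget) are mine, not an industry formula:

```python
# Rule-of-thumb break-even gross: total outlay is assumed to be about
# twice the production budget, and the studio is assumed to keep about
# 40 percent of the global gross. Illustrative only.

def breakeven_gross(production_budget, studio_share=0.40, outlay_multiple=2.0):
    """Global gross at which the studio's cut covers its assumed outlay."""
    total_outlay = production_budget * outlay_multiple
    return total_outlay / studio_share

# With the default assumptions the required gross is five times the budget.
print(breakeven_gross(110_000_000))  # a $110M budget implies roughly $550M
```

Under these assumptions the "gross five times the budget" figure falls right out: 2× the budget in outlay, divided by a 0.4 studio share.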
With Shazam 2 we have a movie with a budget reported as $100-$125 million. That could see the movie profitable with $400 million taken in (or even less, depending on how it was financed). However, it would seem safer to call the movie profitable only if it crosses the $600 million mark.
Does it have much chance of that? Well, Shazam took in $366 million in early 2019. This is, to go by the Consumer Price Index, about equal to $430 million at last (February 2023) check. Should the new movie merely match the original's performance then it might make it--but I suspect it would fall short of really being a success.
Meanwhile the actual expectations for the new movie's gross are . . . somewhat lower. The current Boxoffice Pro prediction for the opening weekend is $35 million. Should the film, like many a sequel to such a movie, take in forty percent of its money on opening weekend it would have a hard time grossing $100 million in North America. Should the balance between domestic and international gross approximate that of the original one would see the movie make perhaps $250 million--rather less than its predecessor globally as well as domestically. And it is far from implausible that it could do less business than that.
The result is that, while there was never a very good chance of a Shazam 3 given the big shake-up at DC, I doubt the executives at WBD will be tempted to give the franchise a third shot very soon.
Comparing and Contrasting Box Office Underperformance: Black Panther 2 vs. Ant-Man 3
When looking over Black Panther 2's box office numbers back in the autumn I argued that there was a case for regarding the film as a commercial failure--as it had made, in inflation-adjusted terms, only half what its predecessor had, while given its big budget (reported as $250 million) it may have been short of covering its cost at the end of its theatrical run.
While writing for the Internet tends to have a very short half-life generally I have seen some of my posts here go on steadily accumulating views (and even comments) for years, and others get very little attention after their initial appearance. Comments on "this weekend's" box office, to no one's surprise, tend to go in the latter category. The result was that I was surprised when that particular item had a significant uptick in views months after the fact. Perhaps this was a function of the fuss over the Oscars (which left some feeling that Angela Bassett's performance in that movie was snubbed when Jamie Lee Curtis took home the statue). However, it also seems likely that this is a matter of how the next Marvel movie, Ant-Man 3, became an object of a similar "hit or flop?" argument--a more conspicuous one.
It seems to me that one can say that both movies' performance was not all that was hoped for by their backers and fans given the expectations ordinarily attaching to Marvel releases, the particularly strong reception for the original Black Panther, the resources invested in the movies--and of course, the feeling that Marvel was overdue for a "win." However, people seemed more inclined to say this in the case of Ant-Man 3, with this, the culture wars aside, a matter of the raw numbers. Black Panther 2 took in over $400 million domestically and $800 million globally--numbers not ordinarily associated with the word "flop," with, again, my suggestion that it could be seen as a failure a matter of contrast with the extreme success of the first movie, and the extremely high cost of the second. By contrast Ant-Man 3 (which is not very much less expensive than Black Panther 2, budgeted as it is at $200 million) has, after a month, barely scored $200 million in North America, while its prospects of reaching the half billion dollar mark are in doubt. Again, those numbers are not ordinarily associated with flops, but, helped by a less-strong-than-hoped-for second weekend, more easily let the media commentariat come to that conclusion.
Ant-Man 3's Underperformance and the Trajectory of Marvel
In the view of the analysts I have read the disappointing box office performance of Ant-Man 3 was a measure of the viability not only of a relatively minor superhero movie, but of the bigger Marvel Cinematic Universe of which it is a part--with the disappointment the more significant because of how the franchise has fared in recent years. Once going from strength to strength, all the way up to the two-part confrontation of the MCU's assembled heroes with Thanos (a two-film, $5 billion-grossing event whose second part became the "highest-grossing film in history"), it has been very different with the subsequent "Phase Four." Certainly its problems (and particularly those of its first three movies--Black Widow, The Eternals, and Shang-Chi and the Legend of the Ten Rings) have been partially attributable to the pandemic, and at the international level, also to Phase Four's shutout from China's market. However, it is undeniable that the content itself has been an issue, with anything after Phase Three's build-up to and realization of the battle with Thanos an anticlimax--no "going bigger" really plausible here--while there was a pervasive sense of the pursuit of diminishing returns, be it in the turn toward prequels with Black Widow, to more obscure characters in The Eternals, or the all too characteristic weariness and tonal confusion of a series gone on too long in Thor: Love and Thunder, even before one considers such awkwardness as the excessive eagerness of Disney to tie its feature films to its small-screen streaming series in Dr. Strange 2, or the presentation of a Black Panther sequel without . . . Black Panther. The result was that only the Spider-Man sequel, which was again a one-of-a-kind event in its drawing together the stuff of three separate big-screen Spider-Man series into one sprawling "multiverse" narrative, really felt like an event.
And unsurprisingly, only that one performed really "above and beyond" at the box office (in fact becoming the movie that, with almost $2 billion banked globally, proved "filmgoing is back").
The result has been that even the claqueurs of the entertainment press no longer applaud Marvel so loudly as they once did, instead admitting that all is not well. One could see this as setting up audiences for a story of fall and redemption ("Hooray for Phase Five! Hooray for the return of Bob Iger! And boo Bob Chapek and Kevin Feige! Boo!"). However, Ant-Man 3 never looked to me a particularly likely source of redemption (even if it succeeded it was a long shot for the kind of $2 billion+ success that would really make people say "Marvel is back!"), and, again, it has only contributed to the sense of a franchise in trouble.
Of course, one should also qualify that. Before the (justified) turn of the view of Ant-Man 3 as hit into a view of it as a flop the film did have that strong opening weekend, testifying to the continued readiness of a significant audience to come out for even a second-string Marvel movie. The drop-off after that first weekend suggests that they were less than impressed with what they saw--and that rather than people giving up on Marvel, Marvel is simply doing a poorer job of satisfying the expectations it raises, such that they will in time be less likely to come out that way once more. After all, as we have been reminded time and again, no franchise is bulletproof, with Star Wars now reminding us of the fact--and Marvel, which in its dominance of blockbuster filmmaking for so many years looked like Star Wars in its heyday, perhaps now beginning to remind us of Star Wars in its now conspicuous and advanced decline.
Is Ant-Man 3 a Flop? Crunching the Numbers
When Ant-Man and the Wasp: Quantumania came out on President's Day it looked like a hit, with its $106 million take in the first three days and the $120 million it picked up over the four-day weekend at the North American box office--a feat that seemed the more impressive in as Ant-Man has always been a "second-string" character. (Ant-Man is not Spider-Man, after all. Or Iron Man or Black Panther or Captain America or Captain Marvel or Thor or even Dr. Strange . . .) Still, a couple of weekends later the view of the film as "hit" (commercially, at least; the critics liked it a lot less than its predecessor) gave way to a view of the film as "flop," not only in the eyes of the professional Marvel-bashers (e.g. those who hate Disney's "woke" turn and are eager for evidence of the audience rejecting it), but even quite mainstream observers ordinarily well-disposed to the brand.*
Of course, those discussing these matters for a broad audience rarely crunch the numbers in any rigorous way, at best citing a figure or two, but the truth is that this is easy enough to do on the basis of conveniently available data--like the stats at Box Office Mojo--with an obvious place to start being what prompted that change of opinion, namely the sharp drop in gross between the first weekend and the second. That drop was indeed dramatic, some 70 percent between the first and second Friday-to-Sunday periods. Still, one should put it into perspective. Ant-Man 3 was a "threequel," and not some long-awaited follow-up to a record-breaking "New Classic," but the not-so-long-ago latest film in one of the weaker component franchises of the exceedingly prolific Marvel Cinematic Universe; with that movie also debuting on a four-day holiday weekend. One expects such a film's take to be very front-loaded indeed--and the gap with the second weekend to show it--as seen with the preceding Ant-Man film, which had a 62 percent drop between the first and second weekends. The 70 percent drop Ant-Man 3 had does not seem very much worse in the circumstances (and even less so when compared with the franchise's other post-pandemic releases).**
Still, if the significance of the drop can be exaggerated one ought not to rush to dismiss it either. The days and weeks that followed bore out the impression of a faster fade for Ant-Man 3, with the result that 28 days on the film, while running ahead of Ant-Man 2 in current dollar terms ($202 million to $189 million), was about a ninth behind it in inflation-adjusted, real dollar terms (the prior film's $189 million more like $226 million), with the decline the more striking as Ant-Man 2 was, again, not too leggy itself. This is still not a disaster, but no great success either, given what it bodes for the final take. Ant-Man 2 finished with about $217 million, or $259 million adjusted for inflation. Standing at $202 million as of last Thursday, with the movie unlikely to make so much as $5 million over this weekend, it is all but certain to finish well short of Ant-Man 2's inflation-adjusted gross, doing well to finish up in the vicinity of $220 million.
All the same, in considering this it is worth remembering that in making and distributing Ant-Man 3 Disney/Marvel is not only counting on the North American market, but the global market, where Marvel's appeal has long been strong, and which has redeemed many a stateside disappointment. Indeed, it is worth recalling that Ant-Man 1 and Ant-Man 2 each made twice as much as their American take overseas, a fact crucial to the franchise's continuation. Adjusted for inflation Ant-Man 2 added to its rough quarter-billion dollar North American take a half billion more dollars in those international markets (its $406 million 2018 take abroad equal to $484 million in today's terms). Were Ant-Man 3 to manage the same one would not look on it too unfavorably.
So how has it been doing there?
Alas, not nearly so well as that. The movie has, at last check, made just $250 million abroad--and may not have much more left to collect. After all, when Ant-Man 2 came out in the U.S. its release in many important markets (among them Britain, Japan and of course China) came only a month later (indeed, almost two months later in the case of Japan and China). By contrast Ant-Man 3 was released in pretty much every major market at the same time, so far from waiting to see how it plays overseas we have already seen that, with the resulting numbers not encouraging. (Consider the example of China. Where Ant-Man 2 picked up $120 million there in its first four weeks--the equivalent of $140 million today--Ant-Man 3 has thus far collected less than $40 million.)
Globally I simply do not see Ant-Man 3 picking up another $250 million, or even $150 million abroad. Some are now suggesting that it will not get much past the half billion dollar mark, if that. This would make it the lowest-grossing of the Ant-Man movies in real terms, and even in current-dollar terms, with this the more problematic given the $200 million production budget. While movie budgets are, again, often a matter of dubious, loss-claiming accounting, and often defrayed by subsidy and product placement, and while movie revenues are over time supplemented by revenue streams from home video, television, merchandising, etc. that enable even flops to get "into the black" after the passage of enough time, let us take the cited figure at face value, and stick with the rule of thumb that a profit on the theatrical run requires a gross of four to five times that figure (the theatrical run anyway being important because a good run is what makes people buy the merchandise and want to pay the streaming fees and the rest). According to the resulting calculation the movie needed to make something in the $800 million to $1 billion range--about what it would have made had it bettered the inflation-adjusted take of Ant-Man 2 by even ten percent or so, and at least half again what it actually seems likely to collect. Even considering the possible compensations of other funding and revenue streams, the gap between where it stands now and what it might take to "break even" is yawning.
What does all this add up to in the end? That Ant-Man 3's North American gross, if initially impressive, dropped sharply after its opening weekend, not only showing disappointing legs but making for a disappointing total--but only moderately so. The real problem is the overseas performance of the movie, which positively crashed compared with Ant-Man 2's performance. (Rather than making a sixth less than Ant-Man 2, as at home, the movie may make more like two-fifths less elsewhere--falling from nearly $500 million in today's terms to under $300 million.) The latter goes a much longer way to putting a big dent in the movie's earnings, and Disney's balance sheets--with, interestingly, this more important part of the story (financially, anyway) scarcely being talked about at all.
* The critics gave Ant-Man 2 an 87 percent ("Fresh") Tomatometer rating. By contrast Ant-Man 3 rated just 47 percent (with the "Top Critics" score falling even more sharply, from 83 to 39 percent).
** According to the Bureau of Labor Statistics' Consumer Price Index inflation calculator Ant-Man 2's $75.8 million in July 2018 is roughly equivalent to $90.5 million in February 2023 terms.
Of course, those discussing these matters for a broad audience rarely crunch the numbers in any rigorous way, at best citing a figure or two, but the truth is that this is easy enough to do on the basis of conveniently available data--like the stats at Box Office Mojo--with an obvious place to start being what prompted that change of opinion, namely the sharp drop in gross between the first weekend and the second. That drop was indeed dramatic, some 70 percent between the first and second Friday-to-Sunday periods. Still, one should put it into perspective. Ant-Man 3 was a "threequel"--not a long-awaited follow-up to some record-breaking "New Classic," but the successor to the not-so-long-ago last film in one of the weaker component franchises of the exceedingly prolific Marvel Cinematic Universe--and one debuting on a four-day holiday weekend at that. One expects such a film's take to be very front-loaded indeed--and the gap with the second weekend to show it--as seen with the preceding Ant-Man film, which had a 62 percent drop between the first and second weekends. The 70 percent drop Ant-Man 3 had does not seem very much worse in the circumstances (and even less so when compared with the franchise's other post-pandemic releases).**
Still, if the significance of the drop can be exaggerated, one ought not rush to dismiss it either. The days and weeks that followed bore out the impression of a faster fade for Ant-Man 3, with the result that 28 days on the movie, while running ahead of Ant-Man 2 in current-dollar terms ($202 million to $189 million), was about a ninth behind it in inflation-adjusted, real-dollar terms (the prior film's $189 million more like $226 million today)--the decline the more striking as Ant-Man 2 was, again, not too leggy itself. This is still not a disaster, but no great success either, given what it bodes for the final take. Ant-Man 2 finished with about $217 million, or $259 million adjusted for inflation. Standing at $202 million as of last Thursday, and unlikely to make so much as $5 million over this weekend, the movie is all but certain to finish well short of Ant-Man 2's inflation-adjusted gross, doing well to finish up in the vicinity of $220 million.
All the same, in considering this it is worth remembering that in making and distributing Ant-Man 3 Disney/Marvel is counting not only on the North American market but on the global market, where Marvel's appeal has long been strong, and which has redeemed many a stateside disappointment. Indeed, it is worth recalling that Ant-Man 1 and Ant-Man 2 each made twice as much overseas as they did in North America, a fact crucial to the franchise's continuation. Adjusted for inflation Ant-Man 2 added to its rough quarter-billion dollar North American take a half billion more dollars in those international markets (its $406 million 2018 take abroad equal to $484 million in today's terms). Were Ant-Man 3 to manage the same one would not look on it too unfavorably.
So how has it been doing there?
Alas, not nearly so well as that. The movie has, at last check, made just $250 million--and may not have much more left to collect. After all, when Ant-Man 2 came out in the U.S. its release in many important markets (among them Britain, Japan and of course China) came only a month later (indeed, almost two months later in the case of Japan and China). By contrast Ant-Man 3 was released in pretty much every major market at the same time, so that far from waiting to see how it plays overseas, we have already seen that, and the resulting numbers are not encouraging. (Consider the example of China. Where Ant-Man 2 picked up $120 million there in its first four weeks--the equivalent of $140 million today--Ant-Man 3 has thus far collected less than $40 million.)
Globally I simply do not see Ant-Man 3 picking up another $250 million abroad--or even another $150 million. Some are now suggesting that it will not get much past the half billion dollar mark worldwide, if that. This would make it the lowest-grossing of the Ant-Man movies in real terms, and even in current-dollar terms, which is the more problematic given the $200 million production budget. Movie budgets are, again, often a matter of dubious, loss-claiming accounting, and often defrayed by subsidy and product placement, while movie revenues are over time supplemented by streams from home video, television, merchandising, etc. that enable even flops to get "into the black" after the passage of enough time. Still, let us take the cited figure at face value, and stick with the rule of thumb that a profit on the theatrical run requires a gross of four to five times the production budget (the theatrical run mattering anyway, because a good one is what makes people buy the merchandise, pay the streaming fees and the rest). By that calculation the movie needed to make something in the $800 million to $1 billion range--about what it would have made had it bettered the inflation-adjusted take of Ant-Man 2 by even ten percent or so, and at least half again what it actually seems likely to collect. Even considering the possible compensations of other funding and revenue streams, the gap between where it stands now and what it might take to "break even" is a yawning one.
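For readers who like to check the arithmetic, that rule-of-thumb calculation can be sketched in a few lines of Python. The budget and the four-to-five-times multiplier are as cited above; the projected worldwide total is my own round-number assumption for illustration, not a reported figure:

```python
# Back-of-the-envelope "break-even" math, as a sketch.
# The 4x-5x rule of thumb and $200M budget are as cited in the text;
# the projected worldwide gross is an assumed round number.

def breakeven_range(budget, low_mult=4, high_mult=5):
    """Theatrical gross needed to turn a profit, per the rule of thumb."""
    return budget * low_mult, budget * high_mult

budget = 200e6            # reported production budget
projected_gross = 550e6   # roughly the "half billion dollar mark"

low, high = breakeven_range(budget)
shortfall = low - projected_gross

print(f"Needed: ${low / 1e6:.0f}M to ${high / 1e6:.0f}M")
print(f"Projected: ${projected_gross / 1e6:.0f}M "
      f"(short of the low end by ${shortfall / 1e6:.0f}M)")
```

On these assumptions the low end of the break-even range alone sits a quarter-billion dollars above the projected gross, which is the "yawning" gap described above.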
What does all this add up to in the end? That Ant-Man 3's North American gross, if initially impressive, dropped sharply after its opening weekend, not only showing disappointing legs but making for a disappointing total--though only moderately so. The real problem is the overseas performance of the movie, which positively crashed compared with Ant-Man 2's. (Rather than making a sixth less than Ant-Man 2 elsewhere, the movie may make more like two-fifths less--falling from nearly $500 million in today's terms to under $300 million.) The latter goes a much longer way toward putting a big dent in the movie's earnings, and Disney's balance sheets--with, interestingly, this more important part of the story (financially, anyway) scarcely being talked about at all.
* The critics gave Ant-Man 2 an 87 percent ("Fresh") Tomatometer rating. By contrast Ant-Man 3 rated just 47 percent (with the "Top Critics" score falling even more sharply, from 83 to 39 percent).
** According to the Bureau of Labor Statistics' Consumer Price Index inflation calculator Ant-Man 2's $75.8 million in July 2018 is roughly equivalent to $90.5 million in February 2023 terms.
Friday, March 17, 2023
Planet of the Chatbots
These days to mention Planet of the Apes is generally to evoke film--with, I suppose, the listener's generation determining the version of which they think. The young may be most likely to think of the currently ongoing franchise (headed into its fourth installment with The Kingdom of the Planet of the Apes coming to a theater near you in 2024). Still, even they may remember, if only through constant reference and parody (down to a whole invented musical in The Simpsons' glorious "golden age"), the first film adaptation of Planet of the Apes, from 1968, and its closing scene with Charlton Heston's Taylor pounding the sand with his fist in rage and grief in the shadow of the Ozymandias-like wreck of the Statue of Liberty, testifying to the destruction of humanity in a nuclear war.
Yet the franchise really began with a novel that had nothing to do with nuclear war, or the latterly more popular theme of genetic engineering--Pierre Boulle's 1963 novel offering rather a different "secret" to the rise of the ape-dominated order, to which the book's narrator-protagonist first catches on during a visit to the apes' stock exchange, and subsequent "remembrance of the world of finance" as he had known it back on Earth. This was that for all its apparent complexity humanity had built a civilization on the basis not of reasoning and original thinking, but of "monkey see, monkey do," such that even "monkeys" could replicate it, and eventually did.*
Considering how chatbots work, and all that people seem ready to credit them with being capable of doing, it appears we may be having such a moment. Our chatbots are a very long way from the intelligence that can "parse" language, process subtle audio and visual data, and reason from fact to conclusion. By design they are just imitators, and often not terribly convincing ones as yet--but whether the task is writing an essay for a student or being a romantic partner (!), they seem capable of all that is required.
I will not pretend to have all the implications of this worked out. But I think it says more about the nature of society than of "human nature." Humans have their capacity for intelligence and creativity; but society, intricately and stultifyingly hierarchical and controlling, with relatively few allowed very much in the way of choice and initiative, requires the vast majority of us to simply play a limited and limiting part--even those who perform supposedly "elite" functions. Indeed, much of what we call "education" is about conditioning the individual to accommodate themselves to this limited part (with those acquiescent to the process showered with moral praise, and those less acquiescent showered with opprobrium as "immature," "irresponsible," "refusing to grow up," etc.). Putting it bluntly, society trains people to be machines, and rewards or punishes them for being more or less serviceable machines. We should therefore not be surprised to find that a machine is a perfectly adequate replacement for a human trying to be one.
* Indeed, in the book it was the use of trained apes, not machines, that became the basis of labor-saving and "automation," such that one could see it as a variation on the "robot rebellion" theme. (And lest it need be said, the author is fully aware that apes are not monkeys, hence the quotation marks.)
Ezra Klein on AI in the New York Times: A Reaction
Earlier this week journalist Ezra Klein penned an opinion piece on artificial intelligence (AI) in the New York Times.
What the piece offered the reader was predictable enough--a hazy sense that "Big Things" are happening, bolstered by an example or two (DeepMind "building a system" that "predict[ed] the 3-D structure of tens of thousands of proteins," etc.), after which, faced with the "Singularity," we get not the enthusiastic response of the Singularitarian but the more commonplace, negative reaction. This included plenty about our inability to actually understand what is going on with our feeble, merely human minds (citing Meghan O'Gieblyn repeatedly, not least her remark that "If there were gods, they would surely be laughing their heads off at the inconsistency of our logic"). It went so far as to characterize artificial intelligence research as magic ("an act of summoning. The coders casting these spells have no idea what will stumble through the portal"), and to declare that, confronted with what he sees as the menace of AI to our sense of self, we may feel compelled to retreat into the irrational ("the subjective experience of consciousness")--though we may hardly be safe that way (Mr. Klein citing a survey reporting that AI researchers themselves were fearful of the species being wiped out by a technology they do not control). Indeed, the author concludes that there are only two possible courses, either some kind of "accelerate[d] adaptation to these technologies or a collective, enforceable decision . . . made to slow the development of these technologies," while basically dismissing any skepticism toward his view of present AI research--as a meddling with forces we cannot understand, out of Lovecraftian horror--as an ostrich-like burial of one's head in the sand (closing with Erik Davis' remark that "[i]n the court of the mind, skepticism makes a great grand vizier, but a lousy lord").
And so: epistemological nihilism, elevation of the irrational and anti-rational and even obscurantist (he actually talked about this as magic, folks!), misanthropy, Frankenstein complex-flavored Luddism, despair and the brush-off for any alternative view. In other words, same old same old for this take on the subject, old already in Isaac Asimov's day, so old and so pervasive that he made a career of Robot stories opposing the cliché (to regrettably little effect to go by how so many cite an Arnold Schwarzenegger B-movie from 1984 as some kind of philosophical treatise). Same old same old, too, for the discussion of the products of human reason going back to the dark, dark beginnings of the Counter-Enlightenment (on which those who find themselves "overwhelmed" have for so long been prone to fall back).
Still, it is a sign of the times that the piece ran in this "paper of record," especially at this moment in which so many of us (the recent, perhaps exaggerated, excitement over chatbots notwithstanding) feel that artificial intelligence has massively under-delivered, certainly when one goes by the predictions of a decade ago (at least, in their wildly misunderstood and misrepresented popular form, next to which chatbots slapping together scarcely readable prose does not seem so impressive). There is, too, a somewhat more original comparison Mr. Klein makes on the way to dismissing the "skeptical" view, namely between AI and cryptocurrency, raising the possibility of enthusiasm for the former replacing the enthusiasm for the latter as, in the wake of the criminality and catastrophe at FTX, investors pour their money into artificial intelligence.
Personally I have my doubts that AI will become anywhere near the darling of investors that "crypto" was. However much the press looks full of stories of startups raising capital, the plain and simple truth is that Wall Street's love affair with "tech" has not been anything close to what it was in the '90s since that time. (Indeed, investors have preferred that oldest, least tech-y commodity, real estate--hence the bubble and bust and other turmoil of this whole stagnant yet crisis-ridden century.) I do not see that behavior changing now--the more in as money is getting tighter and the economy slower.
Still, the activity in that area may be something to watch for those curious as to how far things may go, and how fast. Meanwhile I suggest that anyone really concerned for humanity's survival direct their attention not to fantasies of robot rebellion, but to the fact that such scenarios seem a way of overlooking the very numerous, deadly serious and entirely real conflicts among humans of profoundly unequal power and opposed interest (not AI vs. human, but human vs. human with AI in the mix, the more probable danger being the way some humans may use AI against others). I suggest, too, that those concerned for human survival worry less about artificial intelligences generally than about nuclear weapons, fossil fuels, "forever chemicals," etc.--while remembering that besides the dangers caused by the technologies we have, there is the danger of our failing to produce, or properly use, the technologies we need in order to cope with the problems of today.
A Note on Balzac's The Two Brothers
Reading Lucien de Rubempre's story in Balzac's Lost Illusions I felt that, Jack London's Martin Eden apart, it was the truest portrait of what the young artist faces as he embarks upon a career.
If less impressive on that score, there is nonetheless a comparable bit of that truth in another of Balzac's novels, The Two Brothers. There Joseph Bridau, a young artist of talent and application, is looked down upon by virtually everyone as an addle-brained youth of no prospects because of his career choice, and is a constant worry to his conventionally-minded mother--even as Joseph, through his work as a painter and his loyal and responsible personal conduct, is her and the family's sole support (his earnings, not from a "day job" but from his painting itself, being what enable them to survive financially). The lack of appreciation for what he does, and her lack of appreciation for Joseph because he does it, is accentuated by the contrast between the way his mother looks at him and at the favored elder son, Philippe--a creature of monstrous selfishness, callousness and irresponsibility who, after a superficially "brilliant" career as a soldier in the Napoleonic Wars, brings nothing but calamity to his family and to anyone else unfortunate enough to be associated with him, in spite of which he ever remains the favorite, the more admired of the two and the one looked to for "great things." (All this is even more the case when they venture out to the mother's provincial home town of Issoudun.)
It is only as the end of the story draws near, and their mother approaches the end of a life constantly darkened and ultimately shortened by Philippe's betrayals, that she is induced--by her confessor--to recognize that if there has been a sin in her life it has been her not loving and appreciating Joseph. In the years after, Philippe, whose ambitions have ultimately been wrecked by his own indiscretion (and the all too contemporary-seeming financial idiocy-cum-madness which looms so large in his, and others', tales of the Paris of that era), dies in a colonial campaign, leaving what remains of his fortune and title to Joseph, who at the end of the story is not only a great success as an artist, but wealthy and literally ennobled (the painter a Comte, no less).
Considering Joseph's trajectory, which could not be more different from Lucien's, I had an impression of Balzac indulging in wish-fulfillment here. I suppose it is a reminder that even so cold-eyed a realist as Balzac--so much so as to have put many a reader off him from his own time to today (Dickens acknowledged the fact, and more recently even David Walsh allowed that "Balzac's relentless, ferocious assault on this environment and its denizens can at times be wearing")--could, occasionally, allow at least one of his characters a good outcome.
Thursday, March 16, 2023
The Academy Goes With the Popular Choice: The 49th Academy Awards, Rocky's Best Picture Win, and the Trajectory of Cinema Ever Since
At the 49th Academy Awards there were five films up for the Best Picture Oscar, namely Alan J. Pakula's All the President's Men; Hal Ashby's Bound for Glory; Martin Scorsese's Taxi Driver; Sidney Lumet's Network; and John G. Avildsen's Rocky.
The first four films were what a certain sort of ideologue would call "elitist"--tonally and technically sophisticated, with a socially critical outlook (be it Pakula's thriller-like treatment of Bob Woodward and Carl Bernstein's investigation of the Watergate scandal, Ashby's biopic about folk singer Woody Guthrie, Scorsese's vision of Times Square in a New York approaching what is still conventionally remembered as its "rock bottom," or Lumet's satire of network television centering on the rise of a right-wing populist "mad prophet of the airwaves"). By contrast Rocky was an old-fashioned sports story about an underdog who gets a shot at the top and makes the most of it, with, some thought, a rather right-wing edge (in its essentially "aspirational," "You only need willpower to be a champ" attitude, and what some saw as its racial subtext), which got it called "populist."
In what was seen then, and is still seen now, as a gesture toward the broader moviegoing audience, Rocky was given the prize over those other, more conventionally Oscar-worthy (more technically accomplished, more thematically ambitious, more substantial) films. And if it is less often spoken of than, for example, Jaws or Star Wars by those historians who recount the end of the New Hollywood, it does get mentioned from time to time as a sign of where film was headed--with some justice, given the sequels. Looking back at Rocky IV, for example, one sees in the sequel-itis, the Cold War-ism, and the thoroughly "high concept" character of the product's look and feel and tone the "New" New Hollywood clearly arrived.
All the same, there is a surprise for those who look back from the sequel, or from Rocky's reputation as a marker of the decline of the New Hollywood, to the original. If Rocky IV purely belonged to the new era, "Rocky I" did not--with, instead of mansions and press conference halls, grand arenas and training facilities out of the future, the screen saturated with the shabbiness of working-class districts amid which people lead dead-end lives, while the movie actually looks, classically, like a movie. One can even quantify the fact. Where Rocky IV has an Average Shot Length (ASL) of a mere 2.3 seconds--shrinking toward 1 second in the training montages and fight scenes--the ASL was 8.5 seconds in the original, the sequel squeezing almost four times as many shots as the original into any given space of time, and watching it you really feel it.* Where the mid-'80s follow-up was a 90-minute music video intended above all to dazzle with light and sound, with synchronization of image and soundtrack--all very Michael Bay a decade before Bay shot his first feature--the original was a movie telling a story, whatever one makes of it, about people in places that were none too dazzling, in a way much out of fashion in major feature film by the '80s, and even more so now.
* Figures on ASL from the ever-handy and much-recommended Cinemetrics web site.
The first four films were what a certain sort of ideologue would call "elitist"--tonally and technically sophisticated, with a socially critical outlook (be it Pakula's thriller-like treatment of Bob Woodward and Carl Bernstein's investigation of the Watergate scandal, Ashby's biopic about folk singer Woody Guthrie, Scorsese's vision of Times Square in a New York approaching what is still conventionally remembered as its "rock bottom," or Lumet's satire of network television centering on the rise of a right-wing populist "mad prophet of the airwaves"). By contrast Rocky was an old-fashioned sports story about an underdog who gets a shot at the top and makes the most of it, with, some thought, a rather right-wing edge (in its essentially "aspirational," "You only need willpower to be a champ" attitude, and what some saw as its racial subtext), which got it called "populist."
In what was seen, then and now, as a gesture toward the broader moviegoing audience, Rocky was given the prize over those other, more conventionally Oscar-worthy (more technically accomplished, more thematically ambitious, more substantial) films. And if it is less often spoken of than, for example, Jaws or Star Wars by those historians who recount the end of the New Hollywood, it does get mentioned from time to time as a sign of where film was headed, with some justice given the sequels. Looking back at Rocky IV, for example, one sees in the sequel-itis, the Cold War-ism, the thoroughly "high concept" character of the product's look and feel and tone, the "New" New Hollywood clearly arrived.