Recently, reading what others have had to say about the prospects of Indiana Jones 5, I was struck by one possibility some have raised--that Indiana Jones and the Dial of Destiny could, like Top Gun: Maverick, be a spectacular overperformer at the box office: another follow-up to a "feel-good" '80s-era action blockbuster, with a star of a kind they don't make anymore in the lead and plenty of nostalgic appeal, that through all this really connects with a moviegoing public which comes back to experience it over and over again.
I can see where they are coming from on this--but I am skeptical that Top Gun 2 is a good analog for Indiana Jones 5, for five reasons, three of which (numbers 1, 4 and 5) were pretty much ignored in the commentary I read about last year's hit:
1. The memory of the original Top Gun was, in 2022, undimmed by any exploitation of the property after the original film. By contrast Raiders of the Lost Ark has had three sequels, with both the third and fourth films giving the impression of being "the last one" (Indiana and company literally riding off into the sunset at the end of the "last" crusade; and then, in a movie redolent with the sense that Jones' time as adventurer is past, the hero an assistant dean, married man and father)--and thus that anything further would not only go over the same ground yet again (as Crystal Skull seemed to many to do), but be merely another go-round for the sake of a crass cash grab. It did not help that number four was received as less than a triumph (the film's audience score on Rotten Tomatoes a mere 53 percent). It also does not help that Disney's handling of Star Wars--by far the closest point of comparison to Indiana Jones 5, the only really comparable franchise, likewise a Lucas-Ford show--has disappointed so many under the same exact management, which may be handling Jones the same way, the more so in that some of the casting decisions are weirdly similar to those of the divisive Star Wars movies (with Mads Mikkelsen again a man of uncertain loyalties working on a critical space project for a government at war, and Phoebe Waller-Bridge once more a "sidekick" to "Han Solo"). Top Gun 2 had no such baggage.
2. While much of the cultural, "zeitgeist"-type analysis of Top Gun 2 was simplistic (in line with the movie itself, frankly), there may be something to the argument that those who came out for it responded to the film's clear-cut good-guys-vs.-bad-guys premise and upbeat, rah-rah spirit. As recently described, the James Mangold-helmed Indiana Jones 5 is expected to go in the opposite direction (if Logan is anything to go by, the extreme opposite direction), such that the same kind of satisfaction will simply not be on offer--indeed, anyone looking for it is likely to come away feeling cheated, hardly the response Top Gun 2 elicited.
3. Just as one has to be cautious with talk of films and the zeitgeist, one has to be cautious about the ways in which movie grosses interact with the country's culture wars--a very complex subject ill-served by the simplistic narratives constantly pushed on us--and I have seen lots of rhetoric but little hard data regarding Top Gun 2 specifically. (I have heard, for example, that like Taken the movie really exploded in the "Red" states and not so much the "Blue," but those making the claim had no hard data, or even industry insider comment, to offer in support of it.) Still, just as the rah-rah spirit of the movie worked in its favor, I will allow that Top Gun 2 may have had a boost in some quarters from its being perceived (rightly or wrongly) as a "conservative" film, perhaps to the point that going seemed in itself a way of "showing support for their side" and a rebuke to the other. By contrast Indiana Jones 5 has been branded a "woke" film in some quarters, diminishing its appeal among those who went to see Top Gun 2 for such reasons. Of course, if "conservative" films have their audience, many a hit has also been scored on the basis of "wokeness"--as with Black Panther or Captain Marvel--but I do not see Indy 5 as having that kind of cachet with that crowd, so there can be no compensating effect from that corner.
4. In the months leading up to Top Gun 2's release, through that release, and after, the media cheerled for the film virtually as one (as seen not only in the 96 percent score it got from the critics on Rotten Tomatoes--as against the much poorer score the virtually identical original had from critics--but in the commentary, where even what look like critical pieces at first glance turn out to be more "Rah-rah!"). Thus far Indy 5 has had nothing to compare with such support--and I see little sign of this changing in the three months between the time of this writing and the film's release.
5. When Top Gun 2 came out in the summer of 2022 it had the advantage of very weak summer competition--a mere three other tentpoles, none of them a really stellar performer--which I am convinced permitted it its particularly strong legs. It hit theaters three weeks after a less than overwhelming Dr. Strange 2, and two weeks before a Jurassic World movie that did rather less well than its two predecessors, while Thor 4 came out four whole weeks later, had only a lackluster run, and was not followed by another equally big movie for months--all of which meant that much more room for Top Gun to keep drawing audiences (the movie's Friday-to-Sunday takes did not slip below the $10 million mark until its tenth weekend in release, on the way to its collecting a staggering six times what it took in its first weekend of play).
Indiana Jones 5 will not have anything close to that. It will come out a mere two weeks after The Flash, which, if it does as well as the buzz suggests (prompting repeat viewings, good word-of-mouth, etc.), could have pretty good legs itself, while Mission: Impossible 7--which may be a stronger performer than usual given the good will toward Tom Cruise in the wake of the very same Top Gun 2 we have been talking about all this time--will be along just two weeks after that, positioning each to take a bite out of Indy's grosses over its first three weekends, in a more generally crowded summer.
Of course, these factors--the weakening of interest in the franchise as a result of past missteps with Indiana Jones 4 and Disney's handling of Lucasfilm generally; the differences in tone, political associations and media attitude; and the more competitive summer season--do not guarantee that Indiana Jones 5 will not "overperform." However, if this were to happen (and admittedly I think it unlikely), it would not be on the basis of those factors that worked so much in Top Gun's favor; its success would have to be achieved in a different way and under different conditions.
Monday, April 3, 2023
Sunday, April 2, 2023
Is This the Singularity . . . or Just Lowering the Bar?
In his important rebuttal to the New Economy hype of the 1990s Robert Gordon pointed to "hand-and-eye coordination" as a significant barrier to the fuller, more thoroughly productivity-enhancing computerization of the economy, while in 2013 Carl Benedikt Frey and Michael Osborne's The Future of Employment study likewise identified "perception and manipulation" as one of the most significant "bottlenecks" for the automation of work. Accordingly, designers of artificial intelligence systems overcoming these barriers--producing computer-guided machines able to perform such eye-hand coordination-demanding, perception-and-manipulation-intensive tasks adequately and reliably--would betoken a great advance in the prospects for automation, and in the technological state-of-the-art in the field of "AI."
Of course, a generation after Gordon wrote, a decade after Frey and Osborne wrote, progress here has been . . . unimpressive. And indeed, after a surge of hype in the mid-'10s during which self-driving cars and the beginnings of a tidal wave of workplace automation were supposed to change everything, just about NONE OF IT HAPPENED (even flipping burgers has been a bigger challenge than appreciated), after which the excitement about AI waned these past few years.
However, in recent months OpenAI's new chatbot, built on "GPT-3.5," and then, just this month, the follow-up GPT-4, have gotten (some) people excited again. Indeed, reading Ezra Klein's New York Times opinion piece on the matter--among many, many others--I was consistently struck by the sense so many had not only of these models' approximation of human capacities, but of a profound acceleration in the rate of progress in that emulation.
It was the kind of acceleration that had me start typing "Is GPT-4 the Singularity?" into Google--and finding the search engine's autocomplete finishing the thought for me, while after I hit ENTER the search results showed that this is exactly the question lots and lots of people seem to be asking, with many answering it in the affirmative. Calum Chace, for example, argues in Forbes that yes, this is a significant step in that direction.
Looking at all that, there seems little doubt that many are greatly impressed by the chatbots' level of functionality--or at least what they are being told about it. (An oft-cited talking point, the more significant for not necessarily being as straightforward as it sounds, is GPT-4's scoring in the 90th percentile on the bar exam, where the GPT-3.5 version had scored only in the 10th.) But it can also seem that, after the comparative bust of the mid-'10s wave of techno-hype--which may have led many to write off certain kinds of automation (e.g. the eye-hand coordination-requiring stuff) as not worth thinking about--observers are easily impressed by developments in an area to which they paid less attention, and about which they have accordingly not become so cynical as they are about, for instance, self-driving cars; while the Silicon Valley-to-Wall Street boosters, their courtiers and claqueurs in the media, and even the "criti-hype" of those folks who shout "Oh no, it's Skynet! We're doomed! DOOMED!" at every development are all making the most of that readiness to believe.
The Summer Movie Season of 2023: Predictions
Last summer, during which it seemed fairly clear that the American and global box offices were both recovering, the take still fell short of what the industry hoped for--in large part, in my view, because there were fewer than the usual number of big "tentpole" releases. (There were just Dr. Strange 2, Jurassic World 3, Top Gun 2 and Thor 4, with the quiver empty after early July--four releases stretched thin over the first two months or so instead of a more evenly spread-out eight over the four-month period.)
By contrast the summer of 2023 looks to be on par with pre-pandemic summers in this respect, with Guardians of the Galaxy 3, Fast and Furious 10, Mission: Impossible 7, Transformers 6, The Flash and even Indiana Jones 5 coming our way in May-July, with Meg 2 and Blue Beetle coming in August.
Will these do the trick?
The Guardians series is relatively well-received ("worst" Chris and all), but box office-wise more Thor than Avengers--while Marvel's brand has not been looking its best lately. The Transformers franchise is even more clearly "not what it used to be." Blue Beetle is an "upgrade" of a formerly straight-to-streaming release about another second-stringer (an alternative to the cancellation that befell Batgirl), and seems a particular long shot for a first-rank hit. And as I have said earlier, I am not very optimistic about Indiana Jones 5, which I think may fall into that ever-larger territory of "huge gross by any sane standard, but arguable flop given what it cost to make, what it netted at the end, and the expectations people must have had for it" (the way that, for example, Ant-Man 3 and Black Panther 2 have done, and maybe not even with more cash in the till).
That leaves the other three. While MI7 may do as well as its predecessors, I do not expect to see it better them--movies which do well enough by spy-fi standards but fall short of the absolute top (though a billion-dollar take does not seem wholly out of the question, if mainly as a function of rising ticket prices, and I can also picture the film getting a bump from the good will of the press and of Top Gun 2's fans toward Tom Cruise). And Fast and Furious 10 will probably do better--though not necessarily beat its predecessors (impressive as their earnings have been).
That leaves The Flash. Apparently the feature film about the character has really, really great buzz--buzz holding it to be better than any DC Universe film seen in a long time, maybe the best of the DC Extended Universe (DCEU) lot, maybe better than anything since Christopher Nolan's Batman films (the assumption, of course, being that these were superlative, an opinion I share less and less). Still, the Flash is no Batman or Wonder Woman; I suspect no character is completely immune to superhero fatigue and DC Universe fatigue; the DC Extended Universe as we know it is known to be on its way out; and I suspect the earlier, absolutely terrible publicity attaching to the person in question (which less than a year ago had people talking of "cancellation" of the film and its star) will not be conveniently and completely forgotten eleven weeks from now. A wildly successful Flash movie is therefore no sure thing, no matter how good it supposedly is--but it may have more potential to surprise observers than anything else this summer.
All the same, even with The Flash proving a hit--even its proving the biggest hit in DCEU history--I expect that all this will add up to a summer that, while clearly better than any of the last three years, looks more like a mediocre pre-pandemic summer than a sensational one.
Has the Hollywood Video Game Adaptation Become Respectable?
I remember how the first Super Mario Bros. movie was supposed to be one of the big summer movies of 1993--right along with such movies as Cliffhanger, Last Action Hero and Jurassic Park.
The reception was much more Last Action Hero than Jurassic Park. Indeed, it would seem to have fared even worse than Last Action Hero, going not just by its much smaller gross, but by how thoroughly it was forgotten--such that while it (probably) had its run on premium cable, I never ran into it on TV in the years afterward. Only in 2008, doing research for a piece on video game adaptations, did I try to seek it out--and as it happened, in those years when DVDs were typically packed with extras in an attempt to make them something people would choose to buy rather than just rent, the one edition of that DVD I tracked down had just the movie on it. Viewing it proved to me that, much as I had hoped otherwise, it was no hidden gem.*
Indeed, fifteen years after its release the movie's principal legacy would seem to have been starting a tradition of poorly received adaptations of the type, with 1994's Double Dragon and Street Fighter not doing much better, etc., etc. With 1995's Mortal Kombat the genre did seem to get its first real money-maker (there was a sequel, after all), but all the same, the growing list of commercial successes (some of which did attract a more durable following than the '93 Super Mario Bros.) did little to diminish the opprobrium.
Of course, it has since been another fifteen years--and there has been a measure of change since. The second crack at a Mortal Kombat franchise was no great commercial or critical success when it came out in 2021, but still seems to have commanded a bit more respect than the first. (Mortal Kombat 2 had a 4 percent rating on Rotten Tomatoes--not a typo!--and a not much better 20 percent from audiences, in contrast with the 54 percent score from the critics and the 86 percent score from audiences the 2021 movie had.) Sonic the Hedgehog has certainly done better than that--been, in fact, rather well-received by pandemic standards (the second movie one of the top ten hits at the recovering 2022 North American box office, while garnering a 69 percent with the critics and a 96 percent with audiences on Rotten Tomatoes). And now, as the action increasingly moves to the small screen, a TV series based on The Last of Us has actually become a prestige TV-loving critical darling.
In all this it is a different world indeed.
* Interestingly, about that time I found out, while teaching a class in science fiction literature, that at the time of its release the movie won over a great many very young fans who, even once they were college-aged, still had fond memories of it. All the same, none of this seems to have gone so far as to produce any evidence of a cult following.
What Do the Superhero Flops Tell Us About the Box Office Today?
It has long been generally recognized that getting audiences out to a movie increasingly requires that they perceive a film's release as an "event." Once upon a time splashy big-budget action blockbusters had event status almost automatically. However, there are now so many of them that something more is required, and increasingly it seems that, even where superhero movies are concerned, the "second-stringers" will not "cut it"--a movie with a less well-known protagonist and a budget in the range of a "mere" $100 million falling into that category, as with a Shazam. Meanwhile, given the anticlimax that anything after the battle with Thanos has been (and probably could only have been), and the lack of ideas that even by superhero blockbuster standards could be called "compelling" in the films that followed (Spider-Man's big multiverse event excepted), it has been harder and harder to make the multiple Marvel releases still coming the audience's way seem like the "event" they were in the days before there had already been thirty of them (!)--as evidenced by the less than triumphant result of the decision to debut the Marvel Cinematic Universe's Phase Five with Ant-Man 3, the second-stringer not delivering the goods even when given a more than usually prominent place in the bigger mythos and backed with a $200 million budget.
We hear from time to time of Marvel fatigue, and that could seem a consideration, but as the case of Shazam 2 and much else shows, there may also be DC fatigue, and superhero fatigue, and big sci-fi actioner fatigue, and blockbuster fatigue. After all, the action movie, and the high-concept blockbuster as we know it generally, go back to the '60s-era James Bond films, thoroughly arrived in Hollywood with Star Wars, and have been a routine and increasingly large part of our cinematic diet since the '80s. The result is that no one under the age of fifty can remember a time when the movie houses were not full of the stuff, which can now seem pretty well played out, even down to the visuals (the CGI boom so clearly underway with the original Jurassic Park now entering its fourth decade). And while, admittedly, critics who thought there was too much such stuff have been forecasting its demise for decades, I suspect they are right in, if not necessarily the boom going bust, then at least the peak for this kind of filmmaking having passed. If so, this will confront the industry with some hard decisions that, by all evidence, it is spectacularly ill-equipped to make. Of course, so it seemed as the studio system was falling apart--and from that emerged a New Hollywood today remembered as one of the most vibrant periods in the history of American cinema. Still, as a significant chronicler of that period recounted, that was a very different world from this one, financially, technologically, politically and culturally, and I suspect that should history indeed repeat itself in this respect, it will do so only as farce.
Indiana Jones and the Dial of Destiny: Predictions
I was actually more favorably disposed toward Indiana Jones and the Kingdom of the Crystal Skull than most--quite willing to go along with how insanely over the top it went, the bit with the refrigerator included (after all, the budgets, effects and spectacle action movies offered had been scaled up over the two additional decades since the last Indiana Jones film, and the movies had never taken themselves too seriously anyway), and to enjoy the transfer of the character to a '50s period setting and the turn from the old religio-magical themes to the Roswell mythos. In the second hour it admittedly became formulaic, but on the whole it was competently executed--and still more fun than most big summer movies. Moreover, in the story's taking place on the eve of a space age by which the old world of his adventures could seem relegated to the past, and in Jones' marrying Marion, becoming a father to Mutt, and maybe even settling into a new "day job," there was some sense of the saga drawing to an end (even if he was holding onto the hat).
Still, a fun enough film is not necessarily all a fan might have hoped for after two decades--or, for that matter, "necessary" somehow, the more so in that the title of the third film (The Last Crusade) had previously implied the rounding off of a trilogy, with the heroes (literally) riding off into the sunset and everything best left lying as it was. And if that was the case then (even as seen by, again, someone more favorable to the film than most), there would seem reason for doubt about the prospects of the fifth film in the series, coming our way this June, especially to go by what we have seen happen with old franchises like this one, and what we have heard about this particular film. After all, while we take Indiana Jones' blockbuster status for granted, there really aren't a lot of similar movies out there, are there? (Once more: period pieces are a tough sell to American audiences, and this goes no less for anachronistic sci-fi stuff like steampunk, "weird Westerns," and the like.) When Raiders of the Lost Ark hit theaters it had the advantage of being a comparative novelty in a less competitive market where really big-budget action movies were fewer, while benefiting from its association with Star Wars at the peak of its cachet (George Lucas had writer and producer credits, and Harrison Ford's Han Solo was barely frozen in carbonite and hauled off by Boba Fett to Jabba's palace)--and, one might add, from a moment when audiences were relatively easier sells on period adventure than they are now. (If the fact seems forgotten now, the '70s were a boom period for World War II material of this kind, with Ford himself, before fighting the Nazis as Jones, having done so as Lieutenant Colonel Mike Barnsby in the Guns of Navarone sequel Force 10 From Navarone and as Army Air Force Lieutenant David Halloran in Hanover Street--neither a great hit, admittedly, but both testifying to that fashion, and perhaps to audiences having been prepared to see Ford in a period piece, just as the disappointment of his later Cowboys & Aliens can seem to suggest the opposite.)
Of course, having been such a big success then gives the franchise a boost that a freshly launched one would not enjoy--but Hollywood has time and again run the biggest of '80s successes into the ground, as the Rambo, Die Hard and Terminator franchises demonstrate, with Disney-era Star Wars showing that no franchise is immune. If it can happen to Star Wars, it can certainly happen to that other Lucas-Ford franchise, whose grosses were already trending downward in the '80s, with the fourth film no break from the pattern. While at the North American box office it made the most in current dollars--over $317 million, as against Raiders of the Lost Ark's $212 million, Indiana Jones and the Last Crusade's $197 million, and Indiana Jones and the Temple of Doom's $179 million--adjusting the figures for inflation (certainly if one uses the Consumer Price Index to get the values in 2022 dollars) produces a very different result*:
Raiders of the Lost Ark--$687 million
Indiana Jones and the Temple of Doom--$510 million
Indiana Jones and the Last Crusade--$468 million
Indiana Jones and the Kingdom of the Crystal Skull--$434 million
One may add that the pattern has held, more or less, at the international level--a fact especially worth remembering when one considers how much the opportunity to multiply the domestic gross has expanded.
Raiders of the Lost Ark--$390 million ($1.263 billion in 2022 dollars)
Indiana Jones and the Temple of Doom--$333 million ($944 million in 2022 dollars)
Indiana Jones and the Last Crusade--$474 million ($1.123 billion in 2022 dollars)
Indiana Jones and the Kingdom of the Crystal Skull--$791 million ($1.08 billion in 2022 dollars)
Only Indiana Jones and the Temple of Doom was outdone by its successor (that film's particularly dark tone may well have been a factor). It is worth considering, too, that as the grosses went down the budgets went up, in nominal, to say nothing of "real," prices.
Raiders of the Lost Ark--$18 million ($58 million in 2022 dollars)
Indiana Jones and the Temple of Doom--$28 million ($79 million in 2022 dollars)
Indiana Jones and the Last Crusade--$48 million ($114 million in 2022 dollars)
Indiana Jones and the Kingdom of the Crystal Skull--$185 million ($253 million in 2022 dollars)
If one goes by the rule of thumb that a movie released today may need to gross four to five times its production budget in order to turn a profit in its theatrical run then, frankly, Indy 4 looks pretty marginal compared to the far cheaper Raiders--a factor that may well have played into the very long wait for the fifth film, a wait that in turn likely affected its prospects negatively. When it has been fifteen years since the last Indiana Jones film--and that one, which was not too well loved by audiences, came nineteen years after the preceding film (such that its original appearance is unremembered by anyone under forty)--one may have doubts about just how much the "nostalgia" factor so critical to making a movie like this one a hit will work in its favor. This may be more problematic still in the foreign markets, where Indiana Jones films are moderately good but not great earners (internationally the last two films made about one-and-a-half times their domestic gross, rather than two or more times as much the way Marvel and Avatar movies do), and the film will have a handicap at the outset in Indiana Jones 5's reportedly not playing in China, to absolutely no one's surprise. (Ford has publicly supported Tibetan independence, which, of course, has not pleased Beijing.)
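For anyone who wants to follow the arithmetic, here is a minimal Python sketch of the two calculations used above: converting the nominal domestic grosses to approximate 2022 dollars, and applying the four-to-five-times-budget rule of thumb. All dollar figures are the ones quoted in this post; the inflation multipliers are simply the ratios implied by those quoted conversions rather than an official CPI table, and the break-even "band" is only the rough rule of thumb, not studio accounting.

```python
# Minimal sketch of the inflation adjustment and the 4x-5x-of-budget rule of thumb
# discussed above. All figures are as quoted in this post (Box Office Mojo-derived);
# the multipliers are assumptions implied by the quoted 2022-dollar conversions.

DOMESTIC = {  # title: (nominal domestic gross in $M, assumed multiplier to 2022 dollars)
    "Raiders of the Lost Ark (1981)": (212, 3.24),
    "Temple of Doom (1984)": (179, 2.85),
    "Last Crusade (1989)": (197, 2.38),
    "Crystal Skull (2008)": (317, 1.37),
}

def to_2022_dollars(nominal_millions: float, multiplier: float) -> float:
    """Approximate 2022-dollar value of a nominal figure via a CPI-style multiplier."""
    return nominal_millions * multiplier

def breakeven_band(budget_millions: float) -> tuple[float, float]:
    """Rough theatrical break-even band under the 'gross 4x-5x the budget' rule of thumb."""
    return 4 * budget_millions, 5 * budget_millions

for title, (gross, mult) in DOMESTIC.items():
    print(f"{title}: ~${to_2022_dollars(gross, mult):.0f}M domestic in 2022 dollars")

# Crystal Skull: $185M budget against the ~$791M worldwide gross quoted above--marginal.
low, high = breakeven_band(185)
print(f"Crystal Skull needed roughly ${low:.0f}M-${high:.0f}M worldwide; it took ~$791M")

# Dial of Destiny, at the rumored ~$300M budget discussed below: even a $1,000M gross
# would sit at or below the bottom of the band.
low, high = breakeven_band(300)
print(f"Dial of Destiny would need roughly ${low:.0f}M-${high:.0f}M worldwide")
```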
So in short: the movie's being a tougher product to sell than it looks, given its period character; the long downward trend in the grosses (as the budgets keep going up); the likely waning of the nostalgia that might help it; and, internationally, the possibility of its exclusion from China--all of this will cut against any new Indiana Jones film at the box office, even before we get to the specific movie at hand. There are hints that James Mangold of Logan fame may do for Indiana Jones with this one what he did for the X-Men--a gamble which, divisive if ultimately successful there, seems considerably less likely to pay off here (and Mangold's assurances that Indy 5 will not play like his Wolverine movie may not be persuasive to everyone). The reports of disastrous test screenings, denied but so far as I can tell not debunked, do not augur well for it. And some may well see other creative decisions as . . . unpromising. (Few figures in the world of entertainment have had the benefit of so much claquing-until-you-are-sick-of-hearing-about-them this past decade as Phoebe Waller-Bridge, and I do not doubt that those who, for whatever reason, responded to Killing Eve and Fleabag with unhinged, critical-reaction-to-Sopranos-like paroxysms of praise are delighted to see her prominently cast in the movie--but those not sharing their opinions about life, the universe and everything will . . . feel differently. Moreover, while the professional "anti-woke" brigade have, as usual, made their position very clear, they are unlikely to be wholly alone in their reservations about this decision--all as one may remember that the last time Waller-Bridge was in a Lucasfilm project, it was . . . Solo.)
In light of all that I am not bullish on this one. Indeed, I think Indy 5 will be the movie that reduces Indiana Jones from the tiny and ultra-exclusive club of "absolute, inarguable, classic, totally failure-proof franchise" to "just another movie franchise" (as Star Wars had become by 2017-2019), and maybe even to "franchise old people see mainly out of nostalgia" (as the James Bond films have seemed to be, especially after the Waller-Bridge co-scripted No Time to Die--yes, there she is again!), because in contrast with all those years in which it finished at or near the top (Raiders was #1 in the U.S. in its year, Temple of Doom #3, Last Crusade and even Crystal Skull both #2, with their global standing usually at least as good), others will have clearly outdone it globally and even domestically. This will be all the more the case insofar as, given the colossal budget behind the film (we are hearing figures in the range of $300 million tossed about), even a billion dollars grossed could still leave the backers in the red at the end of the theatrical run, anxiously waiting for the accompanying and subsequent revenues to add up simply to the break-even point. (And, again, nobody goes into business to "break even!") And even if I do not expect to see it fall as low as Solo did (in today's terms, a catastrophically low $450 million), I would not be shocked by a mere $800 million or even $700 million taken in globally by the end of the run--which would put another crater-sized hole in Disney's already hole-riddled books.
* Save for the data on Indiana Jones and the Last Crusade's global box office gross (gleaned from The Hollywood Reporter), and the information on Dial of Destiny's budget, all budgetary and box office figures were derived from the Boxofficemojo.com web site.
The Last of Us: An Unconventional Take
As the amount of media content on offer has burgeoned, so has the quantity and intensity of the claquing intended to convince us that each and every last banal product of the colossal Masscult-generating machinery that is Big Media is a work of staggering, heart-breaking genius offering an experience we will never forgive ourselves for missing out on the least little bit.
I have long since come to find this wearying--so much so that even the occasional glance at the entertainment news is so often tiring, with all this especially going for the "prestige TV" churned out by the Midcult division of the Masscult machinery, the claquing on behalf of which has made the unhinged levels of critical acclaim with which The Sopranos was presented to the world increasingly the norm. Recently this claquing has extended to The Last of Us, which seems to be the successor to The Walking Dead as a prestige TV zombie show craze (because, apparently, someone decided that this trend simply cannot be allowed to end).
Naturally I have kept an eye open for critical takes (especially ones that do not fall into the standard, tiresome "culture warriors vs. SJWs" pattern), and Ed Hightower supplies just that. As one might guess from his colleague Carlos Delgado's description of the game on which the show is based as "a thoroughly miserable experience; nasty, gratuitously violent and misanthropic . . . more of a symptom of a coarse and violent culture than a commentary upon it," for Hightower the show "marks something of a new high-water mark for savage violence and self-assured misanthropy in the entertainment industry"--a drama in which "unspeakable cruelty" of various kinds is "most of the plot," and in between acts of which cruelty the "dialogue barely rises above the level of grunts and barks." And, contrary to what those of fashionable, dark-and-gritty-loving taste may imagine, this was not meant as a compliment, the work being not only repellent but lazy, shallow and, rather than trite-but-true, trite-but-false throughout.
Sounds about right for this kind of thing.
Thursday, March 23, 2023
What Are the Prospects for Controlling the Pace of Artificial Intelligence Research Through International Agreement?
A Follow-Up to "Ezra Klein on AI in the New York Times: A Reaction."
In making his case regarding the potentially dire consequences of artificial intelligence research--not only in the long run but in the fairly near term, at the current pace of such research--Ezra Klein called for an international regime to control research into the technology, to the end of slowing it down, in the absence of some acceleration of effective human adaptation to its results (or even alongside such adaptation).
While I do not share Mr. Klein's view on this particular matter, his premise that something he sees as a problem may be redressed is certainly healthier than the apathy-encouraging doomism that an exceedingly crass and corrupt media delights in inflicting on the public every minute of the day. Still, I have to confess that the record of international cooperation in recent decades on any issue of importance has not been . . . inspiring; that the trend in political life for the moment seems to be away from the circumstances conducive to such cooperation, rather than toward them; and that the nature of this issue may make it particularly tough to address in this manner.
Consider, for example, those problems that have most seemed to demand global cooperation--where a potentially existential danger to human civilization was pressing, and nothing less than the whole world working together would really do in meeting that danger. The two most obvious are nuclear weapons and climate change--with the failure here the more pointed because of what states officially committed themselves to in the most public way. Consider the text of the 1968 Treaty on the Non-Proliferation of Nuclear Weapons. This has the undersigned declaring "their intention to achieve at the earliest possible date the cessation of the nuclear arms race and to undertake effective measures in the direction of nuclear disarmament," with this explicitly including agreeing to work toward a treaty on "general and complete disarmament under strict and effective international control."
By contrast, where artificial intelligence is concerned we would seem a much longer way from even such gestures in the direction of action as the Non-Proliferation Treaty or the 2015 Paris Climate Agreement--comparatively few are persuaded that any threat it poses is nearly so dire, with, not insignificantly, broad attitudes toward the issue differing greatly across cultures and nations. Significantly, it has often been remarked that in East Asia--a particularly critical region for the development of the technology--attitudes toward AI are more positive than in the West, with a Pew Research Center survey from December 2020 providing some recent statistical confirmation. (In Japan, Taiwan and South Korea, 65 percent or more of those polled said the development of AI was a "good thing" for society, and no more than 22 percent said it was a bad one--with, one might add, the Pew Center's polling in India producing similar numbers. By contrast, in the U.S. 47 percent identified it as a "good thing" and 44 percent as a bad one, the responses in fellow G-7 members Britain, Germany and Canada were similar, and respondents in France were even more negative. Even those Westerners most positively disposed toward AI, the Swedes, among whom the positive/negative split was 60-24, still fell short of the numbers of all of the Asian countries named here.)
Were major Asian countries such as Japan and India--or a China that plausibly sees the matter similarly--to refuse to sign such an agreement, it would be effectively a dead letter. However, I suspect we would be unlikely to get anywhere near such a consensus even in the more AI-phobic West. It matters, for example, that positive views on the matter are correlated with education--suggesting that elites, with all that implies for policy, are more open to the idea of AI than the population at large. (In the U.S., for instance, 50 percent of those polled with "more education" thought AI a good thing, as against 41 percent of those with "less education.") It matters, too, that the young are more inclined to positive views, suggesting such attitudes will become more rather than less common in the years ahead. (Those below median age were more likely than those above it to think AI a good thing, by a margin of 49 to 44 percent.) The result is that increasing openness to AI, rather than the opposite, seems to be the trend in the West. (Meanwhile, if the trajectory in Asia parallels that in the West--as it actually seems to do, by at least as large a margin, with the more educated in Japan 10 points more likely than the less educated, and those below median age 17 points more likely than their elders, to think AI a good thing--we will see an even more positive attitude there, with all that implies for policy in that region.)
I suspect it would take a great deal to change attitudes here--especially among, again, the more favorably inclined policymaking groups, the more in as governments and business presently seem to hope for so much from artificial intelligence (which, if experience in areas like climate change is anything to go by, will very easily outweigh very strong reservations among the broad public). Certainly in the United States a Wall Street embrace of AI such as some now speculate about would make any action seriously restricting it all but unimaginable to go by recent political history, while the deference of officials to such interests aside, concern for plain and simple state power will also be operative around the world. After all, we live in an age in which realpolitik is not declining but intensifying, with one dimension of this intensifying military competition--and susceptibility to technological hype in the military sphere, with at least one world leader of note having publicly expressed expectations that artificial intelligence superiority will be "ruler of the world." Such a statement is not to be taken lightly, the more in as he has just said what others are doubtless thinking, not least because countries facing a situation of geopolitical disadvantage they regard as intolerable (and many do) may see in AI their best hope of "leveling the field." Meanwhile, there is the humbler matter of economic productivity, perhaps the more urgent at a time of economic stagnation at best (or, depending on how does the math, collapsing output), and mounting economic "headwinds," with, again, East Asia worth mentioning here given its stagnant, aging, even shrinking populations. To cite simply the most dramatic example Japan has seen its working-age population collapse over this past generation, falling from 87 million in 1994-1995 to 73 million in 2021, with no end in sight. For a country in such straits, the prospect of AI-powered productivity coming to its economic rescue cannot be overlooked.
All that, too, will factor into the likelihood of an international AI control regime--or the opposite--and the likelihood of states abiding by it. Moreover, even were it possible to institute an agreement, in relative good faith, enforcement would be extremely challenging given the character of AI research. While CNBC remarked that the technology's development was "expensive" the figure for developing and training a program like GPT-3 may have been in the vicinity of $4 million.
That's it. Just $4 million. This may mean that the average person cannot afford to finance a "startup" such as this out of pocket, however much they would like to "get in on the ground floor here." But by the standards of high-technology corporate R & D this is nothing. Repeat, nothing. (According to a report from a few months ago some $75 billion--enough to fund 20,000 GPT-3s--have been spent on self-driving technology--likely to less result in the way of a useful product.) Still more is it nothing next to the resources governments have and are willing to apply to those ends they prioritize. (The Pentagon is requesting $130 billion for a single year's R & D.)
When the "barrier to entry" is that low it is that much more difficult for an international regime to effectively enforce the rules. Indeed, any number of wealthy individuals, who think little of spending nine figures on grand homes and gigayachts; and many of whom are famed for their transhumanist inclinations; could easily afford to fund such research on a personal basis even where they could not do so via their command of their business empires. Moreover, a program of this kind would be relatively easy to keep secret, given the nature of computer research--in contrast with the physical facilities required for the development, production and deployment of, for example, an arsenal of nuclear missiles, with all the temptation that the possibility of discretion, as well as cheapness, affords.
Of course, it may be that no obstacle is wholly insuperable. However, separately, and still more together, they are extremely formidable--enough so to have consistently defeated aspirations to meaningful cooperation in the past. Moreover, if Mr. Klein is right the proverbial clock is ticking--with perhaps not very much time left before the question of the "opening of the portal" becomes mooted by the actual event.
If one takes that prospect seriously, of course.
In making his case that artificial intelligence research, at its current pace, carries potentially dire consequences not only in the long run but in the fairly near term, Ezra Klein called for an international regime to control research in the technology, to the end of slowing it down, in the absence of some acceleration of effective human adaptation to its results (or even alongside such adaptation).
While I do not share Mr. Klein's view on this particular matter, his premise that something he sees as a problem may be redressed is certainly healthier than the apathy-encouraging doomism that an exceedingly crass and corrupt media delights in inflicting on the public every minute of the day. Still, I have to confess that the record of international cooperation in recent decades on any issue of importance has not been . . . inspiring; that the trend in political life for the moment seems to be away from the circumstances conducive to such cooperation rather than toward them; and that the nature of this issue may make it particularly tough to address in this manner.
Consider, for example, those problems that have most seemed to demand global cooperation--where a potentially existential danger to human civilization was pressing, and nothing less than the whole world working together would really do in meeting it. The two most obvious are nuclear weapons and climate change--with the failure the more pointed because of what states officially committed themselves to in the most public way. Take the text of the 1968 Treaty on the Non-Proliferation of Nuclear Weapons, which has the undersigned declaring "their intention to achieve at the earliest possible date the cessation of the nuclear arms race and to undertake effective measures in the direction of nuclear disarmament," with this explicitly including agreeing to work toward
the cessation of the manufacture of nuclear weapons, the liquidation of all their existing stockpiles, and the elimination from national arsenals of nuclear weapons and the means of their delivery pursuant to a Treaty on general and complete disarmament under strict and effective international control.

Over a half century on, just how seriously would you say "the undersigned" took all this? How seriously would you say they are taking it now--as the bilateral arms control agreements between the U.S. and Soviet Union of the Cold War era, at best steps in the right direction, fall by the wayside with the most uncertain consequences, and acknowledged nuclear powers like Britain and China expand their arsenals while racing to field a new generation of super-weapons straight out of a comic book super-villain's laboratory? Meanwhile, the latest United Nations report on climate change making the rounds of the headlines (among much else) makes all too clear how little governments have even attempted to do relative to the agreements they have signed, never mind what the scale of the problem would seem to demand.
By contrast, where artificial intelligence is concerned we would seem a much longer way from even such gestures in the direction of action as the 2015 Paris Climate Agreement--comparatively few persuaded that any threat it poses is nearly so dire, with, not insignificantly, broad attitudes toward the issue differing greatly across cultures and nations. Significantly it has often been remarked that in East Asia--a particularly critical region for the development of the technology--attitudes toward AI are more positive than in the West, with a Pew Research Center survey from December 2020 providing some recent statistical confirmation. (In Japan, Taiwan and South Korea 65 percent or more of those polled said the development of AI was a "good thing" for society, and no more than 22 percent said it was a bad one--with, one might add, the Pew Center's polling in India producing similar numbers. By contrast in the U.S. 47 percent identified it as a "good thing" and 44 percent as a bad one, the responses in fellow G-7 members Britain, Germany and Canada similar, and respondents in France even more negative. Even those Westerners most positively disposed toward AI, the Swedes, among whom the positive/negative split was 60-24, still fell short of the numbers in all of the Asian countries named here.)
Were major Asian countries such as Japan and India--or a China that plausibly sees the matter similarly--to refuse to sign such an agreement it would be effectively a dead letter. However, I suspect that we would be unlikely to get anywhere near the point of such a consensus even in the more AI-phobic West. It matters, for example, that positive views on the matter are correlated with education--suggesting that elites, with all this implies for policy, are more open to the idea of AI than the population at large. (In the U.S., for instance, 50 percent of those polled with "more education" thought AI a good thing, as against 41 percent of those with "less education.") It matters, too, that the young are more inclined to positive views, suggesting such attitudes will become more rather than less common in the years ahead. (Those below the median age were more likely than those above it to think AI a good thing by a margin of 49 to 44 percent.) The result is that increasing openness to AI, rather than the opposite, seems to be the trend in the West. (Meanwhile, if the trajectory in Asia parallels that in the West--as it actually seems to do, by at least as large a margin, with the more educated in Japan 10 percent more likely than the less educated, and those below the median age 17 percent more likely than their elders, to think AI a good thing--we can expect attitudes there to grow more positive still, with all that implies for policy in the region.)
I suspect it would take a great deal to change attitudes here--especially among, again, the more favorably inclined policymaking groups, the more in as governments and business presently seem to hope for so much from artificial intelligence (hopes which, if experience in areas like climate change is anything to go by, will very easily outweigh even strong reservations among the broad public). Certainly in the United States a Wall Street embrace of AI such as some now speculate about would, to go by recent political history, make any action seriously restricting it all but unimaginable, while, the deference of officials to such interests aside, concern for plain and simple state power will also be operative around the world. After all, we live in an age in which realpolitik is not declining but intensifying, one dimension of this being intensifying military competition--and susceptibility to technological hype in the military sphere, with at least one world leader of note having publicly declared that the leader in artificial intelligence will be "ruler of the world." Such a statement is not to be taken lightly, the more in as he has merely said what others are doubtless thinking, not least because countries facing a situation of geopolitical disadvantage they regard as intolerable (and many do) may see in AI their best hope of "leveling the field." Meanwhile, there is the humbler matter of economic productivity, perhaps the more urgent at a time of economic stagnation at best (or, depending on who does the math, collapsing output) and mounting economic "headwinds," with, again, East Asia worth mentioning here given its stagnant, aging, even shrinking populations. To cite simply the most dramatic example, Japan has seen its working-age population collapse over the past generation, falling from 87 million in 1994-1995 to 73 million in 2021, with no end in sight. For a country in such straits, the prospect of AI-powered productivity coming to its economic rescue cannot be overlooked.
All that, too, will factor into the likelihood of an international AI control regime emerging--or not--and the likelihood of states abiding by it. Moreover, even were it possible to institute an agreement in relative good faith, enforcement would be extremely challenging given the character of AI research. While CNBC remarked that the technology's development was "expensive," the figure for developing and training a program like GPT-3 may have been in the vicinity of just $4 million.
That's it. Just $4 million. This may mean that the average person cannot afford to finance a "startup" such as this out of pocket, however much they would like to "get in on the ground floor here." But by the standards of high-technology corporate R & D this is nothing. Repeat, nothing. (According to a report from a few months ago some $75 billion--enough to fund 20,000 GPT-3s--has been spent on self-driving technology, likely with less to show for it in the way of a useful product.) Still more is it nothing next to the resources governments have, and are willing to apply to, those ends they prioritize. (The Pentagon is requesting $130 billion for a single year's R & D.)
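For what it is worth, a quick back-of-the-envelope comparison of those orders of magnitude (using the rough figures cited above, which are estimates rather than audited totals) runs as follows:

```python
# Back-of-the-envelope comparison of the rough figures cited above.
# All values are approximations from the discussion, not audited totals.
gpt3_cost = 4e6            # ~$4 million to develop and train a GPT-3-class program
self_driving_spend = 75e9  # ~$75 billion reportedly spent on self-driving technology
pentagon_rd = 130e9        # ~$130 billion requested for a single year of Pentagon R & D

print(f"GPT-3-scale efforts fundable by the self-driving spend: {self_driving_spend / gpt3_cost:,.0f}")
print(f"GPT-3-scale efforts fundable by one year of Pentagon R & D: {pentagon_rd / gpt3_cost:,.0f}")
# Roughly 18,750 and 32,500 respectively--of a piece with the "20,000 GPT-3s" noted above.
```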
When the "barrier to entry" is that low it is that much more difficult for an international regime to effectively enforce the rules. Indeed, any number of wealthy individuals, who think little of spending nine figures on grand homes and gigayachts; and many of whom are famed for their transhumanist inclinations; could easily afford to fund such research on a personal basis even where they could not do so via their command of their business empires. Moreover, a program of this kind would be relatively easy to keep secret, given the nature of computer research--in contrast with the physical facilities required for the development, production and deployment of, for example, an arsenal of nuclear missiles, with all the temptation that the possibility of discretion, as well as cheapness, affords.
Of course, it may be that no obstacle is wholly insuperable. However, separately, and still more together, these obstacles are extremely formidable--enough so to have consistently defeated aspirations to meaningful cooperation in the past. Moreover, if Mr. Klein is right the proverbial clock is ticking--with perhaps not very much time left before the question of the "opening of the portal" is mooted by the actual event.
If one takes that prospect seriously, of course.
Wednesday, March 22, 2023
If the Cinematic Blockbuster Goes, Then What?
For decades now film--especially to the extent that we identify film specifically with what people see in a theater--has seemed ever more dominated by cinematic "blockbuster" spectacles of essentially two types: either "family"-oriented animated features tending toward comedy, and especially musical comedy; or action-adventure that has tended to be science-fiction or fantasy-themed, exemplified by the superhero movie. If anything, the pandemic has sharpened the tendency in this direction.
Of course, that raises the question of what will happen if these blockbuster genres go the way of, for instance, the "serious drama" or the romantic comedy--a prospect that is far from implausible. After all, those narratively limited forms of filmmaking are fairly well-worn by now, while the particular franchises that Hollywood so favors are ever more exhausted (witness Disney's troubles with Star Wars and Marvel, and, in the wake of Warner Bros.' failures with the DC Universe, the desperation surrounding the talk of "James Gunn to the rescue!"). Meanwhile the big studios that finance such projects are in a financially battered state (with things perhaps getting worse before they get better, assuming they ever do get better). Between falling returns and the shakiness of the investors' portfolios, something could give--including the readiness to put up $300 million or more for the production, promotion and distribution of a big movie.
In such an eventuality the most obvious outcome would, in spite of the desire of the film studios to double down on theaters (and the income from $20-a-ticket sales), seem to me to be the continued decline of the theaters--such venues becoming less numerous, and, to go by what we are hearing of their business practice in these very lean years for them (in 2022 the box office gross, if recovering, was, even after Top Gun 2, Marvel's diminished but still not insignificant earnings from its movies, and the release of Avatar 2, still just half what it had been pre-pandemic), becoming more and more "special events" venues or general entertainment centers rather than houses principally oriented to screening new releases for a steady flow of filmgoers.* Meanwhile, with theatrical income ever less to be relied upon--and theatrical release and the publicity it justifies financially less reliable as a way of competing in the "attention economy"--one could expect the downward pressure on production budgets seen in non-blockbusters these days to continue. Just what that will mean for the content of film, however, will depend on the progress of cinematic technology--and perhaps in particular the possibility of making a TV show look like an "A" picture. The more enthusiastic assessments of the small-screen Star Wars incarnations of recent years generally credit those series with delivering that, and ultimately it may be here that action-adventure of that type will live on; while, if a seemingly more remote prospect, the quality of animation on the small screen may begin to approximate that on the large.
* Together Dr. Strange 2, Thor 4 and Black Panther 2 made $1.2 billion in North America--perhaps not all that had been hoped for, but far from trivial, and indeed the kind of money that for most of 2020 and 2021 producers could only fantasize about (as when Black Widow, The Eternals and Shang-Chi, coming out in the five-month July-November 2021 period, failed to crack $600 million between the three of them). And of course, 2022 also saw Marvel take in another $200 million via Spider-Man's ticket sales from New Year's Day on.
Headwinds for the Cinematic Blockbuster?
When considering the way the movie business has become more blockbuster-centered it seems one should be attentive not only to the way in which blockbusters crowd other kinds of fare out of the market (driving it to streaming, and usually, to obscurity on streaming), but also to the frailty of the blockbuster model itself, and what that frailty means for the industry now so reliant on it. In particular I see that frailty as leaving it susceptible to at least four "headwinds" worth discussing:
1. The Continued Decline of Theater-Going.
As of 2023 people are returning to their pre-pandemic theater-going habits. However, it is worth remembering that those habits were already under strain before the pandemic, which seems to have encouraged the tendency to not come out for anything but the biggest "tentpoles." One may add that the pandemic has meant that when people are willing to "come out" there are fewer places for them to go. Actual theater closures have been less numerous than I once thought they might be given the way ticket sales crashed during the pandemic and have since stayed depressed--the total number of theaters shrinking by less than a tenth. Still, the theaters that did close have likely taken a certain amount of custom with them (as some people have lost a favorite theater, now have to travel a longer way to get to any theater, etc., making them less easily persuaded to go), while one cannot be sure the closure wave has run its course, especially given the broader economic situation. Besides the obvious implications of downturn and rising interest rates for theaters, as for all other business, there is what it means for the moviegoing public. In brief, between inflation and recession those who are seeing movies again may be more careful with their money, while, one should not forget, for many the trip to the theater (the pricey tickets, gas and parking, the cost of obscenely marked-up concessions, etc., as against staying home and watching streaming while eating what they have in the refrigerator) is a luxury--with all that means for the viability of the remaining theaters. All of this may portend further disruption of an ever-more fragile habit of theater-going--and of the revenue stream from the $20 theater tickets that Hollywood cannot get along without in anything like its current form.
2. The Accumulating Holes in the Movie Industry's Balance Sheets.
In considering the matter of interest rates it is worth acknowledging that if this has been a problem for theaters it has probably been a bigger one for the studios. During the period of ultra-loose monetary policy central bankers made giving away money to investors a cornerstone of the growth model--and business gorged itself on the effectively free money, the would-be streaming giants no exception as they funded vast amounts of "content" for the sake of building up a subscriber base, with relatively little thought for the limits to its profitability. Add to that the revenue collapse associated with the depression of theater-going--three years long as of this month, and counting--and the pressure to borrow that much more cash. Of course, the surge of inflation reduced the "real" weight of studio debts--but the more recent rise in interest rates meant the cheap borrowing was at an end. Between that debt accumulation and the higher cost of borrowing something has to give--and indeed it has. Certainly we see that in the greater reserve toward streaming at Netflix, WBD and Disney, among others. And even as the studios, in the wake of their unhappy experiments with releasing big-budget movies to streaming on their opening weekend in theaters, affirm their commitment to theatrical release, it seems implausible that all this is not factoring into their thinking about the money they put up for those theatrically released features. This may not make them more willing to invest in small movies--but they are still likely to be more reserved toward big ones than they were before, with that tendency reinforced by the fact that their prospects with both audiences and investors may be shrinking.
3. The Fragmentation of the Global Film Market.
Amid the wretched, crisis-ridden performance of the neoliberal economic vision, worse than ever this past decade; the resurgence of dissent against it that has substantially taken, or been channeled into, nationalistic forms; and the deterioration of great power relations, exemplified by the fighting in Ukraine and the looming danger of war in the Far East; the world economy is "deglobalizing." It is unclear how far this process will be allowed to go in so deeply interconnected a world (even now the level of global economic integration represented by trade and investment flows has no precedent from before the twenty-first century), but cinema (first and foremost an economic enterprise to those in charge of the industry) has certainly not been untouched by the process. The result has been diminished access for Hollywood to critical international markets, with all that spells for its earnings (Marvel's Phase Four being shut out of China, for instance, likely cost it its cut of hundreds of millions of dollars' worth of ticket sales in that country, and much else besides). It has also meant reduced access to sources of financing for such movies. (Who would have thought that in making even Top Gun 2 Hollywood would have got financing from Chinese and Russian investors?) The result is that the Hollywood studios are being cut off not only from ticket-buyers on whom they depend, but from important sources of capital, just as, again, rising interest rates have put the squeeze on domestic funding--and the number of safe investments for them to make dwindles.
4. The Wearing Out of the Old Franchises--and the Lack of New Ones.
In our time it is a truism that businesses all across the media spectrum have in recent decades preferred mining old successes to prospecting for new riches. (Consider just how old just about all those comic book superheroes are. Happy eighty-fifth birthday, Superman!) These days the tendency seems to be catching up with them, as just about every one of those franchises looks old and tired--and indeed, even the genres to which they belong look old and tired. After all, if the current superhero boom can be traced back to 2000's X-Men, then we have had nearly a quarter of a century of big-budget A-list superhero features as a standard and increasingly large part of our cinematic diet--an extremely long run for any cinematic fashion, exemplified by how studios have fallen into the habit of telling and retelling and re-retelling the same stories about the same characters over and over and over again. (Thus it is not just the case that we have more X-Men movies; we have seen remakes of the origin stories of both Batman and Spider-Man, while the X-Men's Dark Phoenix saga is now a twice-told tale.) Even the claqueurs of the entertainment press can no longer pretend otherwise--while as Star Wars and Marvel and DC look ever creakier it is far from clear that anything else can serve in their places after all those long years of no one even trying to make anything new.
I suspect that the irony goes right over the heads of the vulgarians calling the shots. Still, limited beings that they are, they are not--in spite of the PR efforts that only the stupidest person would believe, and the suck-up writing of courtiers who would tell us these people actually care about movies, or anything else but their salaries--wholly oblivious to the fact that they have been foisting stale garbage on the public over and over and over again. Certainly to go by the revelations of the Sony hack of a decade ago the executives were well aware of what they had on their hands with the Bond film Spectre long before disappointing reviews and disappointing box office revenue confirmed it for everyone. Some of them may well be feeling the same way about their handling of more recent high concept action-adventure franchises--having caught on to the fact that these are being run into the ground the same way they ran "the one that started it all" into the ground. But even if this is so I have no expectation whatsoever of their changing tack anytime soon--just of their getting tackier, as they ever more shamelessly serve up the same refuse to ever-diminishing returns, financially as well as creatively.
Tuesday, March 21, 2023
Is an Artificial Intelligence Bubble Emerging?
Back at the turn of the century I was intrigued by the flurry of predictions regarding the progress of artificial intelligence most popularly identified with Ray Kurzweil. (Indeed, all this was a significant source of inspiration for Surviving the Spike.) Of course, all of this looked wildly off the mark a decade on, and I declared AI a bust--thinking in particular that Kurzweil had simply expected too much of the neural net-based pattern recognition on which so many of his other predictions hinged (while also being overoptimistic about the progress toward other relevant technologies like carbon nanotube-based 3-D chips potentially permitting computers three orders of magnitude faster).
Of course, in the mid-'10s there was a new burst of excitement based on progress in exactly the key area, neural nets and their training to recognize patterns. At my most optimistic I admitted to
find[ing] myself increasingly suspecting that Kurzweil was accurate enough in guessing what would happen, and how, but, due to overoptimism about the rate of improvement in the underlying technology, was off the mark in regard to when by a rough decade.

Suspecting it enough, at least, to be more watchful of a scene to which I had paid little heed for many years--with, I thought, how self-driving cars fared a particularly good indicator of just how substantial, or insubstantial, the progress was. Whether or not an artificial intelligence could autonomously, reliably, safely navigate a large vehicle through a complex, variable, unpredictable urban environment seemed to me a meaningful test of progress in this area--and its passing the test reason to take the discussion seriously.
Alas, here we are in 2023 with, even after years of extended time on the test, such vehicles decidedly not passing it, and not showing much sign of doing so very soon--the hype, in fact, backfiring here. Meanwhile other technologies that might have afforded the foundations of later advance have similarly disappointed (the carbon nanotube-based computer chip remaining elusive, and nothing else any more likely to provide a near-term successor to silicon). Indeed, the pandemic drove the fact home. At a moment when business strongly desired automation there was just not much help on offer from that direction, the event in fact a reminder of how little replacement there has yet been for human beings physically showing up to their workplace to perform innumerable tasks with their own hands.
The resulting sense of a very strong, very recent bust made it very surprising when, these past few months, the media started buzzing again about AI--and a tribute to the unsophistication of our journalists given the cause of the fuss. Rather than some breakthrough in computer chip design promising much more powerful computers with all they might do, or in the fundamental design of neural nets such as might make them radically more efficient learners (enabling them to overcome such critical "bottlenecks" to automation as "perception and manipulation" or "social intelligence"), the excitement is about "chatbots," and in particular a chatbot based less on fundamental design innovation in the underlying neural network than on simply making the network bigger for the sake of "language prediction." Indeed, those less than impressed with the chatbots term them, unflatteringly but so far as I understand it not unfairly, glorified autocomplete functions.
Of course, it may well be that a better autocomplete function will surprise us with just what it can do. We are, for example, being told that it can learn to code. Others go further and say that such chatbots may be so versatile as to "destroy" the white collar salariat in a decade's time--or even warn that artificial intelligence research has reached the point of Lovecraftian horror story "meddling with forces we cannot possibly comprehend."
In that there seems the kind of hype that creates bubbles, though admittedly many other factors have to be considered, not all of which fit the bubble-making pattern. After all, this has been a period of rising interest rates, tightening monetary policy and falling asset values relative to liabilities that came to be piled very high indeed amid cheap borrowing and economic catastrophe--hardly the sort of thing that encourages speculative euphoria. Still, as banks start failing it remains to be seen just how stalwart the central bankers will prove in their current course (one they do not seem to have ever planned on, to go by their "stress tests" for the banks, and doubtless follow only with great reluctance). Moreover, there is, just as before, a very great deal of money out there that has to be parked somewhere, and still very few attractive places to put it (this is, for instance, why so much money was poured into those crappy mortgage-backed securities!), while, even as the press remains addicted to calling anyone with a large sum of money or high corporate office a "genius" (Ken Lay was a "genius," Jeffrey Epstein was a "genius," Elizabeth Holmes was a "genius," Sam Bankman-Fried was a "genius," on and on, ad nauseam), they consistently prove to be the extreme opposite, with that going the more for those who so eagerly jump on the bandwagon every time. It matters here that said persons tend to be perfectly okay with the prioritization of producing high share values over the creation of an actual product long central to such scenes ("The object of Fisker, Montague, and Montague was not to make a railway to Vera Cruz, but to float a company"), expect that if worse comes to worst there will always be a greater fool, and believe that if they run into any real trouble they will have the benefit of a bailout one way or the other--usually rightly.
The result is that, even if the money reportedly beginning to pour into artificial intelligence really is a case of people getting excited over an autocomplete function--and I will add, very little prospect of anything here to really compare with the insanity of the '90s dot-com bubble for the time being--I still wouldn't advise betting against there being a good deal of action at this particular table in the global casino in the coming months and years.
Sunday, March 19, 2023
Shazam! Fury of the Gods' Opening Weekend: How Did it Do?
The numbers are in for Shazam! Fury of the Gods' opening weekend--and as it happens the film's box office gross is below even the subdued earlier expectations. The movie has taken in a mere $30.5 million domestically, with the global take $65 million, and the reaction suggesting that (in contrast with, for example, Avatar 2) a not-quite-what-was-hoped-for opening weekend will not be redeemed by strong legs over the coming weeks.
In delivering the news Variety reports that the film's production budget was $110 million+, and the promotional budget (not very surprisingly, given the tendency of the latter to match the former) another $100 million+, indicating over $210 million has been sunk into the movie. The result is that to make that back in its theatrical run the movie might have to take in well over $400 million, and likely more than $500 million--considerably more than the original Shazam made.
By contrast the sequel seems likely to make a good deal less in the wake of the opening weekend's numbers--in the absence of better legs, or stronger international earnings relative to its domestic take than its predecessor managed, finishing up under $80 million domestically, and perhaps not even making $200 million globally.
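A rough sketch of the arithmetic behind those last two paragraphs (the budget figures are the reported estimates; the 40-50 percent studio share of the gross and the 40 percent opening-weekend front-loading are rules of thumb rather than studio data):

```python
# Rough arithmetic for Shazam! Fury of the Gods, using reported estimates and
# rules of thumb rather than actual studio accounting.
sunk_cost = 210e6                                  # ~$110M+ production plus ~$100M+ promotion (reported)
studio_share_low, studio_share_high = 0.40, 0.50   # assumed studio cut of the theatrical gross

breakeven_high = sunk_cost / studio_share_low      # ~$525M needed if the cut is 40 percent
breakeven_low = sunk_cost / studio_share_high      # ~$420M needed if the cut is 50 percent
print(f"Break-even gross: roughly ${breakeven_low/1e6:.0f}M-${breakeven_high/1e6:.0f}M")

# Trajectory implied by the $30.5M domestic opening, if ~40 percent of the
# domestic total arrives on opening weekend (a common pattern for front-loaded sequels).
domestic_opening = 30.5e6
implied_domestic_total = domestic_opening / 0.40   # ~$76M, i.e. "under $80 million domestically"
print(f"Implied domestic total: about ${implied_domestic_total/1e6:.0f}M")
```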
Even allowing for the studio's having laid out less than the full sum that went into making and marketing the picture, and there being revenue besides the box office gross, Shazam 2 looks a long way from breaking even, never mind succeeding by any other measure. Indeed, its numbers can make it seem as if Ant-Man 3 did not do so badly by comparison--though in considering any such possibility one has to remember that that movie may have cost twice as much to make and market.* Additionally, where Shazam 2 can seem a matter of Warner Bros. (to use the terminology of television rather than film) "burning off" the last "episodes" of its DC Universe "show" prior to the much-ballyhooed James Gunn-led overhaul of its management of the franchise, Ant-Man 3 was to be the start of Marvel's Phase Five--which Disney/Marvel desperately needs to go very well after the disasters of the last three years (the pandemic's battering of the movie business generally, Marvel Phase Four's underperformance, the losses amassed in the foolish splurging on streaming, etc., etc.).
The result is that these two releases, which came just a month apart, testify to crisis in both of the great comic book film adaptation universes--and in the larger industry that has become so dependent on them.
* This weekend Ant-Man 3 took in about $4.1 million (another more than 40 percent drop) and seems to me still on track to finish up with about $220 million or less.
Saturday, March 18, 2023
Just How Well Would Shazam 2 Have to Do to Be a Winner?
What would it take for Shazam 2 (Shazam! Fury of the Gods) to be a hit?
None but the insiders can say that about this or any other film with any great precision given the opacity of film budgeting and revenue. Yes, we are often given figures for production budgets--but these both overstate and understate the expenditure. Dubious accounting can be used to either end--overstating a budget to make sure "there is no net," understating it to make sure there is net enough for bonuses to the principals. Also on the side of overstatement one may add that as a practical matter government subsidies and product placement often cover part of the cost; and on the side of understatement, the bill for publicity--which in this utterly insane attention economy tends to match the cost of production.
My response to this is to play it safe and assume that, whatever the reported production budget, roughly twice as much was spent getting the movie made and sold by somebody.
Meanwhile there is the matter of the revenue. We hear that a movie "grossed" so much. But that is the total in ticket sales, much of which may not go to the studio. Theaters keep their share of the revenue (typically getting a higher cut the longer they run a movie), while where overseas earnings are concerned there are foreign distributors to think of--all as the details of particular deals vary. (For instance, a studio with what everyone thinks is a sure-fire winner may demand an exceptionally large share of the opening weekend sales.) No one knows how the staggering number of not-so-little deals entailed in getting a movie screened all over the world will wind up working out until they actually do work out. Still, the studio usually ends up with the equivalent of 40 to 50 percent of the gross.
Of course, there are other, non-theatrical revenue streams. Like merchandising. And TV rights. But the value of these is often related to the expectations, and actuality, of a film's theatrical run--the merchandise for a movie no one liked not likely to sell very well. Moreover, if those revenues are often hugely important, as the straight-to-streaming experiments of the last few years demonstrated there is little that can compete with a $20 ticket for cash-in-hand. The result is that a profitable theatrical run generally remains the standard of judgment, with the movie that (these days, at least) only covers its cost after years of money trickling in from other sources understandably deemed a financial failure.
So, back to the 40-50 percent figure. According to my rule of thumb, that 40 percent of the gross should equal twice the production budget for the movie to break even--which is to say that the global gross should come to roughly five times the budget.
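Put as a formula, the rule of thumb looks something like the following sketch (my own rough heuristic, again, not anything resembling real studio accounting):

```python
# A minimal sketch of the break-even rule of thumb described above.
# Assumptions: total spend (production plus marketing and the rest) runs to
# roughly twice the reported production budget, and the studio keeps roughly
# 40 percent of the worldwide gross.
def breakeven_gross(production_budget, spend_multiple=2.0, studio_share=0.40):
    """Gross at which the studio's share covers the estimated total spend."""
    total_spend = production_budget * spend_multiple
    return total_spend / studio_share

print(breakeven_gross(100e6))  # 500000000.0 -- i.e. five times the production budget
```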
With Shazam 2 we have a movie with a budget reported as $100-$125 million. That could see the movie turn profitable with $400 million taken in (or even less, depending on how it was financed). However, it would seem safer to call the movie profitable only if it crosses the $600 million mark.
Does it have much chance of that? Well, the original Shazam took in $366 million in early 2019--about equal, going by the Consumer Price Index, to $430 million at last (February 2023) check. Should the new movie merely match the original's performance it might just make it past the lower threshold--but I suspect it would still fall short of really being a success.
Meanwhile the actual expectations for the new movie's gross are . . . somewhat lower. The current Boxoffice Pro prediction for the opening weekend is $35 million. Should the film, like many a sequel of this type, take in 40 percent of its money on opening weekend, it would have a hard time grossing $100 million in North America. Should the balance between domestic and international gross approximate that of the original, one would see the movie make perhaps $250 million--rather less than its predecessor globally as well as domestically. And it is far from implausible that it could do less business than that.
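A back-of-the-envelope version of that projection, again just a sketch: the $35 million figure is the Boxoffice Pro prediction cited above, while the 40 percent opening-weekend share and the assumption that a bit under 40 percent of the global gross comes from North America (an approximation of the original's split) are guesses for illustration rather than reported data.

```python
# Hypothetical projection from the predicted opening weekend; all shares are assumptions.

opening_weekend = 35e6   # Boxoffice Pro's predicted North American opening
opening_share   = 0.40   # assumed share of the domestic total earned on opening weekend
domestic_share  = 0.38   # assumed domestic fraction of the global gross

domestic_total = opening_weekend / opening_share   # about $88 million domestic
global_total   = domestic_total / domestic_share   # about $230 million worldwide

print(f"Projected domestic gross: ~${domestic_total / 1e6:.0f}M")
print(f"Projected global gross:   ~${global_total / 1e6:.0f}M")
```

Either way the result lands far below the $400-$600 million break-even range sketched above.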
The result is that, while there was probably never a very good chance of a Shazam 3 given the big shake-up at DC, I doubt these numbers will tempt the executives at WBD to give the franchise a third shot any time soon.
Comparing and Contrasting Box Office Underperformance: Black Panther 2 vs. Ant-Man 3
When looking over Black Panther 2's box office numbers back in the autumn I argued that there was a case for regarding the film as a commercial failure--it had made, in inflation-adjusted terms, only half what its predecessor had, while given its big budget (reported as $250 million) it may have fallen short of covering its cost by the end of its theatrical run.
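As a quick check on that "half its predecessor" claim, one can run the numbers as follows--with the caveat that the original Black Panther's roughly $1.35 billion global gross and the CPI index levels used here are approximate figures from memory rather than anything taken from that earlier post.

```python
# Rough inflation-adjusted comparison; all inputs are approximate.

bp1_gross_2018 = 1.35e9    # original Black Panther's approximate worldwide gross
bp2_gross      = 0.83e9    # sequel's approximate worldwide gross
cpi_2018       = 251.1     # approximate annual average CPI-U, 2018
cpi_late_2022  = 297.7     # approximate CPI-U, late 2022

bp1_adjusted = bp1_gross_2018 * (cpi_late_2022 / cpi_2018)
print(f"Original, in late-2022 dollars: ~${bp1_adjusted / 1e9:.2f}B")    # ~$1.60B
print(f"Sequel as a share of that:      {bp2_gross / bp1_adjusted:.0%}") # ~52%
```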
While writing for the Internet tends to have a very short half-life, I have seen some of my posts here go on steadily accumulating views (and even comments) for years, and others get very little attention after their initial appearance. Comments on "this weekend's" box office, to no one's surprise, tend to go in the latter category. The result was that I was surprised when that particular item had a significant uptick in views months after the fact. Perhaps this was a function of the fuss over the Oscars (which left some feeling that Angela Bassett's performance in that movie was snubbed when Jamie Lee Curtis took home the statue). However, it also seems likely that this is a matter of how the next Marvel movie, Ant-Man 3, became an object of a similar "hit or flop?" argument--a more conspicuous one.
It seems to me that one can say that neither movie's performance was all that its backers and fans hoped for, given the expectations ordinarily attaching to Marvel releases, the particularly strong reception for the original Black Panther, the resources invested in the movies--and, of course, the feeling that Marvel was overdue for a "win." However, people seemed more inclined to say this in the case of Ant-Man 3--a matter, the culture wars aside, of the raw numbers. Black Panther 2 took in over $400 million domestically and $800 million globally--numbers not ordinarily associated with the word "flop"--with, again, my suggestion that it could be seen as a failure resting on the contrast with the extreme success of the first movie and the extremely high cost of the second. By contrast Ant-Man 3 (not very much less expensive than Black Panther 2, budgeted as it is at $200 million) has, after a month, barely scored $200 million in North America, while its prospects of reaching the half-billion-dollar mark are in doubt. Again, those numbers are not ordinarily associated with flops, but, helped along by a weaker-than-hoped-for second weekend, they more easily let the media commentariat come to that conclusion.
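For a rough sense of why both can be called underperformers despite the gap in their raw grosses, one can put them through the same break-even rule of thumb used in the Shazam 2 post above--with the budgets being the reported figures cited here, the grosses approximate, and the rule itself, again, only a guess.

```python
# Break-even rule of thumb (total spend ~ 2x production budget, studio share 40-50
# percent of the gross) applied to both films; budgets and grosses are approximate.

films = {
    "Black Panther 2": (250e6, 800e6),   # over $800 million globally
    "Ant-Man 3":       (200e6, 475e6),   # short of the half-billion mark
}

for name, (budget, gross) in films.items():
    needed_low  = budget * 2 / 0.50      # if the studio keeps half the gross
    needed_high = budget * 2 / 0.40      # if the studio keeps 40 percent
    print(f"{name}: ~${gross / 1e6:.0f}M grossed vs. roughly "
          f"${needed_low / 1e6:.0f}M-${needed_high / 1e6:.0f}M needed to break even")
```

By that standard both fell well short, Ant-Man 3 by proportionally more--which goes some way toward explaining why the "flop" label stuck to it faster.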
Ant-Man 3's Underperformance and the Trajectory of Marvel
In the view of the analysts I have read the disappointing box office performance of Ant-Man 3 was a measure of the viability not just of a relatively minor superhero movie but of the bigger Marvel Cinematic Universe of which it is a part--with the disappointment the more significant because of how the franchise has fared in recent years. Where the MCU once went from strength to strength, all the way up to the two-part confrontation of its assembled heroes with Thanos (a two-film, $5 billion-grossing event whose second part became the "highest-grossing film in history"), it has been very different with the subsequent "Phase Four." Certainly its problems (and particularly those of its first three movies--Black Widow, Eternals and Shang-Chi and the Legend of the Ten Rings) have been partially attributable to the pandemic, and at the international level, also to Phase Four's shutout from China's market. However, it is undeniable that the content itself has been an issue, with anything after Phase Three's build-up to and realization of the battle with Thanos an anticlimax--no "going bigger" really being plausible here--while there was a pervasive sense of the pursuit of diminishing returns, be it in the turn toward prequels with Black Widow, toward more obscure characters in Eternals, or the all-too-characteristic weariness and tonal confusion of a series gone on too long in Thor: Love and Thunder--even before one considers such awkwardness as Disney's excessive eagerness to tie its feature films to its small-screen streaming series in Dr. Strange 2, or the presentation of a Black Panther sequel without . . . Black Panther. The result was that only the Spider-Man sequel--a one-of-a-kind affair in its drawing together the stuff of three separate big-screen Spider-Man series into one sprawling "multiverse" narrative--really felt like an event. And unsurprisingly, only that one performed really "above and beyond" at the box office (in fact becoming the movie that, with almost $2 billion banked globally, proved "filmgoing is back").
The result has been that even the claqueurs of the entertainment press no longer applaud Marvel so loudly as they once did, instead admitting that all is not well. One could see this as setting up audiences for a story of fall and redemption ("Hooray for Phase Five! Hooray for the return of Bob Iger! And boo Bob Chapek and Kevin Feige! Boo!"). However, Ant-Man 3 never looked to me a particularly likely source of redemption (even if it succeeded it was a long shot for the kind of $2 billion+ success that would really make people say "Marvel is back!"), and, again, it has only contributed to the sense of a franchise in trouble.
Of course, one should also qualify that. Before the (justified) shift in the view of Ant-Man 3 from hit to flop the film did have that strong opening weekend, testifying to the continued readiness of a significant audience to come out for even a second-string Marvel movie. The drop-off after that first weekend suggests that they were less than impressed with what they saw--and that rather than people giving up on Marvel, Marvel is simply doing a poorer job of satisfying the expectations it raises, such that audiences will in time be less likely to come out that way once more. After all, as we have been reminded time and again, no franchise is bulletproof, with Star Wars now reminding us of the fact--and Marvel, which in its dominance of blockbuster filmmaking for so many years looked like Star Wars in its heyday, perhaps beginning to remind us of Star Wars in its now conspicuous and advanced decline.