As of this writing we have seen the box office gross for Captain America 4 over not just the first but the second weekend of its release. The movie's opening was within the range of common expectation, but unspectacular--some $89 million for the Friday-to-Sunday period, and then $100 million with the subsequent Monday added in.
Compare that with what Ant-Man 3 took in exactly two years earlier over the first three days of its release, and over the four-day weekend as a whole--some $106 million over the first three days, and $120 million over the first four. Adjusting for inflation, Captain America's grosses were, for both periods, down about a fifth from what that movie did, with all that implies for the film's trajectory, even though one would conventionally expect more people to come out for Captain America than for Scott Lang.
However, an opening weekend tells us only so much. Even a weak opening may be redeemed by good week-to-week holds--and this was where Ant-Man 3 really suffered. The movie's gross fell 70 percent from the first weekend to the second--a number generally recognized as terrible, and which did indeed prove a sign of things to come, with the movie ultimately barely doubling the gross of its first three days over its entire run (instead of multiplying it by two-and-a-half or even three times in the way so many Marvel movies have done).
And in spite of a less than wholly enthusiastic audience response there was some expectation that Captain America 4 would do better on this score, with the folks at Boxoffice Pro anticipating a second-weekend drop of just 50-60 percent, and with it a gross of $37-$44 million.
Alas, the sub-Ant-Man 3 opening has been followed by a weekend-on-weekend fall in the take about as bad as Ant-Man's, the currently reported $28 million gross working out to a 68 percent drop. It also works out to a ten-day take of $141 million--which, again, is about a fifth lower than Ant-Man's inflation-adjusted ten-day gross (the $177 million its $167 million would be in today's dollars).
It therefore does not seem unreasonable to anticipate Captain America following an Ant-Man 3-like trajectory from an initial take lower than Ant-Man's, leaving the movie one-fifth down from Ant-Man's final gross. With that final gross some $225-$230 million in today's terms, think a finish for Captain America in the North American market of about $180 million. Even should the film do a little better--should it, by grace of better holds, prove to have Guardians of the Galaxy 2-like legs, for example--it would still only match Ant-Man 3, not best it. Meanwhile there seems little chance of rescue by the international market, which may not do more than match the North American gross.
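For readers who care to check the back-of-envelope math, here is a minimal sketch in Python of the arithmetic above, using the rounded figures quoted in this post (the ~6 percent cumulative inflation factor for 2023-2025 is my assumption for illustration, not an official statistic):

```python
# Back-of-envelope box office arithmetic (rounded figures from this post).

def to_todays_dollars(amount_millions, cumulative_inflation):
    """Convert a past gross into today's dollars."""
    return amount_millions * (1 + cumulative_inflation)

# Ant-Man 3's ten-day gross (February 2023), assuming ~6 percent
# cumulative inflation over the two years since.
ant_man_ten_day = to_todays_dollars(167, 0.06)        # ~177

# Captain America 4's second-weekend drop from $89M to $28M.
second_weekend_drop = (89 - 28) / 89                  # ~0.685 on these rounded figures

# A finish one-fifth below Ant-Man 3's inflation-adjusted final gross,
# taken here as roughly $227.5M (the midpoint of $225-$230M).
projected_finish = 0.8 * 227.5                        # ~182, i.e. ~$180M

print(f"Ant-Man 3 ten-day gross, today's dollars: ${ant_man_ten_day:.0f}M")
print(f"Second-weekend drop: {second_weekend_drop:.1%}")
print(f"Projected North American finish: ${projected_finish:.0f}M")
```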
The result is that the gross of $400-$500 million, which I suggested as the likely range for the film's global gross last month, seems eminently plausible.
Still, in considering the consequences of the movie's gross for Marvel's bottom line, one should note that even as the grosses have fallen, so has the reported figure for the film's budget (?!). In 2024 the press for the film held that the budget for the production (the actual making of the movie) was $350 million or even more, which was what I went with in my initial discussion of the film. In late January, however, it became commonplace to say that this was a gross overstatement, and that the film really cost just a little over half that, some $180 million--a claim that, if true, profoundly changes the possibility of the movie turning a profit, or at least getting to the break-even point. But is it true? Most of the press has passed on the word uncritically. However, some are skeptical (World of Reel's Jordan Ruimy, citing the authoritative reports of Joanna Robinson and Dave Gonzalez, calls the lower figure "too silly to believe"), while the imprecision of the discussion leaves at least some room for reconciling the different figures. (Consider, for example, the distinction drawn by Deadline where it presents "production" and "interest and overhead" on separate lines, even though the latter are undeniably part of the bill--all as it is far from impossible that an initially intended $180 million production's cost exploded during the much-talked-about reshoots.) Moreover, Marvel has had a history of flat-out underreporting its outlays for movies, and then having the facts given away by, for example, the documentation entailed in its claiming government subsidy for a production (as seen on the many occasions when it held out its hand for a rebate from the United Kingdom). Perhaps we will see such a revelation down the road in the case of this film, perhaps not. Should it really prove to be a $180 million movie, and the backers not have spent too much on promotion and the rest, the movie would still have some chance of at least breaking even with the help of the post-theatrical revenue streams.
Whether the gross is very promising for the subsequent Marvel movies lining up for their chance at the box office, however, is another matter entirely.
Thursday, February 27, 2025
What the YR4 Drama Tells Us About Techno-Hype
The past several weeks have seen the danger posed to Earth by Near Earth Objects return to the news in connection with what, for a short time, seemed an exceptional level of risk posed by the asteroid 2024 YR4 (scientists briefly estimating an unprecedented 3 percent chance of the possibly "city-killing" object hitting Earth in 2032, a figure since fallen to a 0.0005 percent level of risk).
I suspect that most of the public will forget all about YR4 in short order, as the whole issue it represents passes out of mind. Still, the prospect of a real threat was sufficiently well-publicized that some asked what could be done were the threat of a collision with such an object to prove genuine--and got an answer I suspect many found disappointing, namely "Not much." The reason was that successful execution of a deflection mission might require as much as a decade's lead time, such that with an impact due in 2032 the "window of opportunity" might already have closed.
As is commonly the case when the "experts" tell the public that a very desirable thing cannot be done, it seemed that the misanthropes and pseudo-intellectual snobs who people the web's fora had a big laugh at the expense of those who simply do not appreciate the complicated, exacting and unforgiving nature of such a mission, sneering at those who get their notions of technical feasibility from Michael Bay movies (perhaps the more so in that this was not exactly new information).
Still, it seems to me that those disappointed to learn this have better grounds for their disappointment than the mockery would suggest. After all, the threat is one that has been well-known to the public for decades. They are told of the existence of "Space Forces," and even of agencies devoted to planetary defense specifically. They are well aware that their governments, whose spokespersons endlessly tell them "There's no money" for anything that might improve their daily lives as they inflict austerity without cease upon them, officially spend well over $2 trillion a year on what is (usually euphemistically) called defense, and less officially much more, with much of the money going to programs that seem relevant--devoted as they are to watching the sky, putting things into space, making big bangs. And of course, they are always being told about "INNOVATION!" which will "CHANGE EVERYTHING!"--consider the miracles of space access and space development people were told "NewSpace" would accomplish simply because the "entrepreneurs" whom the George Gilders of the world treat as God on Earth were now on the job.
On the basis of all that it is not entirely unreasonable to expect that the capacity to protect the Earth against existential risks of this nature might at least be making greater strides than has been seen to date--and indeed to see in the "actually existing" level of planetary defense capability something not only disappointing, or worrying, but an indictment of the status quo, in a way that makes worthwhile another mention of disaster movies like Armageddon. It is certainly true that such works have misled the public by giving them the impression that our technology is far more advanced than it really is, our means for generating needed new technologies more swiftly responsive than they really are, and above all the "entrepreneurs" more willing and able to "make it happen" than they really are. (Contrary to the NewSpace fantasy so dear to a particular kind of strident web-permeating ideologue, business and government have, in spite of a few interesting developments like rockets that land themselves on their launch pads Golden Age sci-fi-style, mostly been retreading old ground, even looking to reconstruct capabilities we had a half century ago--exemplary of which is the struggle to build a new booster merely matching the lift capability of the Saturn V!) However, there is a still worse way in which such depictions have misled them, namely in making them think that the officialdom which oversees their lives consists of essentially intelligent, conscientious people sufficiently earnest about keeping the world in one piece that they can be counted on to rise to the challenge when presented with such a crisis. As government response to climate change, pandemics and much, much else shows--not least, the problem of planetary defense--that image could not possibly be further from the reality.
Christopher Nolan's Odyssey Movie: A Few Thoughts
After the commercial (and critical) triumph that Christopher Nolan enjoyed with Oppenheimer, which was as colossal as it was unlikely-seeming, there was quite some buzz about what project Nolan would take up next.
I suspect few expected that the next Nolan movie would be an adaptation of Homer's The Odyssey--the actual production of which has been confirmed by a much-publicized "first look" this past week.
Frankly I'm a bit disappointed by the revelation. With Oppenheimer Nolan used the weight that very few artists within Hollywood have to make a very serious movie about serious subjects (Big Science, nuclear war, the anti-Communist hysteria and political persecution of the Cold War period) in a way we don't see much these days, in part because any noticing that there is such a thing as society, and that it may not be perfect, is so little tolerated--and one might have hoped that he would use the even stronger position the movie's success gave him to do so again.
However, if at this point so little has been said about the Odyssey movie that the interested have little to do but discuss Matt Damon's anachronistic-looking helmet, in choosing to make this particular film Nolan can seem to be turning away from that kind of challenge, perhaps retreating into the "safer" territory he has so long occupied, of myth and psycho-babble (we got plenty of that with Batman and Inception), and history-as-spectacle, Ridley Scott-style (such as we had with Dunkirk).
For the record I hope to be wrong about that.
Should There Even Be a Bond 26?
For many years what has passed for "news" about the James Bond franchise in the entertainment press has been a mix of stale gossip, banal commentary (what somebody you never heard of thinks they should do with the next Bond film), and baseless speculation about what the next movie may be like--clickbaity pseudo-news keeping "James Bond" in the public consciousness while obscuring the fact that the franchise has been effectively moribund for many years now.
However, February 2025 did bring a genuine piece of news about the franchise, namely that "creative control" of the Everything or Nothing (Eon) film series (which, excluding such marginal oddities as the eventually MST3K'ed Operation Kid Brother, made all the Bond feature films save for Charles K. Feldman's spoof Casino Royale and Jack Schwartzman's Thunderball remake Never Say Never Again) has passed from the Broccoli family, with whom it had rested since the series began, to Amazon.
What I have seen of the coverage of the event from the more established quarters of that press reports the fact rather than opines about it--perhaps not unreasonably, given that nothing has been said of just what Amazon would be likely to do with that control. Yet, as anyone with any brain function whatsoever would guess, much of the attentive part of the public has been less than delighted with the news, the web quickly becoming awash in recollections of Amazon's thus far highly divisive handling of the Rings of Power series and in cracks about the James Bond franchise being treated in the same manner. ("Here comes the 'James Bond universe'!" for instance, goes one of the less offensive variations on the theme.)
However, the fact of the passing of creative control from one group to another seems to me secondary next to the more fundamental fact that there is really nowhere left to go with the James Bond franchise, which has become superfluous in the pop cultural landscape of today in the way that such creations inevitably do. Once again, from the standpoint of pop cultural history James Bond mattered for two reasons:
1. James Bond did not represent the invention of the fictional spy as "international man of mystery," but rather an update of the Duckworth Drew-Bulldog Drummond tradition of espionage adventure for a new generation. Ian Fleming brought the then half-century-old and fairly shopworn type into the world of the 1950s, while the film series gave him a makeover for the 1960s (one that, it might be added, made him salable to a far, far broader audience than the one Fleming aimed for when he sat down to write Casino Royale). Thus does that type of character and his associated adventures remain familiar to us today.
2. In providing that update the film series also did much to shape, if not invent, the action movie, and action-adventure franchise, as we know it. Indeed, if most people think of "James Bond" and think "spy," what really made the films about this spy stand so tall at the box office over his predecessors, contemporaries and imitators was that moviegoers thought "James Bond" and thought "action," the films offering a kind and scale of action-adventure spectacle that Hollywood would not even show any understanding of until Star Wars, or seriously compete with until the 1980s.
Both of those contributions are undeniably significant, but the series made them over a half century ago at this point, and what was unique to the Bond series has become commonplace. After all, not only are we positively saturated in action movies now, but in the '90s Hollywood began cranking out big-budget action-adventures starring spies specifically, with considerable box office success, as seen in the familiarity of Jack Ryan, Harry Tasker, Ethan Hunt, Jason Bourne, Xander Cage, Dom Toretto and company, and Nick Fury and the rest of the Marvel universe's many secret agents to moviegoers. Since then the continuation of the series has sold tickets on the strength of the "brand name" the series established in its earlier day; viewers' nostalgia for the more memorable moments that have only grown rarer with the years; those occasions when it engaged fans with a more successful reuse of its well-known elements; and the continuous updating of the surface gloss, in a way that has seen the series-runners following rather than leading as they did in their heyday (a tendency already evident in 1973's Live and Let Die and its use of "blaxploitation"). Indeed, in its imitativeness the film series has often seemed on the verge of losing everything that gave it any distinction at all (as with 1989's Licence to Kill, which looked and felt like a standard "Someone Killed My Favorite Second Cousin" '80s action movie down to the extra-brutal violence that earned the first cut an R rating--and, less popular as this view may be, as with the ballyhooed reboot of the twenty-first century).
If one accepts all that then the right thing to do would be to grant that the series has had a good run and gracefully set it aside--but that, of course, is not how movieland works. The vulgar Dauriats, after all, are in even more complete command of the world of film than they are of the world of publishing, where, as Balzac wrote, the monument a writer "reared with . . . life-blood" is nothing more than "a good or a bad speculation" for a businessman concerned only with how much money he can make, how quickly, with as little effort or risk as possible--all as, alas, the businessmen in question are not necessarily very good at even that.
As it happens the state of the market is not very attractive to anyone seriously thinking about putting out a new Bond film. The reality is that, even if the biggest blockbusters make as much as they ever did, the reduced level of moviegoing compared with even a half decade ago means that, with only so many ticket sales to go round, there are more flops--the record of 2023 very instructive here, 2024 less striking that way mainly because the release slate was so thin that year, and 2025 looking very much as if it could be a repeat of 2023 (the weak January already behind us, and the lukewarm reception of Captain America 4, only affirming such a reading of the situation). Moreover, the Hollywood spy movie boom that began in the '90s gives every impression of unraveling as the latest Mission: Impossible and Fast and Furious films underperform, just as the last Bond movie did--North American audiences, certainly, not having much time for No Time to Die, the weakest performer in that market since 1989's flop Licence to Kill, with, especially worrying here, analysis of the demographics of those who did buy tickets leaving ever less doubt that the Bond franchise was surviving on the interest of an aging fan base, the young scarcely aware of it and even less interested.
Indeed, all this has likely played its role in delaying the franchise's continuation to date. Still, where a franchise has been successful in the past studio executives are very reluctant to give it up--so much so that they have to lose a lot of money, and that maybe more than once, before they so much as let a franchise lie fallow for a while--and if its real glory days are far behind it, no franchise has been as successful for as long as the Bond films. At the same time this tendency is reinforced by the fact that never have such decision-makers been so flattered by their courtiers, with the predictable result an increase in their stupidity and arrogance; that so many have been prepared to grade the performance of movies at the box office on a curve, and indeed pass off money-losing flops as great hits; and that there is such unprecedented hostility to taking chances on new material, still more to revising the model for pursuing cinematic success in the way that the tightened market and the exhaustion of old formulas seem to demand. The result is that, even if we do not see another Bond movie for many, many more years, there is no chance of those who call the shots gracefully letting 007 go.
The Prospect of a Sixth Jason Bourne Movie
I was surprised to hear of plans for a "Jason Bourne 6" some time ago.
Little has been heard of the project for some time, and it may be that the movie is at this point stuck in development hell. But that there had been any seriousness about the project is in itself surprising--at least, if one overlooks the extreme franchise addiction of the studio executives.
After all, it has been almost a quarter of a century since the first Jason Bourne film, and now nearly a decade since the last entry in the series, 2016's Jason Bourne--all as the franchise hit its commercial peak way back in 2007 with the third film, The Bourne Ultimatum (not incidentally, also the logical ending of the series), with all that implies for the franchise's vitality and cachet. Indeed, at this point in a series' life it is common for its runners to be thinking "reboot" rather than "sequel," and the producers have not gone that course only because the attempt at a reboot they already made with the fourth film (again, because number three was the logical ending) failed, sending the executives hastening to get Matt Damon back into the series--with, again, all that implies about the shakiness of the series' standing.
However, as if this were not enough there is the reality that in the significantly contracted post-pandemic market franchise action movies offer more risk and less return than they used to, and the Bourne movies are not particularly strong contenders here. At the box office they were respectable earners rather than spectacular ones, The Bourne Ultimatum taking in a mere $443 million (not quite $700 million in January 2025 dollars), and the last film, Jason Bourne, grossing about $550 million in today's terms (such that it did not make Deadline's list of the year's top twenty profit-makers), leaving the margins less impressive than they might be. Meanwhile the boom in Hollywood spy-fi that got underway in the '90s seems to have run its course--the last entries in the James Bond, Mission: Impossible and Fast and Furious franchises have all underperformed so badly as to give the impression of those franchises sputtering out altogether, with the Jack Ryan franchise spared that fate mainly by its shift to the small screen as an Amazon series years earlier (after, again, another underperformance by the 2015 film that represented a third reboot of the series).
The result is that a Bourne 6 would be a very, very big gamble today. That doesn't mean it won't happen--after all, Gladiator 2 happened (and went on to validate the predictions of the skeptics)--but the road to an actual film is likely to be long and rocky at best, and the receipts disappointing, the result altogether, if on a more modest scale than Gladiator 2, a reminder that studio executives' habit of making losing bets on such films has absolutely nothing to do with their being slaves to consumer demand.
The Upcoming The Mandalorian & Grogu: Thoughts
It seems that after many, many false starts Disney-Lucasfilm will finally be getting a new Star Wars film into theaters in May 2026--six and a half years after the last Star Wars movie (The Rise of Skywalker), and almost exactly eight years after the last really comparable movie, the side story Solo.
Not much is known about the movie, bar its being an obvious extension of the flagship TV series of the now rather sprawling small-screen side of the franchise. Still, one thing about it caught my eye, namely the budget: $120 million. By no means low, that is also far from bank-busting for a Star Wars feature film, with Solo again a useful point of comparison. That film cost $250 million to produce back in 2018--the equivalent of about $315 million in today's money after we adjust for inflation. The Mandalorian film is thus running Disney roughly a third of what the earlier movie did in real terms, and the more limited nature of the investment (likely to be complemented by a lower outlay for publicity) seems to me relevant given the way the film market is going. Rather than putting up a tentpole and expecting "everyone" to show up for it, Disney may be aiming for the more limited portion of the audience that it might hope would really care about a Mandalorian movie--the project a limited investment in a limited but relatively enthusiastic market just big enough to let the studio turn a profit on a "low" budget. If so then whatever the film ultimately proves to be artistically, there is at least some soundness to the logic, this seeming to me exactly how movie producers need to play the game much more often if they are to keep making money at it.
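For those inclined to check the real-terms budget comparison, a minimal sketch in Python, using the rounded figures quoted above (the ~26 percent cumulative 2018-2025 inflation factor is an assumption chosen to reproduce the ~$315 million figure, not official data):

```python
# Real-terms comparison of the two Star Wars film budgets quoted above
# (rounded figures from this post; inflation factor is an assumption).

solo_budget_2018 = 250        # millions of dollars, as reported for Solo
cumulative_inflation = 0.26   # assumed 2018 -> 2025 price change

# Solo's budget restated in today's money.
solo_budget_today = solo_budget_2018 * (1 + cumulative_inflation)  # ~315

mandalorian_budget = 120      # millions of dollars

# The new film's budget as a share of Solo's, in real terms.
ratio = mandalorian_budget / solo_budget_today                     # ~0.38

print(f"Solo's budget in today's dollars: ${solo_budget_today:.0f}M")
print(f"The Mandalorian & Grogu as a share of that: {ratio:.0%}")
```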
What Does the Release Slate for 2026 Say About How Hollywood is Handling its Crisis?
As of February 2025 the release slate for 2026 is far from complete. Still, anyone who might have thought that the shrinking market for theatrically released films, and the signs that audiences have been getting fed up with Hollywood retreads of the same old franchises, would nudge the studios in a different direction (I have put in my two cents about what I think they need to be doing simply from a bottom-line perspective) will be quickly disabused of such expectations by a look at the expected offerings for that year as reported by the relevant Wikipedia page (a page I have found a reliable guide in the past).
It is, after all, a year chock-full of movies in the same genres, from the same franchises. Yes, we have superheroes--with at least three Marvel movies not counting the fourth Spider-Man film, and three movies coming our way from DC Studios--along with a Star Wars movie, a Mummy movie, more Jumanji and Hunger Games, and another crack at the Street Fighter franchise. There is also animation of the familiar types aplenty, with Minions 3, Ice Age 6, Shrek 5, Toy Story 5 and, of course, Super Mario 2 due out, as well as a feature-length animated Cat in the Hat, and, to boot, an animated-to-live-action remake of Moana (less than two years after Moana 2!). And what does not belong to the big action movies and animated extravaganzas is substantially accounted for by horror, also from the same franchises--28 Days Later, Scream, Exorcist and Insidious among them--plus more parody of the same from the Scary Movie series.
If there is any sign of adaptation in the studios' behavior it would seem to lie not in the kinds of movies they make, which reflect their addiction to their long-established (if ever less reliable) model for generating blockbusters, but in the management of the individual projects. While the publicly released data seems spotty, I get the impression that the studios, if still "big spenders" rather than "wise spenders" (to turn about a phrase of Tony Blair's), may be getting a little more careful with their money--if not with their expenditure on productions, then at least with their expenditures on promotion and the rest. (Indeed, looking at Deadline's numbers last year, if it was undeniably the case that the year was packed with flops, the numbers for films like Indiana Jones 5, The Flash and Captain Marvel 2 were not quite so bad as I thought they might be given the dismal grosses.) All the same, in finding their savings where they can as they stay an increasingly dubious course they may prove to be "penny wise but pound foolish"--especially as 2025, and now 2026, shape up to look like 2023.
Remembering Barb Wire
With the Pamela Anderson film The Last Showgirl recently getting critical acclaim, and even Oscar buzz (but alas, no actual nominations), it seems a more than usually opportune time to remember her shot at the cinematic big time way back in 1996--Barb Wire.
As one might guess from how long ago that was, and how she hasn't exactly been in a lot of movies since, the film was a flop on its release in May 1996, just before the start of the summer movie season that gave us the original Twister, Independence Day and the first Mission: Impossible movie. Still, Barb Wire did become a cult hit in time, and it is not hard to see why. Even at the time Roger Ebert, in his far from rave review, acknowledged the film's "high energy level, and . . . sense of deranged fun"--testament to just how much effort and craft the makers really did put into a comparatively low-budgeted ($9 million) B-movie, which made for a rather more than passable piece of sci-fi action, assembled efficiently and tightly and with some flair.
Today the movie also holds interest as the kind of superhero film we used to get much more often then than now, and not just for its efficiency and briskness compared with today's lumbering spectacles costing orders of magnitude more. There is, too, its use of material drawn from outside the DC and Marvel A-lists, and indeed from outside DC and Marvel altogether (Barb Wire is a Dark Horse comics character, brought to you by the folks who also gave us The Rocketeer and The Mask); the movie's being thoroughly R-rated (and not just for the copious violence, profanity and ribald humor); and at the same time its being jokey and "meta" enough that I imagine many of those who really remember the movies of the '90s wonder at the fuss over Deadpool. After all, would you believe that for the plot we get a gender-switched pastiche of Casablanca in which Pamela is in the role of Rick, and Boba Fett is in the position of Ilsa, in a Kickpuncher-ish version of the then distant-seeming date of 2017?
Of Jamie Lee Curtis' Oscar "Snub"
When hypothesizing about upcoming movies' box office performance I don't give much thought to whether those movies are "good" or not according to some artistic criterion. As I am speculating about movies weeks or even months away from release there is no way of knowing that--all as, to be fair, cinematic quality according to any standard has an at best shaky relationship to box office receipts.
Likewise when I consider a film's Oscar prospects I do not go by quality (again, the nominations often come out before the general public has had a chance to see the films in question), but rather by the boxes ticked by the film individually and by the ceremony as a whole, given the extreme politicization of the affair--aspects of which have been formalized in this age of "representation and inclusion standards" you can read right on the Academy's own web site.
So it goes as I look at the Academy's "snub" of Jamie Lee Curtis, "failing" to accord her the nomination many (including the folks at Variety) took for granted as coming to her for her role in The Last Showgirl. In fairness the film's makers ticked at least some of the right boxes, and so might have expected such recognition--given the obvious push to build up Gia Coppola as a filmmaker the way the Hollywood community built up her aunt Sofia a generation ago, as well as the desire to recognize the film for its thoroughly Academy-friendly theme. This is all the more the case given that the film failed to make the cut in the other categories, with the "Best Supporting Actor/Actress" prizes in particular fulfilling a "consolation prize" function in such situations. However, there are only so many slots to go around for all the contenders (perhaps not an impressive bunch by many a critical standard, but a robust enough bunch where the Academy's priorities are concerned); Curtis had not just the nomination but the prize itself for her role in Everything Everywhere All At Once just two years ago (in which, not incidentally, her characterization, on-screen impression, etc. were also a pointed contrast with the screen image of her younger years); and Demi Moore's turn in The Substance may have made her a front-runner for Best Actress, with that film's similarity of concern at multiple levels (extending from the situation of those aging while working in show business to the self-referential casting of the lead) in turn pushing the Academy in the direction of doing something else with the Best Supporting Actress prize--leaving Curtis, and Showgirl, empty-handed.
How Did January Go at the Box Office?
As remarked here previously, 2024 was a terrible year for the box office, weaker even than 2023, with this largely a matter of the first half of the year or so, before, from June forward, a number of major hits came to the rescue (the sequels to Inside Out, Despicable Me, Deadpool and Beetlejuice, and in the fall Moana 2 and Wicked especially, with the $2.8 billion collected by those six films representing a third of the whole year's box office gross). January had been especially weak, taking in about 40 percent of the 2015-2019 average in real terms, in part because the slate of holiday releases that normally helps carry the box office in January was so tepidly received--all as the month's new releases failed to draw moviegoers to the theaters. (Thus, with Aquaman 2, the Color Purple remake, the Hunger Games prequel and the rest underperforming horribly, Wonka, with a mere $200 million, proved the holiday season's box office champion, while the month's supposedly most promising new release, the musical remake of Mean Girls that nobody asked for, fell flat.)
The fall season of 2024 having been rather more robust (with four films at least roughly matching Wonka's gross, and Moana 2 and the musical Wicked doubling it), one might wonder if this January did better. It was an improvement, but not much of one. The January 2024 box office take was just a little under a half billion dollars ($496 million), while the January 2025 box office went just a little over that mark--sufficiently little that year-on-year inflation (about 2.9 percent) left it only very marginally better than the prior year's performance, its $544 million total really just a scarce 6 percent improvement in real terms, leaving January 2025 at about 42-43 percent of the 2015-2019 average for the month.
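For those who care to check the arithmetic, a minimal sketch in Python of the year-on-year comparison above, using the figures as quoted in this post (the real change comes out to roughly the "scarce 6 percent" noted):

```python
# January year-on-year box office comparison (figures as quoted above,
# grosses in millions of dollars).

jan_2024 = 496
jan_2025 = 544
inflation = 0.029   # approximate year-on-year inflation

nominal_change = jan_2025 / jan_2024 - 1                    # ~9.7 percent
# Deflate the 2025 figure before comparing, for the real change.
real_change = (jan_2025 / (1 + inflation)) / jan_2024 - 1   # ~6.6 percent

print(f"Nominal year-on-year change: {nominal_change:.1%}")
print(f"Real (inflation-adjusted) change: {real_change:.1%}")
```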
Even granting how little is expected of January this was an inauspicious start to a year which, more like 2023 than 2024 in this respect, will sorely test Hollywood's tendency to "business as usual."
Putting the Computing Revolution into Economic Perspective
For decades it has been common for economists to argue that the contributions of computing to productivity, value creation and growth must be far vaster than the figures suggest--to the point of rendering figures on manufacturing value added and even Gross Domestic Product meaningless as a basis for analysis of the economic trend, on account of their gross understatement of the progress these economists are sure we must be making--the more so, I suppose, because they are mesmerized (or want to be mesmerized) by images of wonders ceaselessly pouring forth from an ever-innovating Silicon Valley.
Alas, that image is at odds with the reality that the change in information technology is profoundly unrepresentative of the rest of the economy, as we see looking at the way our houses are built, our food is produced, our clothes are made--while even in areas where one can make a case for a revolution really being ongoing, like energy (where the rising productivity and plummeting cost of renewables has created enormous potential), Big Oil and the rest have, with the help of their allies across the business world, government and the media, fought back ferociously and very successfully against that "disruption" that "business intellectuals" so love to talk about.
At the same time, even in tech the progress is a far cry from the hype. Aside from the fact that really big developments have been few in recent years (just remember--the iPhone that is the favorite example of this innovation will be old enough to vote this year), there has been a very great deal to offset and limit the value delivered by these machines, whether the issue is their contributions to the productivity of business or the goods they bring to the consumer--sufficiently so as to dispel illusions of vast improvements in product and in life just not reported in the numbers. Below are ten of the more obvious:
1. Computing devices remain incapable of, on an autonomous basis, interacting with the physical world (comprehending it, navigating it, manipulating it with human dexterity, you know, like with a car that actually drives itself), with all that means when we are "living in a material world"--or even coping with situations of less physical kinds requiring great versatility (as you see every time you confront the vileness that companies serve up as customer service). The result is to severely limit the jobs they can do, as the pandemic reminded even the most obtuse--while the great wave of automation of physical and even mental work the pandemic was supposed to bring on has, as of 2025, yet to materialize, looking now like just another piece of techno-hype flogged by the hucksters in Silicon Valley and their courtiers in the media.
2. In most hands, at least, a computer all by itself is little more than a "brick." That is to say that computers' usefulness depends on (besides electricity, of course) computer software, Internet connections and, as a practical matter, outlays for a lot of the things you will hope to do online--all of which, of course, do register in the value added figures. (If you want online grocery delivery, for example, you are expected to pay a fee for the delivery, without which the computer renders no such service.) For comparison purposes, consider an analog-era television set--which so long as it was plugged into the wall could get a signal without the extras. Indeed, it seems plausible that the list of things one can do without paying extra charges is shrinking all the time in a period of ever more pervasive paywalls, decreasing site functionality, consumer exploitation-driven "platform decay," and of course, the general inflationary wave through which all but the super-rich are now suffering.
3. Many of the services that computing devices supply their users are imperfect substitutes for non-digital alternatives--with the disutility involved rarely remarked by the technology's boosters, and often not registering in the statistics. (Ordering your groceries online you are likely to find your selection more limited and the method of ordering more cumbersome and less flexible, while many complain of getting a lower quality of item than they would were they to shop in person.)
4. Where those things that computers can do are concerned there has been a tendency to diminishing returns that makes the "on paper" improvements in technological performance remote from the actual value delivered. Yes, your computer today is hundreds of times faster than those we had in the 1990s. But has it made you hundreds of times more productive as a user of e-mail? I doubt it. And the same could be said of a very great deal else, with all that means for the increased value delivered by the devices, and the productivity they yield--all as the list of really fundamental tasks they enable us to perform has not grown anywhere near so rapidly as the speeds and sizes in the performance specifications. (Indeed, many computer activities are less efficient than before, most obviously because of an increasing burden of computer security, more on which later in this list.)
5. Even when perfectly functional, computers (and the software and Internet connections that go with them) remain far from user-friendly, and highly unreliable--as exemplified by a recent Danish study showing that some 20 percent of users' time is absorbed in dealing with computer problems--with, again, obvious implications for any growth in value added, and for the productivity gains yielded.
6. Computers' performance gains have been substantially absorbed by things that their users would regard as disutilities--specifically software devoted to the surveillance of the user, and the subjection of the user to advertising and other forms of marketing (the computer user does not want ads more "relevant" to their inferred needs; they want to not be spied on or marketed to at all)--and, where not actual disutilities, at best irrelevancies (as with the many features of which almost all users will know and care nothing, but which absorb copious amounts of memory and at times interfere with the functioning of features they do care about, as happened when an update produced problems with BitLocker for many last year).
7. The benefits of computer use go along with the high cost of computer crime--and of the efforts required to combat such crime, reflected in the existence of an over $270 billion cybersecurity industry, and, one might add, in the amounts of users' computing power absorbed by such programs as antiviruses, and the periodic updates of software to patch holes in the systems sold them. This extends to extreme inconvenience for computer users, who as a matter of course are expected to display increasing sophistication in matters of "cybersecurity"--coping with multiple strong passwords, understanding and using multi-factor authentication and passkeys, operating highly imperfect antivirus programs, assessing the risks in the unsolicited e-mail, QR codes and other material with which they are barraged, etc. All of this too has significant costs from the standpoint of not just money and time (which happens to be money, as the adage has it), but well-being, with "cybersecurity fatigue" already a known problem a decade ago, inducing many to make even less use of the Internet than they might otherwise do (avoiding this or that site just to not have to open another account with information the company will inevitably sell or leak, just to not have to deal with yet another wretched password).
8. The advent of the smart phone, which has admittedly been an important innovation, has entailed important trade-offs. Portability has come at the price of usability, with small touch screens no substitute for full-sized keyboards, while it is also the case that the mobility of the devices (with all the risk of damage or loss associated with carrying a little piece of plastic everywhere), and the usage of wireless communications that goes with them in public places, has exacerbated the security problem.
9. Computer hardware and software of all kinds tend to have short product cycles, translating to rapid depreciation--with this, again, detracting from the value delivered, as users are required to expensively replace the worn-out or simply outdated item. A fact that has been acknowledged as contributing to the difference between "Gross Domestic Product" and the less well-known but in many ways more illuminating "Net Domestic Product" (the distinction is sketched just after this list), it is also the case that replacement entails costs beyond simply buying and turning on the new equipment. One has to integrate that piece of equipment into an existing network (indeed, buy a personal computer and you are likely to find this part of the default version of the set-up process), transfer old files into it, and on top of that spend time learning to cope with what is likely to be a different model--and all that, again, the more often for the rapidity of depreciation.
10. Those enthusing over the benefits of computing for the public tend to overlook the still extant "digital divide"--a very large part of the population left out of the benefits of computer use because of poverty or old age (in societies which are both increasingly poor and increasingly elderly), a situation highlighted by the David Cameron government's cynically making the "Universal Credit" system with which it replaced much of its social safety net "online only" (rendering access more difficult for exactly that part of the population which needs the system most, being least equipped to deal with such a system).
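As for the "Gross" versus "Net" Domestic Product distinction invoked in the ninth item, the identity involved is the standard national-accounts one (nothing original to this discussion), with depreciation--the "consumption of fixed capital," in which short-lived hardware and software loom large--netted out of the gross figure:

```latex
\[
\text{NDP} \;=\; \text{GDP} \;-\; \text{CFC}
\]
% NDP: Net Domestic Product
% GDP: Gross Domestic Product
% CFC: consumption of fixed capital (depreciation)
```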
Would I say that these hard facts of the computer age mean computers have not, on the whole, brought net benefits? Even very significant net benefits? I would not. However, when we remember the limits as well as the wonders of the digital revolution, the costs of computing as well as the gains--and the fact that, in line with the calculation of economic growth by methods that make the proverbial divorced cancer patient an economic hero, many of the costs, as with fixing the problems of unwieldy and unreliable machines or the cost of computer security, are registered as gains--the view that computers are somehow yielding radical gains far beyond what is discernible in the economic statistics seems much more questionable.
Of course, in saying that I think it fair that one could imagine things to have been otherwise in a different situation--one where, for example, governments took steps to protect computer users' privacy rather than leaving information traffickers free to monetize their personal information, or businesses bravely pursued such potentially productivity-enhancing measures as telecommuting rather than fighting against them every step of the way (while the free rein of anticompetitive practice in an age in which antitrust is a dead letter, the locust-like short-termism that is the great legacy of "shareholder activism," and an intellectual property regime that has gone utterly out of control, were not all doing their parts to stifle any sort of meaningful technical progress in this sector as in so many others). However, intriguing as that possibility may be, and valid and indeed useful its consideration, we can only speculate about that--and such speculation a whole other discussion than this one, an appraisal of the place of "tech" in the world that we have actually got.
The Politics of the Term "Entrepreneur"
The rather tongue-twisting four-syllable term "entrepreneur" denotes a person who launches a business venture.
Annoying as its pronunciation may be, the term is essentially unproblematic as a descriptor. However, it has become much more than a functional descriptor in recent decades, instead becoming a word to conjure with--and this is no accident, ultra-right economists and their epigones having gone to great lengths to make it so.
Consider the "entrepreneur"-centric vision of the economy they propound ceaselessly, virtually without challenge in the mainstream. The proponents of that vision endlessly direct our attention to the shiny new "startup" (for them, another word to conjure with) and to those who launch them in acts of visionary risk-taking so important that they can say that "never have so many owed everything to so few," exalting them as the only makers in a world of unappreciative and generally worthless takers. Hyperbolic as this may sound, all of it is quite explicitly stated in the work of George Gilder--the Sexual Revolution-and-"women's lib"-hating kulturkampfer turned Reagan White House court intellectual and ultimately would-be prophet of "tech," "Silicon Valley" and the "information age," who did so much to make the current American cult of the entrepreneur with books like his "theology of capitalism" Wealth and Poverty and The Spirit of Enterprise. In the latter, hailing entrepreneurs as "the heroes of economic life," he declared them the ones who "know the rules of the world and the laws of God," who "sustain the world," and expressed the hope that "the entitled children around the world" would "come to see and follow their example and earn their redemption and their happiness, reconciled with the world of work and risk." When he fuses religion and capitalism that way, one scarcely does any mental work at all in drawing the analogy: just as Hegel made the state "God on Earth," Gilder and those who follow him make the entrepreneur into that.
As one might guess from such extreme and blatantly irrational (I suspect many would say, utterly unhinged) rhetoric, the view I discuss here is not a very good guide to reality. Those fixated on startups give us an image of the economy bearing nearly no relation to a scene dominated by long-established corporate giants, whose "technostructures" are what make those companies, and the economic world generally, go round--while the startup scene, where it exists at all, so often amounts to nothing, and frequently less (as seen in the fraud that was "genius" Elizabeth Holmes' once-ballyhooed Theranos). On those rare occasions when companies are launched in the fashion now associated with the startup, and actually develop into something of consequence, one often finds that those who did the launching relied, even at that stage and at the highest level, on the skills of other, less celebrated persons, and frequently many of them--because the job is a big one that only a superhuman would really be able to do by themselves, and because even by ordinary human standards the "entrepreneur" who ends up with all the credit so often brings rather less in the way of technical and organizational skills, business sense and personal effort than they are given credit for (as John Kenneth Galbraith showed in busting the myth of Henry Ford as such a superman). And all that is before one even considers how nothing would have happened without all the rest of their workers down to the plant floor; without the intricately developed system of societal and global market relations that even in the eighteenth century had Adam Smith remarking the thousands of workers who might be involved in the making of a single coat; without the equally societal and global machinery of scientific and technological progress (how often do the "entrepreneurs" actually have any inventions to their credit?); and without the government which furnishes innumerable public goods, and often much more than that, to the successful "enterprise"--all in a way that makes ridiculous Gilder-like claims for entrepreneurial "visionary risk-taking" rather than "optimizing calculation." Empirical examination of how business operates (and plain and simple logic, too) makes it very clear that risky visions are something businesspersons flee from in real life. Instead their "risk-taking" consists mainly of asset-trafficking as they rent-seek and speculate in a global economy turned casino, placing their bets on stock and real estate and other such assets, then unfailingly screaming for bailout by governments which fill their coffers by taxing everyone but them--all while the record where "bringing good things to life" is concerned makes it very clear that government had to bring much to the table to encourage the "entrepreneurs" (an industrial economy doesn't happen without industrial policy), sometimes strong-armed them into entering lines of business (sometimes very gangster-style indeed in its choice of method), and frequently just did the job itself (America's extraordinary World War II defense production happened specifically because government did not wait for the "entrepreneurs" to get their act together, but went out and built the plant itself, leased it to business to produce what was needed, and then sold it off to the private sector at a deep discount)--with all that means for just who was really the maker, and who the taker.
Indeed, it is exemplary of the dissonances between the "Cult of the Entrepreneur" and lived reality that, desperate for anything in the latter that would seem to correspond to their image of the world at all, the propagandists pay such disproportionate attention to Silicon Valley relative to the rest of the economy--and at the same time provide such a wildly distorted image of the development, contributions and standing of American tech. (Cutting government and established business out of the picture, they present the computer age as having all come out of Bay Area garages; exaggerate both the impact of the sector's goods and its macroeconomic contribution in such areas as employment or the trade deficit; constantly overhype its work to produce hyperbolic images of the rate of progress being made as they uncritically pass on the sector's stupid hucksterism; and sell fantasies of its supposedly matchless prowess, punctured again and again by the competition--most recently, by DeepSeek.)
In this context the word "entrepreneur" and its derivatives (like "entrepreneurship") have been one-word summations of this market fundamentalist dogma, their every utterance a repetition and affirmation of that dogma--and, as anyone attentive to the discourse of the times well knows, respectful utterance of the words is ceaseless, making of a fashionable buzzword a holy chant of the market fundamentalist. ("Entrepreneurship, entrepreneurship, entrepreneurship . . . entrepreneur, entrepreneur, entrepreneur.") But few are those who point this out--the utterers of the truth unlikely to get much attention because of how thoroughly they are shut out of the mainstream, and the few occasions when they do get any sort of hearing buried again and again under an avalanche of Conventional Foolishness, which, as with so much Conventional Foolishness, endures because it is so flattering to and convenient for those in power. Thus are businessmen who on close inspection were not just profoundly lucky Josiah Bounderbys but deeply venal incompetents deified--even as we see not only exploding inequality, but technological stagnation and deindustrialization, with all they have meant for our world.
"Personal Responsibility!" They Snarl in Your Face
The term "personal responsibility" can be used in a lot of ways. However, there is a particular usage that grates on a great many nerves.
Consider the flinging of the words "Personal responsibility!" in the face of those who, for example, have lost their jobs in an economic downturn and have every reason to expect an even more than usually hard and long search before they find a new one--an outcome that is never certain, and that often means harder work for lower wages--all as the executives who caused the downturn, insisting on their bonuses even as their companies extort the government for a bailout paid for by the taxes of working people like those losing their jobs, never hear such talk from the deferential legislators before whom they testify.
In that image, with its combination of "bound but unprotected/protected but unbound" inequality and sanctimonious callousness in a situation where the fortunes of most are not reducible to individual morality, we have the essence of that utterance, and the reason it grates. Here "Personal responsibility!" means leaving individuals to cope entirely on their own with systemic failures in which they had no say (climate change, pandemic, the insanities attendant on hyper-financialization, the ever-worsening nightmare of personal cybersecurity, etc., etc., ad infinitum and ad nauseam), because those who have power derive short-term benefit from the situation and thus have no interest in altering it whatsoever--indeed, every interest in resisting any alteration whatsoever--with the communication of this the nastier for the way the attack on those who would criticize the situation is dressed up as morality by those too stupid to know the difference between "morality" and "moralizing," or taking too much pleasure in striking poses of superiority as they say mean and hurtful things to others to respect that difference.
What Makes a Victim "Worthy" in the Eyes of the Media?
In discussing the "worthiness" the media accords the suffering of some victims as against others, many point to criteria like how far away the victims are. However, such criteria cannot and do not account for everything, as Ed Herman and Noam Chomsky showed in Manufacturing Consent by selecting for comparison two very similar incidents--in each case the murder of a civilian, and indeed specifically of a Catholic priest who had become politically active, by forces of state repression. One such case was in Poland, the other in El Salvador--and going by their attempt to measure the quantity and quality of the coverage the two incidents received, Herman and Chomsky determined that a priest killed by state personnel in (Communist) Poland was, as victims go, at least a hundred times as "worthy" as a priest killed by state personnel in (anti-Communist) El Salvador in the eyes of the media. What made the difference from the standpoint of the "Establishment" media was whether coverage of the incident supported or called into question the prevailing ideology, and with it served or hindered the maintenance of public support for the prevailing policy--the victim whose suffering's publicization serves the interests of authority being a worthy victim, and vice-versa. In this case, the question was how discussion of that suffering stood in relation to the promulgation of anti-Communist ideology and the sustenance of support for a Cold War then going into overdrive.
The Media Waxes Philosophical About Victims, Worthy and Unworthy
One of the more famous aspects of Ed Herman and Noam Chomsky's critique of the news media in their classic Manufacturing Consent is their discussion of the media's double standard regarding "worthy" and "unworthy" victims. The worthy victim is a victim worthy of our interest and sympathy, and a fight for justice; and the unworthy victim is the opposite, "undeserving" of such concern; with the coverage they each get matching that.
The worthy victim gets far more coverage, far more prominent coverage, and far more emotive coverage than the unworthy victim, with the last in particular worth expanding on.
The worthy victim is humanized. The account of the crime against them, in its intense detailing, emphasizes the brutality done them. We are made aware of the indignation of others over the incident. And there is a demand for justice.
Thus when the victim is a worthy one the media is apt to present all of the above in emotive language, and give us too those pictures worth a thousand words, showing us their faces, likely smiling, in photos taken back when they were alive--while we are warned about the graphic images that show what remains of them now that make such a contrast with those living images we saw earlier. The journalist's report, and the images, will also make clear to us that they had attachments--that they had a family--that they loved and were loved, by parents, spouses, children, friends, coworkers, in whose lives there is now a gaping hole, and who cry out for answers and accountability. (Indeed, rather than just humanizing them the media gives the impression of canonizing them.)
By contrast, if the media speaks of unworthy victims at all, it refrains from emotive language, often does not mention individuals, and if it mentions individuals says little about them. Their faces remain unknown to us as the report alludes to what happened to them only briefly, providing such details as the journalists deign to offer (of which there are unlikely to be many) in the most distant, impersonal and "clinical" fashion. The journalists display no interest in those the victims left behind, and there is certainly no concern for any aftermath. Indeed, Herman and Chomsky contrast the coverage of worthy victims, so "generous with gory details and quoted expressions of outrage and demands for justice," with the "low-keyed" character of what little coverage the unworthy get, "designed to keep the lid on emotions."
I find the essentials of this characterization of the media's conduct indisputable, though I think that with a few decades of additional hindsight (and, one might add, the decline of certain sensitivities that even in the '80s were more developed than they are now--to go by how affairs like the Iran-Contra episode could still outrage then, while seeming awfully small-time next to what we have today) it is worth enlarging on what seems a minor point in their original discussion. This is that those reporting an incident not only treat it very differently depending on whether the victim is worthy or unworthy, but strike a different attitude toward the meaning and value of human life altogether as they do so. Herman and Chomsky remarked this in some degree, noting how the "low-keyed" coverage of unworthy victims tends to entail "regretful and philosophical generalities on the omnipresence of violence and the inherent tragedy of human life," telling us "Don't get too worked up over such things."
Today it seems to me that the philosophical double standard has gone much further. With the worthy victims, not only do we get lectures on the sanctity of human life, each and every last one, and the absolute unjustifiability of any violence against any human, ever, but in discussion of the unworthy we get not just talk of the omnipresence of violence and tragedy in sad and regretful and "philosophical" tones but, increasingly, a hastening to justify what was done--one that often goes as far as assuming that "Welcome to the real world!" swagger to which small men who want to feel big are so addicted, as they condescendingly tell us, in that way feminists call "mansplaining," that the real world is not just a place where violence is omnipresent and tragic but a harsh and brutal place that requires even the gentlest of us to compromise so-called principles for the sake of making the world go round, with consequences that aren't always intended. And anyway, what do you care about those people you're calling victims--people who don't look or sound or think or believe like you, people you would never want to meet in a country you would never want to see--getting abused, tortured, killed? The very people whose very animality forces us to do these things--making us the real victims? The fact of the matter is that to keep you safe sometimes we have to do nasty things, and the truth is that deep down even those unappreciative liberal scum who can't handle the truth want those things done, because deep down they know that's the way it is. ("You want them on that wall. You need them on that wall.")
As logic goes it's as laughable as it is appalling, a mass of begged questions as dense as a neutron star. But they aren't looking to win with logic. Instead they appeal to selfishness, ignorance, fear, bigotry, shame, misanthropy, moral nihilism--to the worst in us--rubbing those humanized worthy victims in the faces of their audience as they dehumanize the unworthy, and backing all this up with outright bullying as they make anyone who would speak for the unworthy victim a traitor against whom the public is to be whipped up, and in dealing with whom authority need not, in their view, regard itself as wholly constrained by "democratic norms." Many see right through it, and even refuse to be intimidated--but the media in question makes sure that they don't get a platform from which they might challenge the Narrative, all as it pats itself on the back for its "objectivity" and "responsibility," and for giving the audience all sides of the question.
The Ideological Side of Media Bias
Media critics from Thorstein Veblen to Theodore Dreiser to Upton Sinclair to, more recently, Ed Herman and Noam Chomsky, Norman Solomon, Thomas Frank and Matt Taibbi have made it clear over and over again that economics is the key to the news media's functioning in a society such as ours--who owns the media outlets and what their interests are, and how they make their investments pay; who subscribes to them and advertises in them; how they come by their news; whom they fear as able to inflict financial pain on them by withholding advertising money, credit and means of circulation, or by striking back with a lawsuit; how they hire and fire the staff that does the job demanded of them.
All of that, of course, has made the news media in a "free" society as much a propaganda organ operated by and for the powerful for their own ends as the press in any more officially dictatorial one--and indeed it seems that, with so many having covered this ground so thoroughly before, with results that are incontestable to any even slightly aware person, there is little anyone can add to the foregoing. Still, if economics dictates the fundamental task, there is the distance to be covered between the fundamental economic interests behind the media and the specifics of the media's standards, procedures, practices and claims to legitimacy--as seen in, for instance, the news media's notions of "objective" coverage, its stress on politics relative to policy, its understanding of the uses of expertise, its propensity for "both-sidesism"--and it has seemed to me that "centrist" ideology has been hugely important in the mainstream media's mode of operation, the more so as that ideology has been a constant even as so much else has changed technologically, organizationally, commercially and even socially and politically, as the institution evolved from the overwhelmingly local, print periodical-dominated press of Dreiser and Veblen's day to the multinational corporation-controlled digital media of which Taibbi writes.
The Fart-Smeller and the Swaggerer
At least from the standpoint of life in the year 2025, legitimizing "things as they are" unavoidably means getting people to go along with a great deal that is blatantly and undeniably stupid and vile. (Indeed, this is one of the principal jobs of those whom conventional wisdom hails as "experts" in such areas as finance or national security.)
Those who do the legitimizing go about their jobs in different ways, and the poses they strike as they speak their "truths" seem to me worthy of comparison and contrast at the moment.
There is the pose of thoughtful humanity facing a world that is not what it would like it to be but from which it bravely refuses to shrink, and which after much soul-searching it has found cannot be changed. ("Human nature" and all that.) "I wish it were otherwise," they tell us, "but it is what it is"--often as they pull that "smelled a fart" expression that stupid people think conveys gravitas.
Then there is the gleeful "Welcome to the real world!" swagger that says "This is the way the world is, and if you're not okay with it that's your problem, because you're not clear-eyed and tough like me." Of course it is just the pseudo-machismo of idiot bullies who will prove anything but clear-eyed and tough when they deservedly find themselves the ones at the sharp end--but they are probably not smart enough to know that their obnoxious pose is a pose.
Considering those poses I don't know which hypocrisy is worse.