Like a great many others discussing the upcoming return of Deadpool, I have written of the film as "Deadpool 3"--not just because this was a lot more convenient than writing out Deadpool & Wolverine over and over again, but because the film genuinely seemed to me to be a Deadpool 3. However, the film's director Shawn Levy took issue with this, insisting that this is not merely Deadpool 3 but, properly speaking, a Deadpool and Wolverine movie that should be referred to accordingly (as Deadpool & Wolverine). I took this for just the usual pretension of film publicity, but since then the studio has released a new trailer for the film--which, I think, really does justify the view of the film as not merely a Deadpool movie with Wolverine in it, but a genuine team-up of the two characters in a movie really about both of them and the big adventure they have together (Deadpool and Wolverine's Excellent Adventure!).
Having raised the subject, it seems an opportune moment to look back at my prediction for the film's box office gross from January. Back then I suggested an extreme range for the global gross of $150 million-$700 million, with $400-$450 million the territory in which the film would most likely finish its theatrical run. Given that the box office remains depressed compared to what it was in 2023, never mind the 2015-2019 period, and that this has been significantly connected with franchise fatigue extending in particular to superhero fatigue, I still think caution is in order--the more so in that Deadpool and especially X-Men were both traveling a path of diminishing returns commercially. (Recall the reception X-Men: Dark Phoenix got?) Going by the more easily quantifiable factors, my conclusion still seems to me to be sound.
However, there are also the less easily quantifiable factors that undeniably matter. First and foremost, it seems to me that there is a real measure of affection out there for both these characters, for Ryan Reynolds and Hugh Jackman in these roles, and plausibly for the idea of a team-up between the two--and that this may all be the stronger for the years that have passed since audiences last saw either of them. I will add that the expletive-filled trailer for the film promises to "deliver the goods" with respect to both action and the Deadpool brand of humor, while the makers of the film seem to be making a real effort to appeal to casual viewers (assuring them that this is one Marvel movie for which people will "not have to do homework," a requirement that was probably a liability for movies like Dr. Strange 2 and Captain Marvel 2), while at the same time offering something to hardcore fans (like the fact that, as this trailer confirms, Wolverine will for the first time ever appear in the costume from the comics and cartoons). All of these are significant points in the film's favor that, going by the Internet chatter, seem to be registering with the audience. The movie is also likely to benefit from the same thing that Top Gun 2 did back in 2022--weaker than usual competition. (It is symbolic that this summer actually opens with Ryan Gosling in an adaptation of The Fall Guy.) This seems all the more the case in that last year's Barbie and Oppenheimer showed how a late summer release can really clean up, especially if what came before did not get theatergoers too excited.
Taking this into account I feel more bullish about the film. Do I think that it will take the Marvel Cinematic Universe back to its Phase Three glory days? No, that is more than any one movie can do, especially at this stage of things, with the market sharply contracted from what it was in 2017-2019, and this particular movie some way off the main line of the franchise.* Do I think it will make a billion dollars the way the first Deadpool did (adjusted for inflation)? That seems to me a very tall order for the same reasons. But looking at the numbers I presented earlier, I think that as things stand now I would be less surprised if the film went over the $450 million mark than if it undershot the $400 million mark--that, in fact, a gross approaching or even reaching $700 million is more plausible than I would have thought three months ago. That would be pretty respectable these days; even with the hefty reported budget it might just let the film turn a nine-figure profit, and in the process restore some of the luster to the Marvel brand--not taking us back to 2019, but giving the studio a chance to keep what is probably the best it can hope for, a franchise that may continue to be a reliable (if more modest) earner even after its peak has passed.
* In 2017-2019 North American ticket sales stood at 3.5 per capita. In 2023 they were more like 2.2 per capita, a 38 percent drop, and 2024 will be lucky to match it.
Tuesday, April 30, 2024
Just How Profitable Was Guardians of the Galaxy 3?
It's that time of year again--Deadline's Most Valuable Blockbuster tournament!
At the time of this writing Deadline has provided only the films making the tenth and ninth spots on the list. The #10 position was taken by PAW Patrol: The Mighty Movie--a feature film tie-in with the TV cartoon.
I admit that this one was not on my radar at all, and I have nothing to say about it. By contrast I had expected the #9 movie, Guardians of the Galaxy 3, to place here--because, if an underperformer in comparison with its predecessors and with some of the more bullish expectations held out for it in the months prior to its release, in a year in which so many big-budget films flopped it still looked like a respectable earner, my reading of the figures back in May making me think a $100 million profit likely in even the more pessimistic scenario I had in mind.
The movie's making the list means that Deadline offers us the numbers on this one. According to these the final bill on Guardians of the Galaxy 3 came to $550 million--about 2.2 times the $250 million production budget, counting the $90 million absorbed by participations, residuals and the like. The movie's total revenue was $674 million, about 55 percent of that theatrical revenue (thanks to the film doing a bit better than the bottom end of the range I discussed, pulling in, instead of $700-$750 million, about $860 million globally). The result was a net profit of $124 million according to Deadline's calculation. Computed another way, the profit equals 23 percent of the outlay.
How do these figures compare with the numbers on the prior two Guardians films?
For Guardians of the Galaxy 2 the figure Deadline computed was a net profit of $154 million (a bit under $200 million in today's terms) on total costs of $515 million--yielding a 30 percent return.
For the original Guardians of the Galaxy the figures were a profit of $204 million (some $270 million in today's terms) on an outlay of $520 million, working out to a 39 percent return.
The result is that the original film was, in terms of both the return on expenditure and the absolute return, the most profitable; the second film less profitable than that; the third film least profitable of all--affirming a trend of diminishing returns on the franchise from the monetary standpoint that seems reflective of the diminishing returns on the bigger Marvel Cinematic Universe (MCU) of which it was a part. That said, Guardians of the Galaxy 3's $100 million+ profit still makes it a very worthwhile investment--but the direction in which this series has moved (a 40 percent drop in relative return, a more-than-half drop in inflation-adjusted absolute return), to say nothing of the broader trend with regard to the MCU, the superhero film and the high-concept franchise film as we have long known them, should give any astute investor pause with regard to the idea of a Guardians of the Galaxy 4.
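As a quick check, the return-on-outlay percentages above follow directly from Deadline's reported profit and cost figures (a minimal sketch; the dollar amounts are the nominal figures quoted in this post, not the inflation-adjusted ones):

```python
# Return on outlay for the three Guardians films, per Deadline's
# reported (nominal) profit and total-cost figures, in $ millions
films = [
    ("Guardians of the Galaxy",   204, 520),
    ("Guardians of the Galaxy 2", 154, 515),
    ("Guardians of the Galaxy 3", 124, 550),
]

for title, profit, costs in films:
    ret = profit / costs * 100
    print(f"{title}: {ret:.0f}% return on outlay")
# Prints 39%, 30% and 23% respectively--the diminishing-returns trend
```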
Of course, we are likely to be getting one anyway.
Disney Dethroned?
As I remarked in a prior post, Deadline is now in the midst of its annual "Most Valuable Blockbuster" tournament. Starting from the bottom of the top ten they have already identified the tenth and ninth-ranking films--the latter of which is Guardians of the Galaxy 3, which they make clear is Disney's only film to make the list for 2023.
Just one film--and that a mere #9 ranker.
Compare that to how Disney fared in the glory years of the mid- and late 2010s. Disney had the #1 film in 2015 (Star Wars: The Force Awakens), and four films in the top ten. In 2016 it missed the #1 position, its highest-ranking film Rogue One getting just the number three spot, but it had five films in the top ten. And then it claimed the #1 position three years running, all while maintaining a disproportionate presence generally, with three films in the top ten in each of 2017 and 2018 (the top three in 2018's case), and then, in the annus mirabilis of 2019, an astonishing seven of the top ten, including, again, the top three (Avengers: Endgame, Frozen 2, The Lion King, Captain Marvel, Toy Story 4, Aladdin, Star Wars: The Rise of Skywalker).*
Of course, that things went this way is unlikely to be very surprising to anyone who has paid attention to the box office during the last year or so, given how even in a year of flops Disney's offerings were conspicuous disappointments--Indiana Jones 5 and Captain Marvel 2 most notoriously, but also The Little Mermaid remake, Ant-Man 3, Wish and Elemental (which may have managed to turn a profit, but still looked a weak performer by Pixar's long exceptional standard). Indeed, it was my guess that on the list of the biggest loss-makers that always accompanies Deadline's list of the biggest profit-makers, Captain Marvel 2 and Indiana Jones 5 would occupy two of the top three spots--even as Disney would be represented elsewhere on that list as well.
Still, it seems only fair to ask how this came about. Simply put, during the twenty-first century Disney pursued a particular strategy--"Buy up every top-line franchise and brand you can and exploit them ruthlessly." Doing this with Lucasfilm, Marvel and Pixar, as well as its own animated classics-heavy brand, it scored much more than its share of hits. (Again, seven of the top ten most profitable films of 2019 as reckoned at the time!) But the market has changed--so much so that the question now seems to be whether Disney, and for that matter everyone else in Hollywood, is capable of changing with it.
* One may, of course, quibble with the numbers in hindsight given that the procedures of the British film subsidy system keep indicating higher outlays than are usually reported in these lists (as was the case with Avengers 3 and 4), but Disney is unlikely to be the only studio playing games here, and it seems to me that even taking this into account this still broadly reflects how things stood at the time.
Taylor Swift and Five Nights at Freddy's Made Deadline's Top Ten Most Valuable Blockbusters of 2023?
Previously, thinking about the likely makeup of Deadline's list of the biggest profit-makers of 2023, I suggested the Taylor Swift concert film and Five Nights at Freddy's would get some recognition--but on the list of smaller films that made relatively large profits, rather than the list of "most valuable blockbusters." However, we see Taylor Swift: The Eras Tour ranked at #7, and Five Nights at Freddy's at #8, on that very list--a development very surprising given that, where overall gross is concerned, they did not rank all that highly that year. Where the listed global grosses at Box Office Mojo are concerned, Taylor Swift's film not only did not make the top ten, but failed even to make the top twenty, ending up at only #23--behind that notorious flop The Flash (#21), for example.* However, Swift's film was also relatively low-cost, a $15 million production (one-twentieth of what The Flash cost). The result was that when all was said and done the movie, between its $261 million in ticket sales and subsequent and associated revenues, made a $172 million profit. Thus did it also go for Five Nights at Freddy's--that movie, between costing a little more and making a little more than The Eras Tour, ended up just a little less profitable than Swift's film (pulling in $161 million according to the Deadline calculation).
Those who anticipated the films would do well seem to be saying "I told you so"--but this seems mainly because they were confident in those movies' prospects, in large part because of personal enthusiasm about them, and focusing on that would make us miss two very large, important developments here.
One is that these comparatively low-grossing movies outdid so many bigger-grossing movies profit-wise, with all that suggests about the declining viability of those movies that were for so long Hollywood's principal moneymakers (like The Flash, which made more money than the Taylor Swift film, but seems certain to produce a nine-figure loss for WBD).
The other is that the bar for making the top ten list is a lot lower than it used to be. Back in 2019 Taylor Swift's film would not only not have ranked at #7 with a profit of $172 million, but would not have made the top ten at all. In 2019 the tenth-ranked film on Deadline's list was the Jumanji sequel, with $236 million, while the seventh-ranked film, the live-action adaptation of Aladdin, reportedly made a profit of $356 million, twice what Taylor Swift's film did, even before we consider the considerable inflation of these years (a more than 20 percent rise in prices since 2019).
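To put rough numbers on that comparison, here is a back-of-the-envelope sketch, taking the post's stated ~20 percent rise in prices since 2019 as the adjustment factor (an approximation, not a precise CPI calculation):

```python
# Back-of-the-envelope comparison of 2019 vs. 2023 Deadline profit figures
# ($ millions); the 1.20 factor is this post's rough ~20% inflation figure
INFLATION_2019_TO_2023 = 1.20  # assumed cumulative price rise

eras_tour_2023 = 172   # 2023's #7 film (The Eras Tour), per Deadline
jumanji_2019 = 236     # 2019's #10 film (the Jumanji sequel)
aladdin_2019 = 356     # 2019's #7 film (Aladdin)

# Even unadjusted, 2019's #10 out-earned 2023's #7
print(jumanji_2019 > eras_tour_2023)  # True
# In 2023 dollars the gap is wider still
print(f"Jumanji sequel: ~${jumanji_2019 * INFLATION_2019_TO_2023:.0f}M")
print(f"Aladdin:        ~${aladdin_2019 * INFLATION_2019_TO_2023:.0f}M")
```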
Together the two factors are indicative of a situation of a shrunken film market--doing well in which may mean playing by very different rules from those that prevailed just a few years ago when Disney was the champion.
Will Xander Cage Come Back for One Last Adventure? (Thoughts on the Possibility of an XXX4)
The Xander Cage (or XXX) spy-fi franchise has had rather a rocky history. Very much of the '90s in its origins, bringing together as it did the American spy-fi boom and the resurgence of the 007 franchise in that decade with the "extreme" craze, the idea behind it was to update the James Bond-style spy for "a new generation."
The resulting first XXX film did well, but not that well (with a $142 million domestic gross it qualified only for 15th place on the list of the year's top grossers), and the film's star Vin Diesel walked away. The sequel proceeded without him (indeed, Diesel's character Xander Cage was supposedly killed off) and flopped, after which plans for a third movie were, as I remarked in 2012, stuck in development hell.
What brought the franchise back was the way Vin Diesel's career resurged, largely on the strength of the second wind that the Fast and Furious franchise got with its fourth film (from which point it went from strength to strength through the next several films), which was in good part a matter of how well the movies were doing in the Chinese market. The peak of the series, Furious 7, saw the movie take in a sensational $1.5 billion worldwide, twice as much as the preceding movie that had previously been the series' best performance, with more of that money actually coming from China than from North America, well as the film did there. (The grosses in the two countries were $390 million and $353 million, respectively.)
The result was that the next XXX film finally happened--hitting theaters in North America in the dump month of January 2017, an expression of little confidence in the movie, but also of the expectation that amid relatively weak competition it might clean up. It did not quite do that in Hollywood's home market, taking in a mere $45 million--but it made more than six times that much internationally, adding a bit over $300 million to the total, with over half of that ($164 million) coming from China alone.
Accordingly we heard about plans for a fourth film--because back then it still made some sense to back a relatively costly film on the expectation of a big gross in China. However, seven years later any such film is still just being talked about, and it is not hard to imagine why. The disruptions of the pandemic apart, the Fast and Furious films' decline cannot but suggest that Diesel having this kind of adventure is not the draw it was just a short time ago, while franchise films generally are doing less well. Indeed, if we hear less about the decline of spy-fi than we do about the decline of the superhero film, the commercial decline of that form is nonetheless real (you can see it in the grosses of No Time to Die, of Mission: Impossible 7, of, again, those Fast and Furious films), while the XXX franchise was pretty marginal to begin with by the genre's standard (Xander Cage never came close to rivaling 007, Ethan Hunt or even, after he and his gang turned superspies, Dom Toretto), and the Chinese market that was so important to the last film's profitability has become far less open to Hollywood.
The result is that all those factors that made a fourth film in the series a plausible business decision have vanished. On that basis one might think the franchise will not manage to escape development hell again. However, Hollywood's determination to exploit every franchise far past the point of diminishing and even negative returns, for lack of inclination to do anything else, seems only to be growing more pathological. The result is that I cannot see the greenlighting of a new Xander Cage movie being a sound business decision on the part of the Suits--but then again, neither does a Twister 2 or a Gladiator 2 seem to be that, and both of those movies are scheduled to come to a theater near you later this year.
Daniel Bessner and the "Bad Writing" on the Marvel Cinematic Universe's Latest
A year ago the press subjected us to quite a bit of drivel about various studio blockbusters suffering because they did not have "good writers"--as if those running the studios backing $300 million productions said "You know what? Let's get the not-good writers for this impossible task of making this umpteenth sequel to a long stale franchise that nobody ever asked for palatable to the public."
It was as stupid as it was shabby--the more so in that, with a writers' strike on, it looked all the more obviously like the cheap worker-bashing by bosses it was--but the various figures who made such remarks in the press would not stop offering such comment, even extending it to other artists (as with Robert Iger's all too characteristically foolish crack about "unsupervised directors").
Reading Daniel Bessner's extraordinary article in Harper's about the lot of the writer in neoliberal, post-Great Recession, post-COVID, post-strike Hollywood makes it all the clearer just how illegitimate such nonsense is, given that, again, the studios have robbed writers of the scope to do their jobs, with Marvel at the absolute forefront of the process. (Thus has Marvel brought the TV-type "writers' room" into film-making while at the same time expecting writers to write whole seasons of shows without letting them know what the ending is supposed to be, because the executives want to hold on to the maximum latitude to generate crappy spin-offs--and then blamed the writers for the failings of the material.)
For those seriously interested in the subject, and interested in the reality of the situation rather than the claqueurs' praises, the piece seems to me essential reading.
Merchandising and the Bottom Line for Today's Movies: Estimating Profitability
Merchandising has long been crucial to the revenue stream for Hollywood films--for so long that many seeing Spaceballs over the years must have been unclear on just why Mel Brooks so pointedly raised the issue in that film. (The reason is that Star Wars was the watershed here.)
In its calculations Deadline has income from merchandising accounting for over a fifth of the revenue for PAW Patrol: The Mighty Movie (which made the #10 spot in its Most Valuable Blockbusters of 2023 tournament). Indeed, the publication's analyst Anthony D'Alessandro notes that merchandising was decisive where profitability was concerned, and indeed that the film was "greenlit" because of this.
This is a more common story than one might think--but it seems to me that taking merchandising into account here and not in the case of other films (it is not mentioned in the report on Guardians of the Galaxy 3, in spite of the huge importance of merchandising to Marvel's profits) is distortive of the rankings this time around.
Might Profits for Movies Mean Losses for Streamers? (Or Vice-Versa?)
Reviewing the numbers Deadline offers on the most (and least) profitable movies of the year one sees a differentiation of the revenue streams for those films by type, with streaming indicated separately.
As it happens, streaming platforms are largely studio-owned--with Disney owning Disney Plus, for example. And so an important part of the revenue for Disney movies consists of Disney Plus buying the rights to stream a Disney-made movie. (As Anthony D'Alessandro explains in the item devoted to Guardians of the Galaxy 3 the $180 million Deadline reports for the revenue from streaming "includes the amount for which Disney sells the film to itself, Disney+ being the pay-one window for the studio's movies.")
In and of itself this should shock no one who has been paying attention. In the current world economy, where things like antitrust law are so often dead letters, "related-party" trade in its various forms makes up a very high proportion of the world's total trade. Indeed, of the U.S.' import-export trade in goods in 2021 and 2022 over 40 percent was represented by related-party trade of this kind. Still, as Investopedia explains, while "there are rules and standards for related-party transactions," there are conflicts of interest, and particular difficulties for auditing the transactions, with "improperly inflated earnings, even fraud" something that can--does--happen.
Thus some see such numbers, and wonder. After all, Hollywood studios are no more known than the rest of the corporate world for impeccable accounting, or for forthrightness about financial reality--and it is all too easy for a studio to fatten the bottom line for a movie by selling it to its streaming service at an inflated price, or, should the alternative prove more desirable at the time, to sell the movie at a below-market price to help out the streaming service. And if it seems to me that the entertainment press does not much interest itself in such questions (like the rest of the press it is more likely to do PR for big companies than properly cover them), I have seen in comment threads and various other fora that the "armchair movie executives" at whom that press sneers do see, and doubt, far from unreasonably.
The Erosion of Movie Ticket Sales Before the Pandemic
Writing in this blog in the past I have repeatedly discussed how after the collapse of theatergoing in the 1950s and 1960s North American theatergoing stabilized at about 4-5 per capita ticket sales a year.
I made that calculation over a decade ago, and it seemed to me that the situation had not changed so dramatically as to compel redoing it.
Reexamining the numbers for the 2010s I am no longer convinced of that. Certainly there has been nothing so dramatic as the collapse of filmgoing when Americans became TV-watchers. Still, consider these figures for 1995-2019, split into five-year periods.*
In 1995-1999 the average was 4.4 tickets per capita per year.
In 2000-2004 it was 4.6.
In 2005-2009 it was 4.2.
So far, so good. Of course, after 2009 the figure fell below 4, but that was in the wake of the shock of the Great Recession, after which the figure got back up to about 4 in 2012, so it looked like the old pattern held up to that point, 4-5 trips a year plausibly remaining the norm. Still, 2012 proved the anomaly just a few short years on. Taking the two five-year blocks after 2009 as wholes, we find that in 2010-2014 the figure was just 3.8 tickets per capita--and if one could write that number off as depressed by the Great Recession, in 2015-2019 the figure was 3.6. Of course, even this rounds up to four, so that one could say that North Americans saw four movies theatrically a year, but that is not the same as 4 to 5, while within the 2015-2019 block one could see the decline continue, the figure hitting 3.4 in 2019--a pre-pandemic low.
The shift from 4-5 in 1995-2009 to 3-4 in 2010-2019, a slippage of one trip a year, can seem a small thing relative to the drop in the '50s and '60s from thirty per capita movie tickets a year to just four. But proportionately it is still quite a bit, a decline of a fifth or a quarter from what it had been in the early 2000s to the late 2010s.
Of course, 3.6 per capita ticket sales, or even 3.4, is now quite a bit more than the business Hollywood is doing. In 2023, which plausibly looked like the "new normal" for the industry in light of moviegoers' habits and the abundance of major releases on offer, ticket sales were more like 2.2 per capita--2 to 2.5 rather than 3 to 4, another big drop, and one which may not represent the end of the fall. After all, 2 to 2.5 per capita ticket sales a year still indicates a good deal more moviegoing than we see in some other advanced industrialized countries (in Germany and Japan the figure is more like 1-2), suggesting that still less is far from implausible, while the factors driving moviegoing down persist, not least competition from other sources of entertainment (why not just stay home and watch streaming?), and of course, the reality that Hollywood's model for making hits seems to be in a state of collapse, with any shift to a successful new model at best in its earliest stages--and very possibly pure fantasy in the face of the financial, technological and cultural trends.
* For the 1995-2019 calculations I derived the figures on tickets sold from the web site The Numbers, the population figures for the countries encompassed in the North American film market from the World Bank's DataBank.
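The calculation described above is simple enough to sketch in a few lines. This is a minimal illustration using the block averages quoted in the text (rather than the raw year-by-year ticket and population data, which, as the footnote notes, come from The Numbers and the World Bank's DataBank); the sample figures passed to `per_capita` are approximate 2019-scale numbers, included only to show the shape of the arithmetic:

```python
# Per-capita attendance for any single year is simply tickets sold
# divided by the population of the North American film market.
def per_capita(tickets_sold, population):
    return tickets_sold / population

# Roughly 2019-scale illustrative inputs: ~1.24 billion admissions,
# ~365 million people, giving a figure in the vicinity of 3.4.
print(round(per_capita(1.24e9, 3.65e8), 1))

# The five-year block averages quoted in the text.
block_averages = {
    "1995-1999": 4.4,
    "2000-2004": 4.6,
    "2005-2009": 4.2,
    "2010-2014": 3.8,
    "2015-2019": 3.6,
}

# Proportional decline from the early-2000s peak to the late 2010s.
peak = block_averages["2000-2004"]
late = block_averages["2015-2019"]
decline = (peak - late) / peak
print(f"{decline:.1%}")  # just under 22 percent--between a fifth and a quarter
```

The last figure is the basis for the "fifth or a quarter" characterization above: (4.6 - 3.6) / 4.6 works out to about 22 percent.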
"Jobs for People Who Don't Need to be Paid"
Looking at recent reports about the working conditions faced by film and TV writers, actors, musicians and other artists, one thing that comes up again and again is that the deterioration of conditions for all but the superstars has made it decreasingly possible for even those who have supposedly "made it" (landed a regular role on a hit show, signed with a major label) to live from their work. One consequence is that everyone must have another source of income--"you had your . . . day jobs or you had a trust fund," as Mad Men veteran Jason Grote put it. Especially as the demands, and costs, of trying to hold down a "day job" while making a career as an artist should not be trivialized (perhaps especially in these times, when wages' purchasing power has plummeted so), those who do not need to work for such an income have an advantage, even at this level, with the result that actual working people are finding less and less opportunity to pursue such a career. The upshot is that if the arts have always been dominated by the privileged, in our time this may be becoming appreciably more rather than less the case.
Miserable as this is for all those with artistic aspirations who committed the crime of not being born rich, it seems quite plausible that this has played its part in the extreme remoteness of the arts from the lives of the vast majority of the population in our time--to the impoverishment of both the arts and society at large.
"Raising Eyebrows"
I have never cared much for the journalistic cliché that some remark or act has been "raising eyebrows"--their way of conveying that said remark or act has prompted mildly disapproving surprise. Apart from the ambiguity about whose eyebrows were raised, precisely, and why (weasel words time!), there is the fact that it is so often a poor descriptor. If at times muting the reaction to something quite serious, it is much more often a matter of insufferable priggishness at some trivial breach of "the proprieties," undeserving of the respect implicit in the journalistic coverage. It seems more fitting to speak of "dropping monocles" than "raising eyebrows" in such cases.
Thursday, April 25, 2024
What's Happening with Boxoffice Pro?
For quite some time I have found Boxoffice Pro's comprehensive forecasts of upcoming and recently released films' grosses and their systematic updating of great interest, especially in these times in which so many movies surprise us amid what may be historic changes in the business of film--big franchise films crashing and burning (like The Flash), or unlikely-seeming movies becoming colossal hits (like Oppenheimer), as Hollywood confronts the possible end of the high concept-dominated post-New Hollywood era.
Alas, for some weeks now the forecasting section of the site has been much reduced, with just a handful of long-range (or even weekend) forecasts appearing this past month. (Thus the forecast for The Fall Guy that I cited has not had a single update.)
I imagine that the impoverishment of the forecasting section is a temporary situation. (After all, Boxoffice Pro is a major film industry trade publication in circulation since 1920, before "movies" became "talkies.") Still, I have no idea when things will get back to normal that way--and can only look forward to when they do so.
Daniel Bessner's New Article in Harper's
Daniel Bessner's "The Life and Death of Hollywood" is exactly what journalism ought to be but all too rarely is--a deeply informed piece of reporting that goes beyond mentioning some facts or passing on some impressions to put together a picture of its subject and explain how things stand to the reader. In this case the analysis of Hollywood's development in recent decades, especially as seen from the standpoint of its writers, benefits greatly from its combination of critical perspective, historical background and attentiveness to contemporary economic fact.* Bessner quite rightly explains Hollywood as undergoing its own piece of the broader neoliberal turn, which has affected it in the same way it has affected the rest of America's economy and society. Just as has happened elsewhere, deregulation (in this case, particularly the collapse of antitrust enforcement under the Reagan administration, and the Telecommunications Act of 1996 under Clinton) combined with creditist monetary policy to produce an extraordinary concentration of ownership, production and market power, and a triumph of speculation and short-termism, with catastrophic effects for the workers involved and for society at large as the game became one of "winner take all." Thus did the early '00s see a half dozen vertically integrated giants (Disney, Time Warner and the other usual suspects) "raking in more than 85 percent of all film revenue and producing more than 80 percent of American prime-time television," while the ultra-loose monetary policy Ben Bernanke ushered in enabled further concentration, a mere three asset-management companies (BlackRock, Vanguard, State Street) coming to hold the ownership of, well, nearly everything ("becoming the largest shareholders of 88 percent of the S&P 500"), with the media giants no exception to the pattern.
(As of the end of 2023 Vanguard "owned the largest stake in Disney, Netflix, Comcast, Apple, and Warner Bros. Discovery," while also having "substantial share[s] of Amazon and Paramount Global.") Indeed, even the talent agencies supposed to represent the writers to those companies similarly became an oligopoly of a few giants in turn owned by asset management firms.
In the wake of the reduction of film and TV production to an oligopoly owned by a few financial firms, and the inevitable ascent of a pseudo-efficiency-minded, short-termist, control-freak mentality among studio executives, it was no accident that, per Bessner's citation of Shawna Kidman, the share of "franchise movies" in "studios' wide-release features" surged from 25 percent in 2000 to 64 percent in 2017--the age of "multistep deals" and "spec scripts" increasingly a thing of the past, exemplified by the crassness of Disney, and perhaps especially its Marvel operation, which "pioneered a production apparatus in which writers were often separated from the conception and creation of a movie's overall story." (Thus did the writers have to generate a whole season of scripts for WandaVision without knowing how the season was supposed to end, because the "executives had not yet decided what other stories they might spin off from the show." Thus did Marvel bring the TV-style "writer's room" into filmmaking--if one calls the result filmmaking--even as the TV networks replaced their writer's rooms with "mini-writer's rooms" offering those who work in them still less.) Amid the declining readiness to support the development of many ideas in the hope that one would pay off (thirty scripts to get one that might be made into a movie in the old days); amid the declining creative control and bargaining power of writers, ever at a disadvantage in relation to the executives whose media courtiers celebrate them as "geniuses" (as with a Kevin Feige); and amid the studios playing the old game of exploiting technological changes to contractually cut writers out of any share in the profits (in this case, streaming and its revenues, as Bessner shows through the case of Dickinson creator Alena Smith), writers lost in just about every way.
Moreover, the post-pandemic economic shock has dealt them another blow as media companies that had, however exploitatively, been funding a lot of production retrenched, and all too predictably won the labor battle of 2023--at the end of which they may be said to have imposed an at best thinly veiled Carthaginian peace, with what is to come likely to be worse.
Given all that Bessner naturally declined to end his story on a note of false optimism, instead going from the "Life" to the "Death" part of his title. As he notes one could picture various palliatives helping, like government regulation of asset ownership (or indeed, mere application of antitrust law here), but all this seems unthinkable--precisely because what is happening here is so much of a piece with what is happening in all the rest of the economy, and it is all too clear how that is going. Indeed, one can picture things becoming even worse still in the fairly near term if the Hollywood executive class manages to realize its fantasies of using artificial intelligence to dispense with its workers entirely in even a small degree--or even in spite of the technology's inadequacies, simply tries and fails to do so, but destroys a lot of careers and "disrupts" the industry in all the wrong ways in the process.
* Reading Dr. Bessner's biography I found that he is not a reporter but a professor of International Studies--and cannot help wondering if that did not make all the difference for the quality of the item, especially where the avoidance of the hypocrisy of "objectivity" and the contempt for context that characterizes so much contemporary journalism are concerned.
Upton Sinclair on the Greeks
Discussing the literary legacy of the ancient Greeks in Mammonart Upton Sinclair asks just "how much do we really admire Greek literature and Greek art, and how much do we just pretend to admire it?" Recalling Samuel Johnson's quip about a dog walking on two legs, and how "it is not well done, but we are surprised that it is done at all," he suggested that the conventional exaltation of the Greeks is based not on a clear-headed judgment of the poetry and drama by modern standards, next to which these works are technically crude and representative of a world-view so unacceptable to modern people that as a matter of course we can seem "in denial" about it (far from Greek art being an art "of joy and freedom," their theme was invariably "the helplessness of the human spirit in the grip of fate," and at that, a fate which destroyed them), but rather a matter of "superstition" that has been "maintained by gentlemen who have acquired honorific university degrees, which represent to them a meal ticket," and the snobbery to which they cater (Classicism in itself become so "leisure class" by his time as to make "Homer . . . to the British world of culture what the top-hat is to the British sartorial"). Indeed, in Sinclair's view those who (like a Matthew Arnold or William Gladstone) "write volumes of rhapsody about Homer" testify less to Homer's superlativeness as a poet than his works' accordance with their prejudices--his "ha[ving] the aristocratic point of view, and giv[ing] the aristocratic mind what it craves," namely a vision of life in which aristocrats "unrestrained in their emotions and limitless in their desires" act out and are flattered in their self-importance by seeing the gods themselves take an interest in their personal tales, all as, one might add, they generally sing conservatives' militarist, patriotic, values.
This is not to say that Sinclair contemptuously dismissed it all (let alone was in any way a promoter of the kind of "cancel culture" that today seeks to "cancel" the Classics along with so much else). If Sinclair stresses the Greek pessimism he thinks too much overlooked, he is historically-minded enough to acknowledge, as others of like ideology have done, that to the ancient mind the world easily seemed "a place of blind cruelty, the battle-ground of forces which he did not understand," and out of this "made . . . a philosophy of stern resignation, and an art of beautiful but mournful despair"--and not confuse all this with the "dispensation of official pessimism" in more enlightened modern times (as with the apparent "'Classical' attitude" of "pathetic and heroic" "resignation to the pitiful fate of mankind on earth" of the aforementioned Arnold), which he saw as coming from quite other sources. Indeed, for all the limitations of that ancient mind Sinclair owns to pleasure in being able to see for ourselves "the beginnings of real thinking, of mature attitudes toward life" in their early writings, and much else besides. If Sinclair finds much in Homer ridiculous and repellent, it seemed to him that one still did find "beautiful emotion"--albeit not the kind that moved lovers of the "heroism" of an Achilles, but that of "the mothers and fathers, the wives and children of those heroes [who] express for them an affection of which they are unworthy." He finds much to admire in the satire of Euripides, "jeering at militarism and false patriotism, denouncing slavery and the subjection of women in the home, rebuking religious bigotry, undermining the noble and wealthy classes." Yet, if it seems to Sinclair appropriate to qualify one's criticisms, and to grant that the works have their worthwhile aspects, his verdict stands: the exaltation of the Classics in the familiar way is a matter of superstition and snobbery above all else.
Reading Sinclair it seems to me that there is an enormous amount of truth in this statement--which, indeed, ought to be evident to anyone who actually tries looking at those writings with clear eyes, as others have done, not all of them inclined to see things his way on most issues. (The critic John Crowe Ransom, who as one of the twelve writers of the Southern Agrarian Manifesto was far, far removed from Sinclair politically, seems to me to have grasped very well, and put even more poetically and succinctly, just how remote the Greek view of life, or at least the early Greek view, was from one of "joy and freedom.") Indeed, present a professor of literature today with the hard facts of just what is in "this stuff" and you are likely to get as their answer a sort of embarrassed agreement that reminds you just how much the Canon, and indeed what we say about the Canon, is the product of timid deference to received judgment, which one is expected to, in the words of Oscar Wilde, "endure . . . as the inevitable." And so the superstition and snobbery of which Sinclair wrote significantly prevail a century on--qualified, of course, by the decreasing extent to which anyone is paying attention to the Classics or to the humanities or to culture at all.
Alex Garland's Civil War: Some of the Critics' Views
Recently remarking on Alex Garland's Civil War, the movie theater trade publication Boxoffice Pro characterized the film's promotion as a "bait-and-switch marketing campaign that sold opening weekend audiences on the promise of an action movie" while "delivering a bleak drama that focuses a lot more on journalism instead."
It seems to me that many would regard this as not the only piece of bait-and-switch at work in the film, or even the most important one. As Forbes' Erik Kain remarked, the movie Civil War, in the publicity for which even the "action movie" aspect was arguably less important than the prospect of a drama about American political divisions (there are lots of action movies out there, not so many political dramas with a "high concept" draw like that one), ended up not really being about those divisions, Garland studiously "avoid[ing] the politics of the day in order to tell a more universal story." Kain's view of this approach was favorable, but others have been less impressed (finding in Garland's approach less universalism or "neutrality" than thoughtlessness and evasion and maybe worse in a piece of what one critic has described as a "gutless" cinematic "both sidesism").
However, whatever one makes of it, audiences would seem to have been pulled in by the offer of one thing, and given another. This weekend's gross will likely say something about the audience's ultimate reaction to that.
Sunday, April 21, 2024
The New York Times, Right-Wing Publication
Recently reading the Columbia Journalism Review's analysis of pre-2022 midterm coverage of domestic affairs by the New York Times and Washington Post I was unsurprised (indeed, confirmed in my views) when they quantitatively demonstrated those papers' preference for politics to policy, and the rarity with which anything published in them actually attempted to explain the affairs of the day to the general public.
I was more surprised by the editorial contrast between the Times and its Washington-based counterpart. The Review article found that, if both papers were more prone to attend to the issues that Republican voters care most about than those that Democratic voters care most about, the Times was significantly more prone to do so than the Post. By the analysts' count the Times favored the Republicans' concerns by a margin of over 5 to 1 (37 to 7 pieces, respectively), as against the 4 to 3 margin seen with the Post. It is an astonishing predominance, even in comparison with that other newspaper--one that can seem to bespeak the political shift of the newspaper that publishes Ross Douthat, Bret Stephens and Christopher Caldwell (to say nothing of figures like Thomas Friedman).
Media Biases: A Quantitative Analysis from the Columbia Journalism Review
Some time ago I extended my analysis of political centrism to the mainstream news media and argued that much of its conduct--and in particular much of the conduct that many view as problematic--is explicable in terms of centrist ideology. This particularly included the matters of the media's preference for politics to policy, and its declining to attempt to explain complex issues to the public.
As it happens a November 2023 piece in the Columbia Journalism Review setting out the results of a quantitative examination of the pre-2022 midterm coverage of domestic affairs by the New York Times and Washington Post (from September 2022 to Election Day) was consistent with exactly that reading. The analysts found that, apart from their both emphasizing campaign "horse race and . . . palace intrigue," only a very small proportion of the items had any policy content to speak of, particularly of the explicatory type. Of 219 front page stories about domestic politics during this period in the Times, "just ten . . . explained domestic policy in any detail," while of 215 stories in the Post just four "discussed any form of policy"--working out to less than 5 percent in the former case, less than 2 percent in the latter case. Indeed, the Post did not have a single front-page story about "policies that candidates aimed to bring to the fore or legislation they intended to pursue" during that whole critical period, opting instead for speculative pieces about the candidates and "where voter bases were leaning."
Remember, these are the most respected mainstream publications--the "papers of record"--the kind of publications that many of those complaining about "fake news" tell us to attend to instead of the rest of what we are apt to see online. Considering that fact, one can only judge these publications to be themselves a considerable part of the problem with regard to the deficiencies of the public's understanding of current events--with, indeed, the preference for politics over policy and the extreme paucity of explanation just the tip of the iceberg.
What is a Superhero? An Attempt at a Definition
What is a superhero?
It does not seem unreasonable to define the term. Granted, a certain sort of person dismisses such attempts as bound to fail given that exceptions will always exist to any definition--but the existence of ambiguities does not, as such, keep definition from being a useful, in fact essential, tool for making sense of reality. (The map may not be the territory, but that does not make maps useless, no matter how much contrarians addicted to stupid epistemological nihilism natter on.)
A good starting point for such a definition is an examination of those criteria that may seem least dispensable to the conception of the superhero as we know them today--the superhero tradition as established in the comic book medium, especially from the advent of Superman forward.
1. They Must Be Both a Hero--and Super.
A superhero must obviously be a hero--a figure that does something both positive and extraordinary such as is worthy of admiration by others. Where the superhero's sort of heroism is concerned there is an expectation of service to the community of some physical kind, usually entailing courage in the face of physical danger, to save the lives of others because it is the right thing to do rather than a matter of self-interest; with such action a vocation rather than an exceptional incident. Of course, people pursue vocations that have them performing such acts--firefighters are the go-to example--but superheroes are distinguished from them in that they are super, their abilities exceeding the human norm in a way which is quantitative or qualitative or both. (Conventionally the firefighter running into a burning building must make do with "merely" human physical and mental capacities, but a superhero might be able to put out the fire with a blast of icy breath, as Superman can.)
2. They Have a Very Distinct and Very Public Persona, of which Distinct Powers, Codenames and Physical Appearances (Usually but Not Always a Function of Costuming) Tend to be a Part.
The above seems to me mostly self-explanatory. In contrast with, for example, heroes who are secret agents, who are likely to conceal their "heroic" aspect behind cover identities as they set about their work, the superhero's cover identity, their public persona, is itself heroic--and indeed advertised to the world at large, the more in as its elements include the aforementioned matters of a power, or powers, that tend to be very individualized; a similarly unique codename; and a unique appearance, usually but not always derived from a costume. (Barry Allen is a hero made a superhero by his possession of "speedster" capabilities such as even other DC superheroes tend not to have, which he applies to fighting evil; he is, appropriately, known as the Flash, after his speed; and he wears a red costume with a lightning bolt insignia, also evoking that speed. Other figures such as Marvel's Hulk and the Thing have distinct powers, codenames and appearances--albeit appearances less reliant on any costume in their case.)
3. The Superhero's Activity is Highly Individualistic.
As implied by the exceptional character of the hero's capacity, and their conspicuous public persona, the conception of the superhero is individualistic in action. Consider, for example, the firefighter and secret agent both. Both figures are conventionally employees of a public agency, which pays them money to do a job in line with orders--which does not make a sense of duty irrelevant by any means, but still raises the question of the paycheck, career, etc. in ways that unavoidably give their activity a different texture. Both also rely on their organizations for their ability to do their jobs--unlikely to have equipment or other supports necessary to their hero task if they were not so employed. By contrast the superhero is ordinarily alone, answering to no one--taking no orders, and often having a complex and fraught relationship with authority (epitomized by stories in which Batman falls afoul of the law), all as what equipment they need is their own (one reason why superheroes are so often independently wealthy, as Batman or Iron Man is).
Moreover, when we see superheroes "team up," teaming up is exactly what they do--these individuals cooperating in a joint effort rather than relinquishing their identities to be good "organization" men and women. (Iron Man is first and foremost Iron Man, and never "just" an Avenger.)
Of course, much else tends to go with this. The public persona is often a way of concealing a private identity, and at least attempting to protect a private life. And of course there is apt to be the unusual origin story--for superhuman abilities, and the decision to put on a costume and fight a private war against crime or some other such evil, are the kinds of things for which most people expect an explanation, which is likely to be extraordinary because of what has to be explained. (Thus is Superman literally from another planet, while Batman has been motivated by childhood trauma and equipped by a lifetime of preparation for his vigilante mission, aided by vast wealth as well as extraordinary talent.) However, those three items seem to me to constitute the indispensable minimum--any character not meeting those three criteria not being a superhero. One may add that the superhero almost always faces a supervillain at some point--because any other sort of villain is a less than worthy adversary. (How long would Superman remain interesting just catching small-timers like purse-snatchers?) Those three criteria, I think, enable us to distinguish between superheroes and non-superheroes of various kinds without bounding the category so narrowly as to deprive it of analytical usefulness, and permit distillation of the definition as the following:
A superhero is a figure who, acting on their individual initiative and resources, and through a distinct public persona apt to entail codename and (usually costume-based) appearance, makes a vocation of defending the public from physical dangers such as accident, crime and "supervillainy," usually in a way requiring physical action and courage on their part, and drawing on abilities and/or equipment endowing them with more than ordinary human physical and/or mental capacities.
'90s Nostalgia-Mining and the Scandals of that Era
Amid the exploitation of memories of the '90s by the pop culture industry these past many years some have seized on the scandals of the era for material. Thus did we get a feature film about the Tonya Harding scandal, and FX miniseries about the Bill Clinton impeachment and the O.J. Simpson trial.
I have no idea how much of a public response they really drew. What I can say is that these scandals absolutely do not make me nostalgic for the '90s. Quite the contrary: unlike, for example, 16-bit-era console gaming, the sci-fi shows of the era, the golden age of The Simpsons, or Baywatch, or any number of other things which really do make me feel nostalgic, what they bring to mind is the horror and disgust I felt when I looked away from the day's more amusing pop cultural products at the state of the real world, and the news media that brought it to us, the vileness of which played its part in the world's going from one catastrophe to the next.
Hiram Lee on O.J. Simpson
In the wake of the media response to O.J. Simpson's arrest back in 2007 Hiram Lee published a piece titled "The Media's Obsession with O.J. Simpson" (emphasis added).
As the title of Lee's item indicates the piece was about how the media, not the public, was obsessed with Simpson, even as with extreme stupidity and sanctimoniousness the media's talking heads relentlessly insisted that it was the public that was obsessed, and that the public had forced the media against its will to attend to the matter to the neglect of all the rest of what was happening in the world. (As Lee wrote, the "anchors and pundits . . . occasionally pose the question: 'Why are we so interested in O.J. Simpson?' . . . lament the drawn out and salacious" coverage, and "[t]hen, with feigned regret . . . return to the tawdry story at hand.")
As Lee remarked, those in the media who did so "attribute[d] their own shameful behavior to the supposed demands of a coarsened, celebrity-obsessed audience," while totally eliding "their own role in cultivating and directing such attitudes toward celebrity culture."
An obscenity in 1995, their behavior was more obscene still in 2007. Contrary to the celebration of the era as one of peace and prosperity, the actual '90s were very troubled years--and anyone of even slight intelligence knew it. (Indeed, as one of Mr. Lee's colleagues put it, "Despite the official triumphalism, America was coming apart at the seams.") However, the situation in 2007 was graver still amid the war in Iraq, and the first signs of a historic financial crisis, in the shadow of both of which we have lived ever since.
Talking about O.J. was a way of diverting public attention away from that, to say nothing of more broadly stultifying the public mind--and if some of the public went along with it that did not change the fact that it was the media, not the public, driving this particular piece of idiocy.
Remember that as the remembrances of the trial in which the media is now awash speak of us all having been "captivated" by the trial.
Saturday, April 20, 2024
Harry Turtledove's Ruled Britannia: A Few Thoughts
For quite a number of years I was an avid reader of Harry Turtledove's alternate histories--running out and grabbing the latest installment in his Timeline-191 series when it became available year in, year out. I enjoyed his rigorous working out of his scenarios, which, in contrast with the flimsy "What ifs?" on which so many of his colleagues have relied, made the basis of his story a genuinely compelling counterfactual. I also enjoyed the "big picture" emphasis of his narratives, presented through his large but still manageable and strategically arranged casts of viewpoint characters, and the briskness of those narratives, which seemed to have been worked out almost mathematically, but effectually (Turtledove cutting among twenty viewpoint characters, each of whom got six four-page scenes by the end of the volume).
Turtledove's Ruled Britannia was a very differently structured book, less oriented to the big picture, and more narrowly focused on a mere two characters rather than twenty--in this case, William Shakespeare and the Spanish Golden Age playwright Lope de Vega, brought face to face by a successful Spanish conquest of Britain in 1588. That conquest has made de Vega a soldier of the occupying Spanish army who, because of his predilection for the theater, is tasked with keeping an eye on a Shakespeare who has been drawn into a plot by old Elizabethan loyalists to stage a rebellion that will drive the Spanish out. The comparatively original premise (neither Civil War nor World War II!) intrigued me, as did the promise to depict these actual historical figures in this altered timeline, an approach which has always appealed to me. (Of the Timeline-191 novels my favorite was and remains How Few Remain, precisely because of its stress on actual personages.) Still, I wondered if Turtledove would manage to hold my interest with this approach through the nearly five hundred page narrative.
For my part I thought the book longer than it had to be, and I could have done without a good many bits. (I think we saw more of de Vega the self-satisfied womanizer than we needed to, for instance, and did not care for the use made of Christopher Marlowe either, or for that matter the final confrontation between these two figures, which seemed to me entirely pointless.) Still, in spite of the unnecessary or bothersome patches Turtledove pleasantly surprised me by carrying this more focused narrative. It helped not only that Turtledove displayed some adroitness in developing the cat and mouse game between Shakespeare and the Spanish occupiers, but that he presents both Shakespeare and de Vega as human beings rather than pedestal-placed literary titans--Shakespeare in particular a man whose talent may have marked him out for greater things but in the here and now a jobbing actor and writer trying to make a living and thrust less than willingly into high intrigue. Particularly commendable was Turtledove's not shying away from the difficult task his own plot presented him--taking seriously Shakespeare's enlistment to produce a propaganda play in aid of the rebellion (about the life of Iceni queen Boudicca), and letting us see just enough of it dramatized at the climax to give the project on which the whole plot rests some solidity. The closing lines of the play ("No epilogue here, unless you make it/If you want freedom go and take it") struck me as a bit more Brecht than Bard, but on the whole the pastiche worked, with that close entirely logical in the circumstances, and all this to the good of a climax, and denouement, that drew all the narrative strands together in very satisfying fashion.
On Cryptohistory, the Paranormal and William Shakespeare
One of the oddities of contemporary culture is how persons who ordinarily find history dull suddenly become attentive when someone mentions "aliens"--in the sense of extraterrestrials having been a part of it.
The explanation of this seems to me to be that those persons' interest is still not in history, but in the possibility that the claims for the existence of extraterrestrials visiting Earth have been validated.
But that raises the question of why precisely they should care to prove that such visitations have happened. Why should so many people be invested in this?
About that I am not at all clear, but I have noticed a similar interest in much more down-to-Earth, less world-shaking, subjects, such as the possibility of the authorship of William Shakespeare's plays by someone other than William Shakespeare. It seems that, just as with history, people who are not normally interested in Shakespeare, and not knowledgeable about Shakespeare, get interested when, for example, Edward de Vere or Francis Bacon or Walter Raleigh or somesuch is mentioned as the real author of his works. The result is that if some are quite invested in the controversy and make detailed claims on the basis of Shakespeare's plays (all the evidence is indirect in nature), a significant number of people far from capable of making or appreciating such arguments still find interest in the essential claim--which comes down to people being fascinated by the thought that productions they have not read or seen or cared about, and that have been attributed to one historical figure of whom they know next to nothing, might actually be attributable to another figure of whom they likely know even less.
Considering both obsessions it can seem symbolic that Roland Emmerich, who made the aliens-visited-Earth-in-the-past movies Stargate (1994) and Independence Day (1996) later directed a movie dramatizing Edward de Vere's writing the plays and using Shakespeare as a front, Anonymous (2011). Didn't see it? That's okay, pretty much no one did, and it would not seem that you or they or anyone else missed much in not doing so--certainly to go by David Walsh's take on the film some years ago (which, as may be expected of those who have read Walsh at his best, is most certainly attentive toward and insightful into the pseudo-controversy over the authorship of Shakespeare's works, which he went so far as to follow up in a second item on the matter).
"All Art is Propaganda," Somebody Said, Somewhere
Looking back it seems that many an early twentieth century literary great claimed that "All art is propaganda." George Orwell seems the one most associated with the phrase, which he states in his 1940 essay on Charles Dickens. However, Upton Sinclair said exactly the same thing in exactly those words in his epic 1925 history of art, Mammonart, with which Orwell might have been familiar given his praise of Sinclair's work.
That far more people seem to associate the phrase with Orwell than with Sinclair, I suppose, reflects the greater respectability of the former than the latter these days. Like Orwell Sinclair shifted away from his earlier political stances, but Sinclair's greatest work, fiction and nonfiction, is generally associated with his time as a committed socialist, whereas Orwell is best known for, and celebrated for, producing a work that, in spite of his much more complex attitudes and intentions, came to be regarded as the supreme piece of Cold Warrior literature, overshadowing all the rest of his work--which has been all the better for his memory given the prejudices of the tastemakers.
Jane Austen and Brazilian Jiu-Jitsu
As I remarked once, for me the most worthwhile passage in Jane Austen's Pride and Prejudice is the satirical dialogue about the idea of the "accomplished young lady."
Such accomplishment--if one goes by Thorstein Veblen's theory of the leisure class--is merely the leisured showing off their greater resources, and the supposed greater prowess which affords them that leisure, by having their young ladies display in a supposedly masterful fashion costly and time-consuming training in skills of no practical value in their lives for the sake of unsubtly conveying the message that "This is why we are here and you are there, plebs!"
Just as much as ever we are today bombarded with claims of such upper-class "accomplishment" (not least, by way of pop culture). Still, the precise skills in which today's young lady, and gentleman, are expected to display "accomplishment" are different, with one example the martial arts.
Lest this need be said I do not at all deny the validity of training in those arts for purposes of exercise, sport, self-defense and much else. However, if we are to be honest the reason we are so often (falsely and stupidly) given the impression that "everyone" is a black belt holder of some kind (preferably something fashionable at the moment) is because in the United States such training has acquired an association with upper classness. This is because the sustained commitment of significant disposable time and money required to complete such a training is likely to be beyond the reach of working persons and their children, with the association reinforced by the way that unthinking conformists imitate what better-off people do--and of course, many, many, many more lie about doing so--the more in as the persons in each and every one of these categories boast about their "accomplishment" ad nauseam, such that one is often left wondering if anyone has ever worn a belt of any color other than black. One sees just how much fakery is going on when hack thriller writers get away with having their heroes fight off attackers with a "judo kick"--so easy a mistake to spot, and one whose complete failure to elicit comment is just one of those little proofs that, just as in Ms. Austen's time, the image of "accomplishment" is, much, much, much more often than not, a façade as ridiculous as it is tiresome.
In What Does Middle Classness Consist?
We hear about middle classness all the time, usually from commentators desirous of dissolving the reality of inequality in an image of generalized middleness. One approach to such dissolution is judging the matter not on the basis of material criteria actually having to do with the terms on which people live, but instead favoring educational levels, or the "values" they purport to espouse, and even self-identification (i.e. "If you think you're middle class, then you are!"). When they do acknowledge material criteria they often equate middle classness not with the income requirements permitting life at a middle class standard demarcated in some fashion (never mind the standard most seem to actually identify with middle classness), but the middle of the range of the income distribution, a very different thing that generally constitutes a far lower bar (depending on the context, one can be mid-income while being much less than middle class in any meaningful sense), in yet another shabby evasion of the sharp edges of social reality.
I tried to do better than that in my working papers on the subject. You can find these here.
The Obscurantism of "Genius"
I recall encountering a lengthy discussion of the concept of "genius" in a popular news magazine a long time ago (TIME, perhaps).
The author of the piece chalked up the desire to believe in "genius" to a romantic desire for transcendence.
One may well grant an element of that existing within the contemporary cult of genius, but it seems to me far from being the whole of it. More important, I think, is the contradiction between the complexity and scale of modern life, and the prevailing individualistic intellectual and emotional attitudes.
Consider, for instance, scientific life. The scientific endeavor is old, vast, collective in spirit--sociologist Robert Merton, indeed, seeing the collectivist attitude toward scientific knowledge as in fact one of the key elements of the scientific ethos. Scientists build on the work of predecessors in conjunction with their colleagues, such that the individual efforts all flow into a common stream so much larger than any of them that it can seem foolishness to worry too much about whether this or that drop came from that particular tributary of the great river--with the depth and intricacy of the collective, collaborative, aspect getting only the more conspicuous to go by the sheer number of authors on single scientific papers today (a function of just how painfully specialized the work has become). However, a culture accustomed to individualism, indeed vehement about explaining results in terms of individual achievement, individual choice, individual contribution, is more likely to stress single, towering figures who, because so much more is credited to them than any one person ever actually did, or for that matter probably could have done (especially insofar as all the others who helped lead up to them fall by the wayside), can only seem superhuman, magical, in a word, "geniuses." Thus in a common view physics had that Isaac Newton guy who did it all pretty much by himself--and never mind those giants on whose shoulders he supposedly stood. And then not much happened until that Albert Einstein guy, who singlehandedly vaulted us into the relativistic era with a few papers. And so forth.
The tendency obscures rather than illuminates--which is plausibly just fine with those who prefer to see humanity as consisting of a tiny elite of superhumans who accomplish everything and a vast mass of dross who owe that elite everything and should accordingly be groveling before them in the dirt lest Atlas decide they are not worth the trouble, and shrug. Naturally the word is bandied about much in our time--and never more than in the case of persons who, one way or another, seem to amass a lot of money (in a reminder of what, infinitely more than knowledge, is really valued in this society).
Thus was Ken Lay a genius. And Jeffrey Epstein. And Sam Bankman-Fried. And Elizabeth Holmes. And many, many others just like them. In the haste to acclaim such persons the real reasons for the desperate attachment of persons of conventional mind to the concept become all too apparent.
Tom Clancy's Debt of Honor and Japanese Work Culture
Like the other major '90s-era American thriller writers who took up the then highly topical theme of Japanese-American relations Tom Clancy presumed to "explain" Japan to his American audience in Debt of Honor, retailing the clichés of the "experts," though perhaps less admiringly than others. In his account Japan's culture "demanded much of its citizens" in a way supposedly alien to even the Japanese-American character (Chet Nomura) through whose eyes Clancy showed his readers the country, with "[t]he boss . . . always right," the "good employee" the one "who did as he was told," and getting ahead requiring one to "kiss a lot of ass" and go the extra mile in regard to the display of loyalty to the employer ("sing the company song," come in "an hour early to show how sincere you were").
As is often the case with those who make much of "difference," all this was really much less "different" than Clancy made it seem--in America, too, the rules of getting ahead not so different. No more in America than anywhere else do bosses take unkindly to being wrong, and employees who do not do what they are told. And no less in America than anywhere else do "ass-kissers" get ahead, and those who refuse to play the game suffer for fighting to retain their dignity in the inherently degrading situation that is the personal subordination seen in the modern workplace.
Indeed, those familiar with what it takes to enter the more rarefied territories within the professions (like completing a prestigious medical residency), or the more prestigious firms of "Greed is good"-singing Wall Street and "Move Fast and Break Things" Silicon Valley--where many an employer is an insufferably smug idiot bully who considers themselves entitled to put anyone desirous of a position through sheer hell for the privilege of making money for them, or simply put having been an intern for their firm on their resume--will be less impressed than Clancy was by what he thought he knew about how different Japan was with regard to its exploitativeness and destructiveness of the minds and bodies of the ambitious "go-getter" that everyone is expected to take for their ideal.
Compounding the irony, this is generally to the approval of those who see things Clancy's way.
Are Bloggers Wasting Their Time Tinkering with Their Blogs in the Hopes of Getting More Readers?
I suspect at this stage of things that most of those who started blogging have been less than thrilled with their results--certainly to go by the vast number of bloggers who quit the endeavor, often after just a little while. They were given the impression that what goes online is necessarily seen by vast numbers of people, that they would not be crazy to expect the things they put up to "go viral," that they could end up web celebrities.
Instead they mostly found themselves ignored--and when not ignored often insulted. Indeed, many may have been disappointed to find that rather than persistence paying off they have got less and less result with time--finding the search engines less likely to index their posts than before, seeing less and less evidence of actual humans reading their blog by leaving a comment or posting a link, as they suspect that what page views they get are spam, bots, and the rest.
There is no great mystery in this. The reality is that online the ratio of people looking for attention is extremely high relative to those ready to pay attention. The reality is that the great majority of Internet usage is passive, with any kind of engagement a rarity (very, very few of those who do find anything likely to share it with others). The reality is that, where the blogger especially is concerned, the Internet has been moving away from full-bodied text toward shorter-form content (like microblogging) and audiovisual experience (the vlogger rather than even the microblogger). And as if all this did not suffice to mean that things were going from bad to worse the game is entirely rigged at every level in favor of big names, big money, those with legacy media connections, and the other advantages.
Alas, these obvious realities are something few are ready to admit--instead what prevails a stupid meritocratic-aspirationalist view that treats every outcome as the result of a tough but fair test of individual talent and effort. And even those who can see through that stupidity often do not simply shrug their shoulders, but strive instead for a better outcome. One way is by trying to get search engines to treat them more favorably (indexing more of their items, ranking them more highly in search results, etc.), the more in as many a would-be advice-giver claims to know what will do the trick.
Should they post more frequently, or less frequently? Should they go for longer posts, or shorter ones? Are there too many links in their posts, or not enough? Is it a mistake to have so many internal links--for instance, posts that provide handy collections of links to past posts? Should they be weeding out old posts that perhaps did not get looked at as much as others? And so on and so forth. They will hear a lot of advice (typically vague yet completely contradictory advice, supported by no evidence whatsoever)--and if acting on it, spend a lot of time on things that have nothing to do with actual blogging.
My experience is that all that hassle will not accomplish much for most of them, for many reasons. My personal suspicion is that the infrequency with which search engines "crawl" your blog means that it will be a long time before you get any positive result, even if you really do achieve one, which seems to me doubtful. I suspect that the search engines are simply overwhelmed by the amount of content out there, and the efforts to manipulate it--to the extent that they are not catering to the sponsors, demands for censorship, etc. that doubtless tie up much in knots even beyond the direct effect of such corruption of search results (and that half-baked experiments with artificial intelligence are likely lousing things up further). The result is that they probably ought to spare themselves the trouble--and, if they really think their blogging has any meaning at all, get on with that instead.
Never Underestimate the Suckerdom of Bosses
A while back Cory Doctorow rather nicely summed up what I suspect is (contrary to the past decade's hype kickstarted by the Frey-Osborne study, and the hype that followed OpenAI's GPT-3.5 too) the real state of progress in applying artificial intelligence to the vast majority of tasks workers actually perform: "we're nowhere near a place where bots can steal your job," but "we're certainly at the point where your boss can be suckered into firing you and replacing you with a bot that fails at doing your job."
For the worker who gets fired, alas, that the bot they were replaced with fails to do their job is cold comfort. They remain fired, after all--and the odds of their getting a phone call asking them to come back (never mind come back and get compensated for what they were put through) seem to me very slim indeed.
Such is the reality of the relationship between power and responsibility, above all in the Market.
The Unusability of the Internet and the Taming of the Cyber-Space Frontier
Back in the 1990s there was a good deal of cyber-utopianism. I think of most of it as having been hype for the web and the tech sector, and for Silicon Valley, as well as broader "market populist" justifications for the neoliberal age by way of their fantasies about the information age--but all the same the idea was there that those who had been shut out, marginalized, denied a chance to speak might have a chance to make themselves heard here.
I suspect that most of those who acted on such premises eventually discovered the cruel lie for themselves, that the Internet was less radical in these ways than advertised, and became less and less so as time went on. That it is a medium whose usership is essentially passive, and carefully gatekept, so that Big Money, legacy media help and well-resourced supporters make all the difference between whether one's message is noticed or not, all to the advantage of the rich and powerful, and disadvantage of the poor and weak. Now with the Internet the mess that it is one may imagine that ever more than before the only way to get a message out through it is to put ever more resources behind it--as those who cannot, whatever they have to say, are seen and heard by virtually no one but themselves.
The End of the Cyber-Frontier
Back in the 1990s the cyber-utopians, reflecting their ideological background, seemed inclined to portray the Internet as a new frontier for the taking. (Indeed, Thomas Friedman, in the midst of lionizing Enron for its business model that he was sure was a perfect symbol of how the twenty-first century would belong to America, exalted its activity as part of a great "cyberspace land grab.")
Today there is no sense of anything like that online, the cyber-frontier long since closed. Just as with the actual frontier in American history (rather than the version of it the mythmakers continue to promulgate to this day), the result has been less a mass enfranchisement of the many than the further enrichment of an already rich few. We are reminded of this by how thoroughly a handful of giant firms--the same ones for rather a long time now, regardless of how Jim Cramer rearranges the letters in his acronyms--have turned what (falsely) seemed an unclaimed continent ripe for the taking into their private domains, and ever more fully so as those small freeholders who had made a place for themselves find their situation less and less tenable.
Nostalgia as Protest--and its Limitations
I suppose that where nostalgia is concerned we can speak of "pull" and "push." We may be pulled toward some aspect of the past by its apparent intrinsic fascination--but we may also be pushed away from the present by its repulsions, and this is exactly why some frown upon nostalgia as they do.
Preferring the past to the present has historically been the preserve of the conservative and especially the reactionary, who really do want to bring back something of the past. But in a reactionary era the liberal might find comfort in looking backward. So have many liberals, faced with the '80s and all that came after, been nostalgic for the '60s.
It is a more awkward fit in their case. Those who believe in progress, if sincere about their belief, should expect that the best days lie ahead rather than behind them. Thinking otherwise reflects a profound demoralization on their part that has played a far from inconsiderable role in moving the world further and further away from what they said they wanted it to be--while this nostalgia for the past, alas, has been far from the only expression of that sad state.
Of "Optimism"
"Optimism" has always struck me as a slippery word whose usage often betrays a lightness of mind. To be optimistic, after all, always means being optimistic in relation to a particular thing--and unavoidably, being pessimistic about another thing. (I am optimistic about how "our team" will do, which means I am pessimistic about how "their team" will do when we go up against them in the competition, for instance.)
So which do we focus on, the optimism or the pessimism?
As is the case with so much else in life, Authority tends to decide what counts as "optimistic," and of course they identify it with unquestioning faith in the status quo and unswerving conformism, pessimism with the opposite attitude--even though optimism about what exists tends to go with pessimism about much else. (Thus the classical conservative is a believer in the wisdom of traditional social arrangements, and optimistic about adherence to them--but pessimistic about human nature, reason, the prospects for a more just social order. Whether they are optimistic or pessimistic, again, depends on who is doing the talking.)
Confusing matters further, in doing the above those in charge often promote what they like to call "optimism" as a broad mental attitude and a positive character trait--and treat the lack of such optimism, pessimism, as equally a broad mental attitude, and a negative character trait of which one ought to be ashamed.
As if the way in which optimism and pessimism tend to be bound up, and one or the other emphasized in a shabbily manipulative way, were not enough, it is worth noting another great ambiguity within the term and its usage. There is, on the one hand, the optimism of the person who looks at a problem and says "I see a problem, but I think I can fix it." At the same time there is the person whose response is, instead, "I see a problem but I'm sure it will all work itself out somehow."
The former is the optimism of the problem-solver ready to work for a solution. The latter is the optimism of the person who thinks that they do not have to solve a problem--a response that can be genuine, but often bespeaks complacency, irresponsibility, callousness, in which case optimism is nothing but a lazy or cowardly dodge, with this still more blatant in the case of those who see a problem and then deny that they see any problem at all. These days, this, too, accounts for a very large part of what is said and done under the banner of "optimism."
Naturally we ought to be a lot more careful about how we use the word--and how we respond when others use it, not being intimidated or ashamed when they chastise us for failing in "optimism" according to whatever standard they presume to apply to us for their own convenience.
The Supposed End of Work, and the Drivel Spoken About It
A few years back, when the hype about automation anticipated more than glorified autocompletes, fashionable "thinkers" proclaimed the imminence of radical changes in day-to-day living. Completely misunderstanding the 2013 Frey-Osborne study of automation, they thought we were looking at "the end of work" within a few years.
Thus did we have a little, mostly unserious, talk about how society could adapt to half of the work force being put out of a job--such as what the relation of work to income would then be.
Amid it all, of course, were those who assured everyone that work would still be part of people's lives--by which they usually seemed to mean alienated labor done in the market for an employer for wages, a presumably eternal arrangement whose psychological and "spiritual" value they insisted was salutary, claims the media generally treated with the utmost respect.
Of course, few had anything to say about the fact that the media was usually quoting elite commentators whose ideas about what "work" is were remote from the lives of 99 percent of the public, whose experience and thoughts were, as usual, of no account. Many of these people--those for whom "work" has never ceased to mean the hell on Earth that we see in Émile Zola's Germinal, Upton Sinclair's The Jungle, or Takiji Kobayashi's Kanikosen, and the many more for whom work means far less but still considerable suffering--would disagree with those "tech" company vice-presidents who think that sipping a cappuccino on the G6 as they fly home from Davos is "hard work." They would dispute that work is the uplifting and necessary thing the commentators say we cannot do without, instead hopefully looking ahead to the possibility of a different world.
The Beginning of the End of the Cult of the Good School?
By "cult of the Good School" I refer to the profoundly irrational hierarchy that exists among institutions of higher learning in the minds of the conventional wisdom-abiding.
In speaking of that hierarchy as irrational I am not denying that schools differ with regard to the selectivity of their student intake, and the resources available to them for the fulfillment of their educational mission; or denying that that selectivity and those resources may make a difference in the quality of the education and/or training they impart to their graduates. Nor do I deny that these practical differences--or for that matter, the less rational reliance on institutional "reputation"--may make a difference to employers, with implications for their graduates' outcomes with regard to employment and income.
However, the truth is that very few making distinctions among those schools actually know anything about how the schools in question stack up against each other in even the more measurable, practical ways (for instance, the average quality of undergraduate instruction), or about exactly what difference the school a student graduated from makes in the job market. Rather they just "know" that one school is "better" than others, with most carrying around an image of various strata in their minds--the Ivy League "better" than the rest, and within that stratum Harvard better than Yale, all as private schools generally come in ahead of public schools, etcetera, etcetera, albeit with many exceptions all along the line. (Much as many desire to park their car in Harvard Yard, the science major will not be looked down upon for choosing the Massachusetts Institute of Technology instead--while Berkeley, UCLA, and Chapel Hill, among others, in spite of being state institutions, are quite well respected indeed, more so than most private institutions.)
Those who think this way imagine that any young person of intelligence and ambition ought to aim as high as they can in the hierarchy--and if they do not, their elders, and conforming peers, think there is something wrong with them. Should they question those elders' and peers' presumption they will not get an intelligent answer--this being "just what people do."
The insistence on all this ultimately comes down to simple-minded snobbery, especially insofar as the data that even the more rigorous sift tend to be flawed and even corrupt. Meanwhile a good deal of longstanding statistical data, confirmed again and again by more recent studies, calls into question the idea of some great wage premium deriving from attendance at the elite institutions, or even any inherent employability advantage. The "power elite" may disproportionately be associated with such colleges, but, even setting aside the fact that the advantages most of them had did not begin or end with the college they attended (where they are, after all, less likely to be going because they were exceptionally bright than because they were exceptionally privileged), the average is a different matter. Stupid writers for trash publications may make claims such as that one can major in somersaults at Harvard and still get a "good job," but as was reconfirmed by a recent study, your major is probably going to be a bigger determinant of whether you get a job in your line, and what it will pay, than the school to which you went.
However, this foolishness is now up against an era of tighter means, diminished prospects, and greater skepticism about the value of a college education generally, and of particular avenues to getting one--with many, weighing the actual dollars-and-cents return on their investment in a college degree, finding that they will very probably be better off commuting to the nearest campus of their state's university system (maybe after racking up as many credits as they can at the still closer campus of their community college) than selling themselves into lifetime debt slavery for "the college experience" of which so many college presidents at Brand Name Schools pompously speak. Of course, it is worth acknowledging that the people whose ideas are the mainstream ideas are the ones most remote from the situation of tightened means and diminished prospects, and have absolutely zero respect for the views of the majority. Still, the public's changing attitude seems likely to increasingly expose the crass stupidities of the Cult of the Good School for exactly what they are over time.