In Mammonart Upton Sinclair acknowledges the hard fact that for the artist "the path to honor and success in the arts has been through the service and glorification of the ruling classes." Those classes desire that their persons and deeds should be flattered, their prejudices and lifeways affirmed, their sense of self-importance and their fantasies indulged--and "their subjects and slaves" taught "to stand in awe of them." Those who did so most satisfactorily could be very well rewarded indeed, not only in life but in death, and for the ages. It is no accident that a Homer, a Shakespeare, a Racine--ruling-class artists all in Sinclair's view--were acclaimed the greatest of their era and nation, and continue to enjoy that status today.
I suppose this is in part because of the conservatism of literary critics, who tend to act as the priests of literature, uncritically accepting the judgments of the past to such a degree that they can at times seem not even to bother having real opinions of their own. Nor would they find acceptance of those judgments handed down to them a stretch, given that the tastes of the comfortable have not changed so very much over the millennia--the upper-class Briton of Victorian times still able to thrill to Homer as the poet's own listeners once had. (Thus did ancient Greek robber-barons have poems composed "to glorify the ancestors of powerful chieftains and fighting men, and inculcate the spirit of obedience and martial pride in the new generations," and a William Gladstone still be moved.)
Equally those who walked a different path suffered for it--often in death as in life--with Sinclair making a pointed contrast between the treatment of a Shakespeare and a Shelley, a Racine and a Molière.
In handling the conceptions of the "ruling-class artist" and "hero artist" Sinclair displays some nuance--reflecting the reality that over a long career many a writer has been a mix of the two, and that many who began as one thing ended up another, with it all too sadly predictable that the more common pattern has been for a hero artist, or someone who might at least have become one, to become a ruling-class artist (as he recounts cases from Wordsworth to Wagner).
Still, if it is those who pandered to the powerful who have commanded respectability thousands of years after their passing, it has also been the case that those who look at them with their own eyes and make their own judgments are often less admiring than they are told they should be. In the aristocratic heroes of the epics and tragedies--heroes giving in to "unbridled desires," of whom they are told they ought to speak only in superlative terms--they may, as Sinclair did again and again, see nothing but "spoiled children, flattered by servants and fawned upon by slaves," who were ultimately raised to be "psychopaths," and the art that portrayed them not at all "sublime" but rather "a bore." Those hero artists the powerful treated with such disdain, however, often manage to speak to us across time, space and culture in a way that surprises us.
Of course, it has been a long time since many read Sinclair--and few of those have been in positions of any influence. And so the canons of a century ago endure as the canons of today, as those of us required to read for school, or who still read at all of our own volition, read the stories of psychopaths, very likely find them a bore, but, to retain the good opinion of others, claim to have found them sublime.
Tuesday, May 28, 2024
Sunday, May 26, 2024
Reading Ray Kurzweil for a Quarter of a Century: Some Thoughts
I first took an interest in Raymond Kurzweil's writing about the Singularity back when he first published The Age of Spiritual Machines (1999). All that sort of thing was a lot more novel then, and I was really intrigued by it--the more in as Kurzweil, unlike so many futurologists, presented long lists of specific forecasts precisely dated for 2009, 2019 and after.
Those forecasts for dates that were not too far away in particular had me thinking "Let's see what happens with all this," and so as I took in news about technological developments I kept an eye open for signs that the advances he described--in the neural net-based pattern recognition that seemed most central to the progress in artificial intelligence, the mass-production and mass-scale usage of carbon nanotubes that would power Moore's Law into a post-Silicon age, the construction of nano-scale machinery, etc.--were actually happening.
A few years on, I saw no evidence of anything remotely like the pace of progress he predicted. The Segway scooter, instead, was as good as that seemed to get. Kurzweil's The Singularity is Near (2005), which seemed to me to elaborate his earlier theorizing rather than really break new ground, let alone demonstrate dramatic progress since the time of his last book, did not convince me otherwise. Nor did anything else I saw in the years that followed, including the smart phones that, for all their wonders, refined well-established technologies rather than representing the breakthroughs he talked about. And indeed, when 2009 rolled around, ten years after the appearance of Spiritual Machines, I published a systematic evaluation of his technological forecasts for that year in which I concluded that he had consistently been far off the mark in those areas where it counted most.
Indeed, watching so long and seeing so little I was getting fairly skeptical of promises for radical developments just around the corner. In fact I paid a lot less attention to technological forecasting afterward--in part because the writing about it I saw gave me less reason to do so, the writers seemingly saying the same things over and over again rather than presenting any new ideas, or new arguments for them, let alone presenting evidence for momentous things happening.
Of course, around the mid-2010s there was a new wave of techno-hype, the more in as progress in neural nets and machine learning was gaining steam, all as we were told a great wave of automation was very likely to soon sweep through the economy--with the supposedly imminent automation of our vehicles merely the beginning. At its height I suggested, optimistically, that perhaps we were running a decade or so behind Kurzweil's forecasts.
Alas, it was just another bubble of techno-hype that went bust with almost nothing to show for it before the end of the decade, as the pandemic reminded us all how un-automated our economy really was, and seemed likely to remain for a good long while to come--while we ran a lot more than a decade behind Kurzweil's forecasts, our 2019 less advanced than his 2009, giving us additional grounds to doubt that it was worth comparing our trajectory with his timelines at all.
Of course, I knew that techno-hype is a cyclical thing and would resurge again, but it happened sooner than I would have guessed, while frankly being prompted by rather less than I thought it would take to accomplish that--OpenAI and its chatbots. That excitement remains very much with us, fed by further advances in chatbots, and promised wonders such as what we have seen of Sora, while the price of the NVIDIA corporation's stock soared past the $1,000 mark because of its standing as a maker of AI chips, all as Wall Street generally seems to be behaving not as if the Singularity is near, but as if the Singularity is here (with the apocalyptic panic coming from some quarters only the other side of the expectation that momentous things were happening). The result is an especially opportune climate for Kurzweil's long-promised and repeatedly delayed follow-up to The Singularity is Near--The Singularity is Nearer--to appear before the world. As yet only the earliest reviews have appeared, as those of us who did not get early copies must wait another month before having the chance to check out Kurzweil's book for ourselves.
Will Kurzweil persuade those of us who grew skeptical of him over this uninspiring past quarter of a century to pay attention to his predictions again?
I suppose we will find that out soon enough.
Segway Scooter Memories: Recalling the Innovation That Didn't Change the World
Those old enough to remember the '90s are likely to recall it as a time of tech boom that the most bullish thought would be effectively permanent (Dow Jones 36,000! Dow Jones 40,000! Dow Jones 100,000!)--a boom that proved a bubble just after the turn of the century. What they are less likely to recall is that after the famed dot-com crash much of the business community and "intelligentsia" went on, on some level, thinking, or at least hoping, that the crash was just a minor correction and that the boom would get going again in short order (as seen when the business press went nuts over a spasm of growth in late 2003).
If what the investors and those presuming to advise them had on their minds was asset values and profits, there was also the question of where those values and profits would come from--technological progress, which we were led to expect would explode in these years.
It was in this atmosphere that Dean Kamen's development of some new innovation that would "change the world" became a "hot story"--and I remember how in late 2001 we were told what it actually was.
I was underwhelmed by what we actually saw, a funny-looking scooter. So were most others, I think. And if some have not wholly let go of the idea (back in 2018 CNN published a piece telling us that while the Segway hasn't changed the world, it yet might), it seems that most of us have forgotten the episode, or at least brushed it off as being of no importance.
However, to me the Segway scooter acquired an increasing symbolic significance over the years--of the gap between the reality of and the hype about the pace and force of technological innovation in our time. Even after we had reconciled ourselves to the remoteness of the "flying car" future, the Kurzweils of the media-business world breathlessly talked up the imminence of a grand new "molecular" age of intelligent machines, of atom-by-atom construction, of revolutionary new materials promising solutions and wonders far, far more radical. Instead the Segway was what we got then--and looking back from 2024 it still seems to me that it remains as good a symbol as any for the reality of twenty-first century INNOVATION!
Thursday, May 16, 2024
Are We More Annoyed by Celebrities Than We Used to Be?
My answer to this post's titular question is an emphatic "Yes," and I think there are three very good reasons for it.
1. The Worsening of Overexposure.
A celebrity's publicity machine getting so aggressive that rather than interesting us it irritates us is not a new thing. But it is probably worse in an age of truly continuous subjection to media that devices like the smart phone have helped usher in--the more in as that publicity is amplified by the disease of the Internet that is clickbait. Amplified, too, by the way that celebrities personally contribute to this with their use of social media, making their idiocies tiresomely public as they take the adage "Better to remain silent and be thought a fool than to open one's mouth and remove all doubt" and do the extreme opposite, alienating at least part of the public in the process again and again.
2. The Fragmentation of Popular Culture.
It has become a truism that less than ever before do we all seem to be watching the same movies and television shows, listening to the same music, etcetera, etcetera, etcetera. The result is that, especially to the extent that celebrity is associated with the entertainment world, we are probably more likely to be personally attentive to a smaller portion of the range of figures working in it, with the result that when we are hit by a piece of celebrity news we are more likely to say "Who is that?" And even after we learn who that is, not care--but still find ourselves constantly hearing about them so that we ask "Why am I hearing about this person I don't care about all the time?"--which is undeniably irritating.
3. Culture War.
As if points 1 and 2 were not bad enough, the world of entertainment, like everything else in American life at least, has been swallowed up by the culture wars. Along with it, so has celebrity culture, with stances, identifications and associations playing their part in it--meaning that just about everyone can offend somebody just by existing, and often what we are hearing about is much more than their existing, as the "journalists" of the entertainment press, analyzing the implications of every word and gesture, every trivial detail of dress, demeanor and everything else, make of everything a statement, and every statement a battle. Whatever side of that battle we are on we are likely to find ourselves constantly annoyed--even if that side is no side at all, especially insofar as those who take no side detest the very existence of the culture war, because everyone else seems obsessed with it and intent on rubbing it in their faces all the time.
Wednesday, May 15, 2024
What Does it Really Mean When They Say That the Global Box Office Rose 31 Percent in 2023?
Discussing the post-pandemic box office as a whole I have focused on the U.S. box office, because of the abundance of detailed time series relevant to it, and my long familiarity with them, as well as the wealth of conveniently available analysis aiding its interpretation.
That does not exist for the global box office, a larger, more complex, less well-covered topic.
The result was that the Deadline report that the global box office saw a 31 percent jump in 2023 over 2022 got my attention. Was it possible that the U.S. was an anomaly, that there was a stronger recovery abroad?
Alas, on close inspection the biggest chunk of that surge is the recovery of the Chinese box office from its extremely depressed state in 2022--an 83 percent surge in that one market accounting for roughly a third of the global increase. Cut China out of the picture and you get only a 20 percent rise in the rest of the world. That still sounds pretty impressive--except that this was exactly what we had in the U.S., where the jump from a bit under $7.4 billion collected in 2022 to $8.9 billion collected in 2023 (that meager margin of improvement relative to the post-pandemic pattern) was itself a gain of 20 percent (indeed, 21 percent), before accounting for the inflation that was not even counted here.
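To make the arithmetic explicit, here is a minimal sketch of the U.S. comparison (figures rounded as reported above and, as in the report, left unadjusted for inflation):

# Rough check of the percentages discussed above (grosses in billions of U.S. dollars, as reported;
# approximate figures, with no inflation adjustment, mirroring the nominal comparison in the report).
us_2022 = 7.4    # "a bit under $7.4 billion" collected in 2022
us_2023 = 8.9    # $8.9 billion collected in 2023
us_gain = (us_2023 - us_2022) / us_2022
print(f"U.S. year-on-year gain: {us_gain:.1%}")    # roughly 20 percent, matching the rest-of-world rise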
The result is that the report seems to me another case of a headline that sounds a lot better than the reality, and only confirms the impression that the box office, and with it the prospects of the studios and their movies, have stabilized at a level significantly below the pre-pandemic one* (while, in combination with what Hollywood's movies actually made in that country, China's outsized part in the "recovery" is a reminder of how little business Hollywood can expect in that market). Of course, that same scarcity of data about the global box office implies that more caution is warranted in any judgments about it, though I do not think that this year is likely to change the assessment.
* The box office tended toward $14 billion in 2023 dollars over 2015-2019.
Book Review: Mammonart: An Essay in Economic Interpretation, by Upton Sinclair
In the second chapter of his book Mammonart: An Essay in Economic Interpretation (1925) Upton Sinclair explains his purpose in writing the book as "to investigate the whole process of art creation, and to place the art function in relation to the sanity, health and progress of mankind." Of course, this raises the question of just what Sinclair actually means by art. His answer is that it is "a representation of life, modified by the personality of the artist, for the purpose of modifying other personalities, inciting them to changes of feeling, belief and action." Art is thus by definition "inevitably and inescapably propaganda"--"sometimes unconsciously, but often deliberately," propaganda.
However, if all art is propaganda it is also the case that not all propaganda is equal from the standpoint of artistic significance, Sinclair taking the view that "great," "real and enduring works of art" offer "propaganda of vitality and importance" according to "the practical experience of mankind," conveyed to the audience "with technical competence in terms of the art selected," with this combination of "propaganda of vitality and importance" with such competence to be expected only when "the artist in the labor of his spirit and . . . stern discipline of hard thinking, find[s] a real path of progress for the race."
From this one may conclude that art can never be about just the perfection of form, purely "escapist" or unconcerned with morality or politics; while Sinclair adds that far from being for only the few "great art has always been popular art," and that far from slavish devotion to tradition and the classics as models "vital artists make their own technique," with "present-day technique . . . far and away superior to the technique of any period."
Yet it is also the case that artists must live, and in line with the realities of the class societies of history this has generally meant the accommodation of the artists and their art to the requirements and desires of the rich and powerful--to those who have "owned" the artists as they have owned everything else--with service to these being what enables an artist to do well in life (rather than do good), with the most "successful" becoming the coddled pets of wealthy and powerful patrons, and often enjoying critical respectability after their time. In contrast with those "ruling-class artists" the "hero" artists, the "martyr" artists, who did good rather than well--not least in "tak[ing] up the cause of the dispossessed and disinherited" rather than flattering their so-called betters--struggled and were even persecuted in life and often marginalized after their time, with all this reflected in the prevailing standards and canons (a triumph of Mammon over all other moral forces, hence his coinage of "Mammonart" here). Indeed, Sinclair makes it clear that he sees "six great lies prevailing in the art world" against which his positions run up, namely the lies of "art for art's sake" (the argument for art as an exercise in form only), "art snobbery" (the elitist art-is-for-the-few viewpoint), "art tradition" (advocacy of slavish classicism), "art dilettantism" (the view of art as pure diversion), the "art pervert" (the denial that art has anything to do with morality) and "vested interest" (the claim that art "excludes propaganda and has nothing to do with freedom and justice"). Challenging all of them in principle and in practice, here Sinclair endeavors "to set up new canons in the arts, overturning many of the standards now accepted," while in the process rescuing real "treasures" from "the scrap-heap" to which Establishment critics have consigned them, and transferring those false treasures now being exalted "to the history shelves of the world's library" where they belong.
After having lucidly explained the most fundamental of his intellectual premises Sinclair then embarks upon a grand survey of art through Western history, proceeding from one figure and their works to the next down to the present time (generally devoting a chapter to each).
It is an exceedingly ambitious undertaking for a single-author work (and at that, one by a non-specialist who had his hands full with a staggering number of other interests and activities over the preceding decades), and it also seems only fair to point out the limits of the result. In surveying Western art, in spite of a handful of glances at the visual arts (Michelangelo, Raphael, and after that, curiously, just Whistler), and music (Beethoven, Wagner), what we get is in the main a survey of literature; with, after some coverage of the Bible and the highlights of Greek and Roman literature, Sinclair skipping over the Middle Ages entirely (no Beowulf or Norse sagas or Song of Roland or Nibelungenlied or Arthurian romances, etc., etc.) to the Renaissance, which rates mere glances before he gets on to the birth of modernity and the centuries since. Doing so he gives us mainly English and French literature with a little attention to a handful of the most prominent Germans and Russians (Goethe, Tolstoy et al.), and only occasional glances at any other Western literature (Dante, Boccaccio, Cervantes, Ibsen and Strindberg being all we get from Southern Europe and Scandinavia), all as even in surveying the literatures on which he concentrates there are some curious omissions. (H.G. Wells appears in this book--but not as one of the writers to be discussed, just as Sinclair's host at the New Reform Club, where he pointed to one of the writers Sinclair does discuss, Henry James, as "the Great Cham.") One may add that besides the omissions Sinclair can seem very dismissive or impatient of a good deal of what he examines (in an extreme example, owning up to having given up reading Dostoyevsky's The Karamazov Brothers at the time of Father Zosima's funeral, some two-fifths of the way into the book).
Still, imperfect as the survey is of art, or even literature, Sinclair covers an impressive, and useful, amount of essential territory with knowledge, frankness and insight--enough to constitute a very respectable "intro to Western literature" course in itself. Indeed, I doubt that many of our tenured professors of literature these days, or even in his day, could do the job nearly as well, whether where sheer range is concerned or in the significant task of making this long treatment of a very large and complicated subject as readable as Mammonart manages to be from beginning to end, in which his framing the text as a dialogue between a caveman and his wife ("Mr. and Mrs. Ogi") is the least of the matter, with Sinclair's not being an academic perhaps an advantage. Far from having been trained in "raking the dust-heaps of history" in the manner he so derided in his book on higher education in America, Sinclair here stands in relation to literature as a writer, reader and lover of literature for whom these works are first and foremost art to be experienced, for what they make the reader feel and think, rather than a priest calling the flock to worship or an archaeologist poring over relics in the hopes of uncovering clues to the past, bringing to bear, in addition to the thoroughly worked-out intellectual position he spelled out in that early chapter, genuine personal engagement.
One result of that combination of viewpoint, engagement and scope was a good deal of iconoclasm--for the most part, well-warranted iconoclasm. Admittedly I disagreed with particular appraisals of Sinclair's again and again, especially as he got to the nineteenth and twentieth centuries, where my reading has been more extensive and frankly my opinions about the authors he discussed stronger. (I thought him overly dismissive of Coleridge, and Scott, and Dostoyevsky; thought it wrong that he saw in Balzac wallowing in the money-mad world of which he wrote rather than criticism of that world, and indeed, "relentless, ferocious assault" on it; etc.) Still, he argued well for his positions, leaving those who disagree with him few nits to pick and often something to think about. (Sinclair dislikes Coleridge for his obscurity and irrationality, Scott for celebrating the Middle Ages, Dostoyevsky for going over from rebellion to reaction, and one cannot deny that all those charges are true, or that the only difference between Sinclair in disliking them for that and the kinds of critics he assails here is his being more forthright about his politics, and taking a progressive rather than a conservative or reactionary stance.) And the truth was that I found myself agreeing with Sinclair more often than not--about the cult of the Classics, about the politics of Shakespeare, about Zola, about much, much else. Indeed, treating writers such as Henry James (a hankerer after feudalism and its relics and supposed graces, for whom no people exist but those furnished with "large sums of money . . . without effort on their part" permitting them "complicated and subtle aesthetic sensibilities"), or Oscar Wilde (whose "smart" dialogue is produced with the simple-minded formula of proclaiming the opposite of "any statement involving the simple common sense of mankind," yielding overrated epigrams), or Joseph Conrad (a "cruel-souled" "Zealot of Pessimism" in the "Agnostic Sunday School" of whose books "Agnosticism upon closer study turns out to be Capitalism," and for whom "the capitalist ownership and control of marine transportation" is "God," and accordingly ever given to directing "jeering scorn," "venom" and satire at the "altruistic impulse"), Sinclair was again and again a breath of fresh air, saying what seems to me all too obvious and obviously in need of saying, but all too rarely, actually, said. (Especially about Conrad, given how "liberal" English teachers so unthinkingly serve up to their students the insanity into which Kurtz descends in Heart of Darkness as the indisputable, sole, truth about human nature.)
At the same time he made good on his promise to try and rescue many a treasure from the scrap-heap--like the poetry of John Greenleaf Whittier, whom Sinclair hails as the "[Robert] Burns of New England"--while, in putting in a word for not-quite-ignored figures like a Shelley or a Keats, reminding us that in his day they were given less than their due (and perhaps still are). Meanwhile those writers we may be surprised to see included, their mention serving no such purposes--as with political dramas-without-politics author Mrs. Humphry Ward, or adventure story writer Richard Harding Davis--still help Sinclair explain what he is trying to say about the history of literature, down to his own time.
In the process Sinclair not only tells us something about these authors and works, and by way of them illustrates and supports his arguments from early in the book--and very powerfully--but usefully develops many of those ideas he expressed initially, as with his thoughts about the "ruling-class artists" and "hero artists" (persuasively exemplified by the contrast between French playwrights Racine and Molière), and the contrast between the art of ruling elites and the art of oppressed and rising classes, not least the way in which the art of the former stresses form, the latter content. Particularly rare, and affecting, is the way that, in the case of a Whittier or a Burns or a Keats, he reminds us of the struggles endured and the scorn faced by those poets who emerged from the people rather than the salons, and wrote of and for those from whence they came (struggles and scorn that, he does not forget, mean that many a would-be poet, one who might well have been great, dies not only without recognition, but without ever having had a proper chance to compose a line, "some mute, inglorious Milton" resting in an unmarked grave).
Alas, few have since had the benefit of the insights Sinclair offers in this book about authors and literature and art more broadly--far fewer than Sinclair hoped at the time. Writing Mammonart he still thought that the bad old world he was struggling against was "an evil dream of but a few more years," and its standards with it. However, here we are a century later, with the predictable result that the six lies he called out remain very much with us. Indeed, they are the standards by which Sinclair's own works have been judged; been weighed, measured and found wanting; as one sees looking at Anthony Arthur's obituary for Sinclair in that newspaper which (as he relates in another work, The Brass Check) treated him so abominably in life. Deriding his works for being "admitted propaganda," for his "interest in persuasion and politics"--for his having "sold his birthright for a pot of message"--it makes clear just how little Sinclair's case that all art is by definition propaganda altered the prevailing standards. The result is that where Sinclair endeavored to rescue literary treasures from the scrap-heap, his own novels have been tossed into the same scrap-heap ("no longer read," Arthur flatly said), with the same going for everything else he had to offer, Mammonart most certainly included--as the shelves of the library of which Sinclair spoke have only become more crammed with false treasures.
Literature, culture and social thought today are all the poorer for it.
Do the Consensus Historians Still Matter?
When I first started reading, for example, Richard Hofstadter (if memory serves, the first of his books I picked up was Anti-Intellectualism in American Life) I had either never heard the term "consensus historians" or failed to recognize any significance in it. The awareness I was to develop of that bigger context came much later, consolidated as I found myself approaching the matter of "centrist" ideology in a serious way.
Naturally reading up on all that I became aware of how the consensus historians belong to another generation, since superseded by other views--with some of those historians, in fact, contributing to that process themselves with displays of renewed attention to what their visions of consensus missed. (Thus did consensus historian Hofstadter produce the volume American Violence: A Documentary History, a reminder of just how conflict-ridden the history of a country whose history was supposedly defined by "consensus" actually was.)
Still, there is no question of those historians of mid-century having left their mark on American intellectual life, and much more besides. This is, in part, because of the ways in which they helped shape that centrist outlook that I think few understand much in any conscious way, but which is so much a part of American political culture--determining the limits of the "legitimate" ideological spectrum and with it what people are allowed to talk about or even expect, the conduct of mainstream politicians, the operation of the media, the "political language" we use, etc.
However, it may also be because academic historiography since that time, shaped by the continued tendency to specialization, the influence of postmodernism, and the widening gap between the scholarly and the popular in a culture which is on the whole fragmenting while becoming less and less literate, has simply not had the same potential for broad vision, or broad influence. Indeed, considering the situation I find myself recalling their fellow centrist theoretician, the sociologist Daniel Bell, in The End of Ideology, in which, in a deeply lachrymose passage, he remarks upon the inferiority of the more prominent public intellectuals of his time in comparison with those of the prior generation (a Veblen, a Beard, a Dewey)--and think that today there are grounds for saying the same of the last generation or two as against those we had at mid-century.
Of Climate "Skepticism"
In the arguments over climate change it has been common for the press to refer to those rejecting the longstanding and overwhelming scientific consensus that anthropogenic climate change is a reality as "climate skeptics."
The term "climate skeptic" implies people who are honestly and seriously considering the evidence for climate change and rejecting it. Some of those rejecting the consensus may fit this description--but hardly all.
The person who rejects the science without really having done the homework--whose answer may simply be "What the hell do scientists know?"--would seem more remote from the characterization. Of course, one may generously allow that perhaps they are skeptical of science broadly. However, after dismissing climate science, we may see them point to a study that correlated low IQ scores with poverty as evidence that economic and social outcomes are a matter of personal failings and not societal failings, and confidently underline their position by adding "It's science!" with Ron Burgundy-like assurance.
Rather than any real skepticism there is just denial--and to call it "skepticism," with all the brain-work the term implies, dignifies it excessively. But that is what the mainstream media does, and can be expected to do, given that its business imperatives and professional culture, its ideological inclinations, and much, much else, bias it toward extreme deference toward those interests and groups which champion climate denial, and their representatives. The readiness to "both sides" the issue when on so many other points they acknowledge only one side is one way in which this has been the case. The use, down to the present moment (just do a keyword check of recent news stories and you will see this for yourself), of the term "climate skepticism" when they should be saying (as only occasionally they say) "climate denial" is another.
The term "climate skeptic" implies people who are honestly and seriously considering the evidence for climate change and rejecting it. Some of those rejecting the consensus may fit this description--but hardly all.
The person who rejects the science without really having done the homework--whose answer may simply be "What the hell do scientists know?"--would seem more remote from the characterization. Of course, one may generously allow that perhaps they are skeptical of science broadly. However, after dismissing climate science, we may see them point to a study that correlated low IQ scores with poverty as evidence that economic and social outcomes are a matter of personal failings and not societal failings and confidently underline their position by adding "It's science!" with Ron Burgundy-like assurance.
Rather than any real skepticism there was just denial--and to call it "skepticism" with all the brain-work the term implies dignifies it excessively. But that is what the mainstream media does, and can be expected to do, given that its business imperatives and professional culture, its ideological inclinations, and much, much else, bias it toward extreme deference toward those interests and groups which champion climate denial, and their representatives. The readiness to "both sides" the issue when on so many other points they acknowledge only one side is one way in which this has been the case. The use, down to the present moment (just do a keyword check of recent news stories and you will see this for yourself), of the term "climate skepticism" when they should be saying (as only occasionally they say) "climate denial" is another.
The Politics of "Epistemological Nihilism"
All across history those who sought a more just and flourishing world have, like the protagonist of a recent Oscar-winning film, said that "If I know the world, I can improve it."
Those threatened by the prospect of such change, not only its failing but even worse its succeeding, have always had as an important weapon in their intellectual arsenal the counter-assertion that "You can never know the world"--and therefore cannot improve it; what I call "epistemological nihilism."
The result is that, as Carroll Quigley makes clear in his flawed, at times tiresomely pedantic, but still worthwhile The Evolution of Civilizations, and as one sees when working one's way through the tradition of Western philosophy oneself, skepticism has overwhelmingly been utilized by those on the side of "things as they are" against those desirous of what we would call progress. And it says everything of just how muddled intellectual life has been that in recent decades postmodernists, even as they cite Nietzsche and Heidegger(!), think themselves espousing not the darkest of the Counter-Enlightenment, but some kind of progressive philosophy.
Of "Skepticism": A Few Words
In a consideration of David Hume I recently ran across one writer who, considering Hume's notorious extremism in this respect, asked "How much skepticism is enough?"
I am not sure that I would think of the issue in terms of "how much"--of an appropriate quantity or proportion of skepticism.
Rather I would say that there is a world of difference between the skepticism of those who think the world is knowable, and are trying to work their way toward such knowledge, and the skepticism of those who deny the possibility altogether--the more so as, as is so often the case with the latter, their purpose is sabotaging someone else's search for fear of where it will lead.
Indeed, the "skeptic" who denies the possibility of knowledge altogether strikes me not so much a skeptic as an epistemological nihilist--and like all nihilists in the sense I am talking about when it comes down to it, a bullshitter in the philosophical sense. After all, one cannot get out of bed in the morning--or decline to get out of bed in the morning--without acting on assumptions about what will come from that (overwhelmingly, correct ones), and anyone who is a "skeptic" for long always does so, while one might add the skeptic never hesitates to inflict their opinions about this, that and the other thing on the world with that very self-assurance their supposed skepticism should make impossible.
Hume, of course, was no exception to that, the philosopher famed for his attack on induction never worrying about his own conclusions about it when engaging in induction himself on matters such as the inferiority of one race to another. His outsized reputation today (and for that matter the extent to which criticism of Hume has been so limited to relatively narrow aspects of his legacy) is no testament to the vitality or rigor of thought in our time.
David Hume's Criticism of Induction
There is nothing so cheap as nihilism--and in philosophy, nothing so cheap as epistemological nihilism. The reality is that, as any even slightly serious student of the scientific method knows, even the most well-grounded knowledge claims come with qualifications, with fine print. What the epistemological nihilist does is to pretend that this fine print is a discovery new to the world as they scream about it so loudly and at such length that eventually those of weaker mind can think of nothing else, while a certain kind of pseudo-intellectual does not even have to have their minds broken down that way before falling into line with such nonsense. These persons, after all, are, to use Eugene Earnshaw's choice of phrasing, "sadistic contrarians" who delight in "making people feel stupid" by appearing to prove that false propositions are true.
Alas, the weak-minded and pseudo-intellectual are no rarer in this time than any other, and frankly, probably more common. Indeed, in certain corners of academia and culture being such a person is probably a career requirement.
David Hume's attack on induction has always struck me as exemplary of that sort of screaming that so impresses the weak-minded and delights the pseudo-intellectual. And it was thus refreshing to see Dr. Earnshaw point out the obvious--which is that, even if far too few card-carrying philosophers have paid attention to them, over the years many have debunked Hume again and again in many different ways (pragmatism, mathematical probability, and best of all, for he was thus "hoist by his own petard," a subtler grasp of induction than Hume ever displayed), to the extent that his "argument" ever needed to be debunked at all. It is all the more refreshing in that this particular bit of such screaming gets such a ridiculous amount of respect, and in turn makes Hume's one of those names with which those who prefer to drop names rather than actually read books like to beat their opponents over the head in argument.
Saturday, May 11, 2024
The Fall Guy in 2024 vs. Mission: Impossible in 1996
Remarking The Fall Guy's "playing like a deflated balloon" Anthony D'Alessandro raised, and dismissed, the faintness of pop cultural memory of the show the film adapts. Up to a point he is right--correctly observing that he does not "think Universal sold the movie on that"--but he also struck me as superficial, and wrong when pointing (with an obnoxious "Hello") to how Mission: Impossible became a franchise-launching hit in 1996. "What Gen Xer had actually watched Mission: Impossible back in the late '60s?" he asked, implying the brand name's irrelevance to the critical youth market.
It was indeed the case that the young had not seen the show during its '60s-era original run. But they had chances to see at least some of it in the reruns that were a rather bigger part of our TV consumption (I specifically remember the cable channel FX airing the show before Mission: Impossible hit theaters), while there had been a two-season revival of the show in the late '80s (1988-1990) which actually brought back Peter Graves as Jim Phelps. All of this helped sustain the brand name in pop cultural memory even among those who never saw an episode of the show. The memory was, as I remarked last year, hazy, but that was probably an advantage in its way--enabling the audience to associate "Mission: Impossible" with cool spy-fi adventure, heralded by celebrated composer Lalo Schifrin's famous theme, the burning fuse graphic, and the exploding tape-recorded messages, sufficiently for reports of a Mission: Impossible movie to catch their interest at least a little, while keeping them from being purists who would get mad about what the film did with the characters. ("Jim, how could you!")
Rather than being recalled in that hazy but plausibly intriguing way The Fall Guy was simply not recalled at all by most, denying it the advantages that I do think Mission: Impossible managed to derive then, even among the young.
One may add that the market at the time was less crowded with action blockbusters generally and contemporary American spy-fi blockbusters still a comparative novelty (that boom just getting underway), and the audience less tough to bring to theaters (on average making four to five trips per year to the theater then, as against two now.) And I dare say that circa 1996 Tom Cruise's star power was considerably greater than that of Ryan Gosling, and perhaps anyone else in this post-star era. (Indeed, it seems telling that writing the prior sentence just now I had to look up Ryan Gosling's name just to be sure I remembered it correctly--again, proof that, contrary to how that stupid "Everything" trailer had it, Ryan Gosling does not "warrant introduction as RYAN M@TH?RF#*&!NG GOSLING.")
The result is that what we are seeing with The Fall Guy is, in part (but only in part), a lesson in what happens to a franchise when its movie stays in development hell for far too long--so long that people forget the franchise ever existed, and the cinematic market at which it is aimed changes profoundly. Indeed, this seems to me so important as to moot the other ways (gambling on a May opening for a sub-May opening movie, the handling of the protagonist and broader tone, etc.) in which Universal mishandled the release of the film.
How Much Money Did The Flash Really Lose?
Back in June 2023 when the promised "greatest superhero movie ever!" instead crashed and burned at the box office, the reports were that the lousy gross for The Flash could mean the movie losing the studio $200 million.
However, that was back when people thought the production budget was in the vicinity of $200 million. A little while later we were told that it was $300 million--implying the possibility of much more than $200 million being lost.
Yet, even as the budget appeared larger than first reported, Deadline reported a mere $155 million studio loss on the film during its "Most Valuable Blockbusters" tournament.
Odd, isn't it?
Well, when you take a closer look at the numbers you see that Deadline went with the $200 million production budget figure, not what we heard later.
At the same time, where in June we heard the marketing campaign was running $150 million, Deadline claims just $120 million were spent on prints and ads.
All of this allows the number to make sense--all as one could find still more room for it to do so if these were gross rather than net figures (if subsidies and the like offered offsets to the numbers then being thrown about).
However, if we go with those higher numbers things move in the other direction. A $300 million production budget and a $150 million marketing campaign work out to $130 million+ in extra expenses. Tacked onto the $155 million loss Deadline reported, that nearly doubles the loss, bringing it close to $300 million.
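For anyone who wants to check the arithmetic, here is a minimal sketch in Python, simply restating the figures cited above (Deadline's reported numbers versus the higher ones heard earlier); nothing in it comes from any source beyond those figures.

# Figures in millions of dollars, as cited above
deadline_budget = 200          # production budget Deadline used
deadline_prints_and_ads = 120  # prints and ads per Deadline
deadline_loss = 155            # loss Deadline reported

higher_budget = 300            # the later-reported production budget
higher_prints_and_ads = 150    # the marketing spend reported in June 2023

# Extra expenses implied by going with the higher figures
extra = (higher_budget - deadline_budget) + (higher_prints_and_ads - deadline_prints_and_ads)
implied_loss = deadline_loss + extra

print(f"Extra expenses: ${extra} million")       # $130 million
print(f"Implied loss: ${implied_loss} million")  # $285 million, approaching $300 million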
It may well be that the lower numbers Deadline used are the correct ones. Still, given what we previously heard, doubt seems far from implausible here--and I would not be shocked by later revelations calling the Deadline numbers into question.
Is The Flash Really the Biggest Box Office Flop of 2023?
Previously considering the numbers Deadline presented in the course of its "Most Valuable Blockbusters" tournament I mentioned that I was surprised that the figures for the film that was ranked the biggest money-loser of the year, Captain Marvel 2 (aka The Marvels), did not include the reported subsidy that cut the net production cost from $270 million+ to $220 million.
I also noticed that the budget for The Flash last year was reported first as over $200 million, and then later, in a tone of scandal, as $300 million.
Had Deadline gone with the net production cost of $220 million for Captain Marvel 2 they would have shaved $50 million off the loss--and had they gone with the $300 million figure for The Flash's production budget they would have added an extra $100 million to its loss. The result of these two changes would have been to lower the loss on Captain Marvel 2 from $237 million to $187 million--and raise the loss on The Flash from $155 million to $255 million. The number would still be terrible for Captain Marvel 2, but even worse than it is now for The Flash, with the result the two movies still being at the top of the "biggest" flops list, but switching places to make The Flash #1 in this unenviable category (all as a fuller accounting could easily translate to an even worse picture for The Flash).
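A quick restatement of that adjustment in the same spirit (a sketch only, in Python, using the figures already cited above):

# Deadline's reported losses, in millions of dollars
captain_marvel_2_loss = 237
flash_loss = 155

# Adjustments discussed above
captain_marvel_2_loss -= 50   # subsidy cutting the net production cost to $220 million
flash_loss += 100             # $300 million production budget rather than $200 million

print(f"Adjusted loss, Captain Marvel 2: ${captain_marvel_2_loss} million")  # $187 million
print(f"Adjusted loss, The Flash: ${flash_loss} million")                    # $255 million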
Is Hollywood Becoming More Careful With its Movie Budgets?
Last year, considering the ways in which The Flash and Indiana Jones 5 in particular flopped I predicted massive losses for each--rather more massive than those films actually suffered according to Deadline, which reported losses of around $150 million for each of those films.
There seems ample reason to be surprised by this, starting with the budgets on which Deadline bases the calculation. In spite of a more than 20 percent jump in prices according to the Consumer Price Index since 2018-2019, and the spiking of interest rates raising the cost of borrowing, the reported production costs do not seem much higher than before, big superhero films, for example, still commonly reported as $200 million productions. The constancy of the budgets is the more surprising given that the figures we were given in 2023 at least appeared to be gross, not net, and that many major releases of 2023 suffered from cost-raising pandemic-related disruptions.
I am even more struck by the expenditures on prints and ads, which for the films identified as the biggest failures were low relative even to their reported budgets and to what was spent on other comparable productions--$110 million for Captain Marvel 2, $120 million each for The Flash and Indiana Jones 5, whereas the expenditure was $160 million for Disney's Guardians of the Galaxy 3, and $150 million and $175 million for the much cheaper Super Mario Bros. and Barbie. (The figure for The Flash is actually lower than the $150 million we were told was budgeted for the marketing campaign last June--while if the figure for Captain Marvel 2 is correct Disney-Marvel spent a third less, adjusted for inflation, to promote Captain Marvel 2 than it did to promote the original Captain Marvel back in 2019.*)
One might conclude from the lower-than-expected numbers that Hollywood has become more astute financially, with the low figures for prints and ads perhaps indicating that studio heads who knew they had losers on their hands cut their losses. (Indeed, there was speculation about this in the case of the surprisingly low profile Aquaman 2.) Yet, even if we take the numbers presented as absolutely reliable (and one cannot know that), a handful of films made in these chaotic times tell us only so much, and may be slight grounds for claiming any great shift in the way Hollywood funds filmmaking.
* The figures are $110 million for Captain Marvel 2, and $140 million for the first film (which is more like $170 million when early 2019 prices are adjusted for late 2023 prices).
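As a rough check on that adjustment, here is a sketch in Python; the 21 percent factor is my approximation of the "more than 20 percent" CPI increase mentioned above, not an official figure.

# Marketing spend, in millions of dollars
captain_marvel_2019_spend = 140
captain_marvel_2_2023_spend = 110

cpi_factor = 1.21  # assumed price increase, early 2019 to late 2023
spend_2019_in_2023_dollars = captain_marvel_2019_spend * cpi_factor  # roughly $170 million

real_cut = 1 - captain_marvel_2_2023_spend / spend_2019_in_2023_dollars
print(f"2019 spend in late-2023 dollars: ${spend_2019_in_2023_dollars:.0f} million")
print(f"Real reduction in marketing spend: {real_cut:.0%}")  # about a third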