From time to time the news media points out, in some fashion or another, that more people are living alone than before--or at least, not with a partner (for instance, that more adults of this or that age category are living with their parents instead). In line with its long-established ideological tendency it does not consider the matter very deeply or critically, but it does point it out, and even gestures at what it means for individual isolation and its consequences. After all, in a society where for a great many reasons extended family is a faint presence (Edward Luttwak once commented that "Americans are . . . as poor in their family connections as Afghans or Sudanese are in money"), work is unlikely to provide much in the way of relationships (because it is insecure and tenuous, because organized labor and its associations are so much weaker in the U.S. than elsewhere, because everyone commutes to the job from so far away, etc.), and the sense of community generally has always been fragile (decades ago a sociologist could observe that we are "bowling alone"), the nuclear family has been critical to any kind of social connectedness. Moving out, setting up a household alone or with another person, and ultimately marrying and starting a family, for all its imperfections (divorce rates are high, even when people don't divorce they become alienated from one another, children move away and don't call, etc.), is pretty much the only thing that provides people with significant social connections for any length of time. (It is even critical to the non-familial connections, because of the prospect of association with other married couples, and parents, and their connections, in a way less accessible to single persons.)
But that, of course, has its economic requirements, especially if one means to do so on the at least "quasi-middle class" terms people have come to expect--and it has become much harder for most to meet those requirements, with the young taking a particularly hard hit.
Most obviously there is the increasingly crushing cost of living relative to income for the majority, as college degrees cost more and deliver less, with the burden of student debt and the cost of housing particularly notorious on the list of problems making life hard for the young--and this goes even for those who do as they are told and pursue STEM instead of the humanities (as, contrary to the whining of some, more and more have been doing).
One reflection of the hard times would seem to be that, even with American cities as unwalkable as ever and as ill-provisioned as ever with public transport (even New York's subway system is crumbling), the young drive less, which in much of the country means an enormous restriction of their personal mobility, to the point of significantly diminishing any prospect of a social life (and certainly any dating opportunities).
And of course the pandemic has not helped. The media, again in line with its prejudices, lavishes its attention on the "anti-lockdown" and "anti-vaccine" groups and completely ignores the other end of the spectrum here, those who think government has not done enough, that individuals are not doing enough, to control the pandemic--and who as the pandemic keeps raging have changed their habits, perhaps especially to the extent that their particular living standard makes them more vulnerable. (Someone who lives in an apartment building with its narrow hallways and shared utilities, someone who cannot or does not drive and has to ride public transport instead, has had to work much harder to keep themselves safe, if so inclined.) Meanwhile, even those totally indifferent to the problem have seen the escalating price of essentials reduce "real," inflation-adjusted wages as, once more, the price of housing in particular exploded.
Yet alongside all these very material factors which it is scarcely possible to deny (though not for lack of trying on the part of an ever-downward-punching media ever-sneering about "millennials" and out-of-touch billionaires mouthing off about young people's supposed passion for "avocado toast") there are subtler issues. There is, for instance, what it means to not be doing well economically and the ways in which this gets in the way of any attempt to "have a life," not just materially but psychologically, in a very unequal, very hierarchical society where conformists endlessly insist that the world is one big "meritocracy," and on that basis accord themselves carte blanche to be brutal to those who are not doing well. People get what they "deserve," they insist--and those who do not have enough don't deserve more, with no one having to be nice about it, and indeed everyone free to be as nasty as possible to the "loser" and the "failure"--while accusing anyone who would object at all to such treatment of having an unwarranted sense of "entitlement," and telling them that any emotional distress they are suffering is nothing more than "self-pity." Enough in itself to make them want to keep to themselves, society's inequities hit people especially hard in the "mating market," the "marriage market," which in a very important sense live up to the implication of the word "market" with all its brutal connotations--enough so that those whose experiences have been unpleasant can get very impatient indeed with the hypocrisies one is obliged to respect when raising the subject, the supposedly reassuring banalities that can only seem insultingly patronizing to anyone past a certain not-very-great age.
The result is that while marrying and having a family may be the principal alternative to aloneness, society's pressure on people to do so is not what it once was in an atomized, anonymous urban world, and those who (often as a result of much painful experience) expect to not do well in the market, to not find someone they want to be with, or who really wants to be with them (let alone someone they want to be with who also wants to be with them, and both thinking they might want that for life), find it easier to resist the pressure; to, as Al Bundy would have us do, "Just Say No." And indeed, even were life to get somewhat better for many--the economic and other stresses to be alleviated--I suspect that we would still see more people than before "bowling alone," and in general living alone. Because even if being alone may be the first choice of only a very few, in anything like the situation we have now many more will find that it hurts less than being with others.
Wednesday, February 8, 2023
Is Blogging Dead? Not Exactly, But . . .
Typing the keywords "Is blogging dead?" into the principal search engine of the day invariably leads one to a long list of links to sites emphatically answering "No, of course not!" as they proceed to offer glib advice--as they try to sell you something (for instance, one of their blogs).
Thus does it go with today's Internet--ask a question, and get not an answer but a sales pitch--in a veritable monument to what Cory Doctorow so rightly calls the "enshittification" of the Internet.
Naturally, getting somewhere requires us to think more precisely. Clearly people are still writing blogs, and the evidence is that many blogs are still attracting readers, and so in that sense blogging is not dead.
But I do not think that is what most have in mind when they raise the question. People "blog" in many ways, for many reasons. There are blogs operated by major businesses and other institutions, for example, where prominent individuals put out statements of public consequence for a limited expert audience likely to retail what they have to say to a broader public.
However, for the vast majority of the world blogging is nothing like that. What they mean when they speak of blogging is a person who is not connected with a major platform, who is not in their own right famous, setting up a platform of their own with little or no money spent, no assistance from anyone else, and no prospect of making their blogging a full-time activity, in the reasonable expectation of presenting their words to the world and somehow finding a readership. If they have something that resonates, maybe a big readership. Maybe even going viral!
And what they mean when they ask "Is blogging dead?" is whether, and to what extent, such a hope remains realistic.
The boosters on the aforementioned web sites to which the search engines so relentlessly lead us have no interest whatsoever in the question "Is blogging dead?" when asked in that sense.
My answer is that, contrary to the impression given by an idiot media ever trafficking in aspirational garbage that if actually followed is apt to be self-destructive, the hope was always extremely slim. The most obvious reason is that the ratio of "producers" of online content to "consumers" has been extremely high from the start, making the scene something like The Hunger Games, times a million--with some much more disadvantaged than others, like those trying to offer something of substance via a medium not particularly well-equipped for its provision. (After all, as one writer who so far as I can tell never even tried to offer anything of substance was well-equipped to explain, "The Internet is a communication tool used the world over where people can come together to bitch about movies and share pornography with one another"--not have intelligent discussions.)
It has only got worse since. Much, much worse. I can think of at least four very well-attested reasons why.
1. The decline of free-range Internet searching. Instead people access the Internet through social media and its associated filters, making "discovery" less likely (even if you get your blog onto social media).
2. The increasingly pay-to-play character of the Internet entailed by the aforementioned "enshittification," where others' search engine optimization, ad-buying and the rest get them ahead in the search rankings at your expense--while any effort you make to get ahead without laying out the cash gets you penalized. (Kind of like how the people who hand out Oscars wanted to punish Andrea Riseborough for having got an Oscar nomination by way of low-budget promotion by people who were actually enthusiastic about her movie and her performance instead of a $30 million studio campaign.)
3. The favoritism search engines show to "authoritative" content in the name of keeping the minds of the masses unsullied by "fake news" and the like. This means that the non-"authoritative"--anyone not connected with a big, big-budget platform--gets pushed down in the search rankings. And "This means you!" (Probably.) Finally, there is
4. The feedback loop that amplifies all of these tendencies. Get demoted in the search rankings for any reason--or more likely still, for many of them--and fewer people will happen upon your blog. This will get you downgraded yet again, making that many fewer people happen on you again. And so on and so forth in a vicious circle toward a readership of exactly . . . one person, yourself.
As if this were not enough there are the other factors that make blogs generally less likely to land an audience--like a discourse adapted to 140 (or is it 280?) characters at a time, max, and people flocking to "vlogs" over blogs because, reinforced by declining literacy levels, vlogs don't require you to "Read stuff." And so on and so forth.
All of this has turned the Hunger Games-times-a-million into, I suspect, something more like the Super Hunger Games-times-a-billion, the "attention economy" becoming that much more brutal (and punitive), which has made things worse yet again by encouraging lowest common denominator content. Thus does it seem that those who do write independently incline ever more toward the clickbaity--to the gossipy and the confessional, to hot button-pushing, to meaningless lists and worse than meaningless self-help, to forgotten-in-five-minutes ultra-topicality. Meanwhile, if things do indeed still go viral (I am doubtful that they actually do so, but people say it happens nonetheless), it is almost always the case that they are exceedingly stupid things, about which only stupid people would care.
Like stupid Ben Affleck looking constipated at a stupid awards show, and people turning it into a "meme," and commentators remarking on the face in the meme as somehow the very picture of the world's misery in 2023. (A world sinking ever deeper into a polycrisis of economic and ecological collapse, where hunger spreads and pandemic after pandemic rages, and war spreads and rages, with the fighting in Ukraine possibly already escalated into World War III--and the visage of this narcissistic mediocrity whom life has treated obscenely well next to 99.999 percent of the other people on the planet, looking slightly less than pleased with life as he sits next to Jennifer Lopez at the damned Grammys, is the "mask of unadorned misery" to these people? Even as irony, which is profoundly overrated, this is too much.)
Looking at such drivel it is very, very, very hard indeed for even someone as critical of the "Idiocracy" view of the world as I am to believe that the world has not already degenerated into Mike Judge's dystopia--the more so as any blogger who is not catering to complete nitwits has to compete with all that. The result is that where their chances of commanding an appreciable audience today are concerned, if it is not quite absolute zero it is awfully close to it--justifying the view that, for them, blogging, if not dead, may as well be.
Balzac's Theatre of Literature, the Theater of Cinema, and Oscar-Time (A Note on the Andrea Riseborough Affair)
In an earlier comment on Balzac's classic Lost Illusions I mentioned the hard lessons that his protagonist, Lucien de Rubempré, learns in the world of publishing, not least that "[t]here is a world behind the scenes in the theatre of literature. The public in front sees . . . success, and applauds." What that public is generally unaware of are "the preparations, ugly as they always are," like "the claqueurs hired to applaud," which are as necessary as they are ugly. Such preparations as the hiring of claqueurs enlist individuals to "put forth all [their] strength to win laurels for a man" they may well "despise, and maintain, in spite of [themselves] that some second-rate . . . is a genius," often successfully persuading the public that they are such.
By contrast the true genius who ventures into the world without prospect of such aids will never even get that far, never get published at all, never mind be acclaimed as a genius. ("You know nobody; you have access to no newspaper," he is told, so the book of poems he wrote "will remain demurely folded as you hold them now. They will never open out to the sun of publicity in fair fields with broad margins enameled with the florets" that a major publisher readily "scatters with a lavish hand for poets known to fame.")
Remarking these lessons my thought was for how far, far more truthful Balzac's words about publishing are, in his time and in our own, than the drivel we see on TV and in film about the business (where writers are always seen signing copies of their book in some bookshop for fawning idiots). Still, I never forgot that Balzac's images of the racket that is the creation of "success" in publishing are drawn from the theater, with what he said of the stage translating easily to the theater of cinema, and I have found myself thinking on all this amid the recent stupid controversy over Andrea Riseborough's nomination for the Academy Award for Best Actress.
Ms. Riseborough's nomination has drawn more than the usual attention to the games that film backers play in getting those little statues for their movies and the people who made them--games extending far beyond even those of Balzac's vile Parisian publishing king Dauriat. Thus does David Mouriquand of Euronews write of "'For Your Consideration' bombardments" which "include sent screeners, interviews, lavish luncheons, place advertisement and direct marketing campaigns, in order to get as many votes as possible" and generally confirm in the process that "[t]he more expensive the campaign, the more chances a film can win a nomination." Indeed, as Variety reported in an item way back in 2019 (how long ago that seems now!) even many years before that the campaigns could run as much as $30 million--"success" here, as in Balzac's publishing world, up for sale to the highest bidder, with a winning bid exceeding by orders of magnitude the production budget of the film that aroused so much controversy, To Leslie (reportedly made for under $1 million).
And indeed it would seem that at least some of the anger is over a low-budget film and a low-budget campaign getting a nomination by going around this vast, crass game. (You would be justified in thinking I'm exaggerating, but one critic actually declares it an injustice that "someone who did everything outside of the system" gets the nomination over those who did--who had elaborately and expensively arranged for them--the "luxury dinners, private academy screenings, meet-and-greets, splashy television spots."*)
The result is that, while I had not even heard of Ms. Riseborough or the film until the nonsensical furor over whether the rules of the cowardly, corrupt and ever less relevant Academy were violated (and the still more nonsensical furor over whether there was an "ethical" failure), it seems to me a good thing that the actress in question is not being deprived of her nomination on these mind-bogglingly hypocritical grounds.
* Yes, I know the critic in question had other things on his mind (allegations of prejudice against the Academy, etc.), but all the same, in making this argument he unavoidably defended the conventional way of pursuing nominations, and that is what matters for the purposes of this discussion.
Sunday, February 5, 2023
On the Crime Beat
Matt Taibbi, discussing crime reporting in his book Hate, Inc., remarked that part of the media's strategy for keeping people coming back is letting them feel "superior" to the "vile" people they present. Hence, for example, the sorts of persons one sees in the crime news, especially of the lighter type where the crimes come across as idiotic rather than frightening or disturbing (as with their tales of assorted "Florida men").
The implication of Taibbi's remark is that the media avoids making the reader or viewer--especially the reader or viewer in its target, upper middle class audience (for that is who it pays to pander to, today as in Veblen's time)--feel "inferior" to others.
However, it seems to me that the media is actually rather relentless about making its audience--not just the working class persons who also consume this garbage, but that more privileged target audience--feel inferior.
It just does it in a different way, for different reasons, than he is concerned with.
Specifically that media is relentless about making its audience feel inferior to the rich and powerful and famous, and to those closest to them; to people who have had the advantages of a privileged upbringing, and the high office to which it easily leads, and who have been invested with "authority" in some area; to, above all, the Chief Executive Officers and the billionaires they hasten to call "geniuses" based on no evidence whatsoever besides their having an impressive-sounding title or (allegedly) lots of money, the way they did with Jeffrey Epstein in a truly characteristic display not only of their propensity to grovel before the elite, but their extreme stupidity.
In short, they make their audience look up to the super-rich power elite, and their courtiers, but look down on working class people--the sense of superiority to the latter not unimportant, but the thing not to be forgotten being the affirmation of social hierarchy and its associated value system, an affirmation they perform ceaselessly and unquestioningly.
Yet faced with this people still speak of the "liberal" and even "left" media--proving that they do not even know the meaning of those terms.
Friday, February 3, 2023
What Was '90s Irony Really About?
Not too long ago I remarked that in the '90s there still seemed a certain amount of alertness on the part of society's thinkers about just what condition society was in, and where it was going--which is to say, in the middle of a nervous breakdown, amid breakdowns of other kinds.
Consider all that was going on. Deindustrialization, the decay of the country's infrastructure and services, the hollowing out of the "middle class." Surging socioeconomic inequality, the polarization and moral panic of "culture war." The growing derangement of a "Greed is good"-"Force works" mentality. The hyper-sensationalization of one crime as idiotic as it was gruesome after another (Amy Fisher! Tonya Harding!), which process included endless made-for-TV movies (Amy Fisher had three all to herself, not counting the three more Saturday Night Live presented us with in a more than usually memorable episode!), and the grist it all offered the mill of an age of "shock TV" and "extreme" everything, and enormous smugness on the part of the edgelords dominating the landscape. (This was the era of Jerry Springer, and Howard Stern, and Quentin Tarantino, and South Park.)
All of this is, of course, far more advanced now, far worse now--and taken quite in stride. But there was less acceptance then. And that seems relevant to the famous '90s irony. Annoying as it often was I wonder now if it was not a sign of society (at least, as represented by some of those who do its thinking for it) being in a healthier condition than it is now, aware of what was going on, and using irony as a coping mechanism--distancing itself from its unfortunate situation, laughing at what was happening to it because it still understood that this was what breakdown looks like, and if it was no great era for valiant efforts to change the world, on some level was still fighting against it.
I have said it before but I will say it again: back then writers could imagine society becoming so sick it would be addicted to reality TV (The Truman Show, EdTV), but reality has gone far, far beyond their imaginings (as all that is evoked by utterance of the name "Kardashian" makes clear, and so too the way people shrugged at the amplification of the concept in The Circle). I think of the underrated sequel to Robocop, which had a bankrupt Detroit being privatized to a corporate goliath whose activities include selling a new model of high-tech policing based on killer robots--a joke then, but in hindsight it stands as prophecy infinitely more accurate than the drivel of the Establishment hacks paraded before us on the nightly news as "experts" (whose misunderstanding of the events of the period and their aftermath was unbelievably risible, and has cost the world dearly).
As all this goes to show, we are past the point at which satire, or even parody, is possible. And I don't think it's a simple caving in to nostalgia to point that out.
Wednesday, February 1, 2023
The Florida Man Phenomenon: A Note
In the past decade "Florida Man" became a popular Internet meme, derived from just how often news stories detailing comically bizarre incidents perpetrated by individuals who were likely intoxicated or mentally ill begin with the words "A Florida man."
All of this, of course, has had some asking why so many incidents of these types seem to occur in the state. Some have suggested the state's demographic "diversity," others its extreme weather (its seemingly having the "dog days of summer" nearly all year round).
However, the mundane reality is that it is easy to come up with numerous explanations which have nothing to do with any special eccentricity on the part of the state and its inhabitants.
The sheer populousness of the state would quite naturally mean a disproportionate share of eccentricity. With about 22 million inhabitants circa 2023 one would, all other things being equal, expect forty times as many such incidents as would come from, for example, Wyoming--with the fact all the more worth remarking given that the only two states in the Union larger than Florida in population have a pretty strong reputation for eccentricity themselves, namely California and Texas.
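To make the point concrete, here is a minimal back-of-the-envelope sketch, using the roughly 22 million cited above for Florida and assuming (my rough figure, for illustration only) about 580,000 people for Wyoming:

```python
# Back-of-the-envelope check: if bizarre incidents simply scale with population,
# how many "Florida man" stories would we expect for every "Wyoming man" story?
florida_pop = 22_000_000   # ~2023 figure cited above
wyoming_pop = 580_000      # rough recent estimate -- an assumption for illustration

print(f"Expected ratio: roughly {florida_pop / wyoming_pop:.0f} to 1")  # ~38, i.e. about forty
```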
Urbanization may matter too--Florida being not just heavily peopled, but highly urbanized. (With over 91 percent of its people living in cities as of 2010, it is perhaps the seventh most urbanized state in the country.) One might add that it is particularly rich in large, high-profile urban areas (the Miami-Fort Lauderdale area, Jacksonville, Tampa-Saint Petersburg, Orlando) where not only are local events likely to get national notice, but which sprawl to such a degree as to make it essentially one giant "megalopolis"--containing 17 million+ of its 22 million people, giving the megalopolis a population larger than is to be seen in any other whole state but California, Texas and New York.
That makes it that much easier for any odd incidents occurring within it (of which, again, there would likely be many given the sheer number of people present) to come to public notice (again, as compared with what may happen in a thinly peopled rural area).
One might add that where comparisons with other states are concerned it may matter that California and Texas are associated with older stereotypes that let news audiences (quite wrongly, though they did it all the same) fit a good many of their incidents into a familiar framework (California's counterculturalism, for example, or Texas' nationalism and ultra-conservatism).
By contrast Florida, long associated simply with retirees and "fun in the sun," has had nothing of the kind, leaving it susceptible to the identification of all this with a vaguer "Floridaness"--which has in itself been powerfully self-reinforcing, with new incidents fit into the increasingly familiar pattern, the more so as the words "a Florida man" became a reliable way of grabbing the attention of the sorts of readers and viewers who eat this kind of thing up.
Sunday, January 29, 2023
The Myth of a Country Ever More Awash in Holders of "Useless" Degrees
The accusation is by now a familiar one: the fault for many a national problem lies in the country's young people eschewing STEM in favor of "soft" and "useless" degrees in the arts, humanities and social sciences, degrees whose award is something akin to a scam given that upon graduation the fate of those debt-loaded students will be working behind the counter at Starbucks.
As is generally the case with such situations the advancement of the claim, and the debate it provokes (if one dignifies it with that term), is mostly evidence-free--a matter of spewing one's culture war-soiled prejudices.
That fact by itself ought to make the claim suspect. Admittedly education is an area where the collection of statistical data leaves much to be desired. However, the data that is available suffices to make it quite clear that the image of young people all getting "useless" degrees is profoundly false.
Consider, for instance, the much-maligned degree in English. In 2018-2020 the proportion of bachelor's degrees awarded in that field ran below 2 percent of the total (far, far less than you would guess from the noise)--with this, of course, including all those taking English as part of pre-law study, aspirants to work in advertising, persons planning to teach the English language courses that everyone following any course of study must take in school, etc. The result is that for every B.A. awarded in English American colleges awarded three degrees in engineering, and another three in computer and information science, and at least that many in the biological and biomedical sciences.
Moreover, the trend has been toward less humanities education, and more STEM education, with English and engineering useful reference points. In the 1970s, the 1980s, the 1990s, it may have been common for bachelor's degrees in English to comprise 4-5 percent of the total. By 2010-2011 they were down to 3 percent, on the way to the current level (a drop of a third in the space of less than a decade). By contrast, if the output of engineering graduates has been more volatile it seems worth remembering that where in the early twenty-first century they seem to have run 4-5 percent of the total, in 2017-2020 they consistently accounted for over 6 percent of B.A.s (a jump of a third in a similarly short span of time). More broadly, over that decade the Arts/Humanities/Social Sciences category as I have been able to determine it went from accounting for 39 percent of bachelor's degrees at the start of the 2010-2020 period to 33 percent at its end, while STEM categories, broadly defined, surged from 29 to over 39 percent--the ratio gone from 1.3 Arts/Humanities/Social Sciences bachelor's degrees for every one STEM baccalaureate to nearly the reverse (in, again, the space of a decade).* Defining STEM more narrowly so as to focus on the physical science, computer science, engineering and engineering technology graduates, mathematicians and the like (leaving out such undeniable science workers as those in agriculture and natural resources, the biological and biomedical sciences, the health professions, etc., in favor of the physical science/production orientation that those who use the term usually have in mind) I still get a jump from 12 to 16 percent--and so a robust (if somewhat more modest) measure of absolute and relative growth that has these categories going from producing three graduates for every ten in the Arts/Humanities/Social Sciences category to five over the relevant span of time.**
Indeed, to the extent that the image of the overproduction of "useless" arts, humanities and social science majors has any basis it seems to lie in outdated images of what is actually happening at the schools--for the shift has been extraordinary. This is all the more the case as this proportional shift has occurred not in a context where the number of B.A.s awarded fell, but in one where it actually rose by nearly a fifth (from 1.72 to 2.04 million), so that one cannot rationalize it as a matter of people simply dropping out of the "useless" humanities, etc. programs. (The number of engineering majors surged from 76,000 in 2010-2011 to 128,000 in 2019-2020, a 68 percent jump bespeaking a sustained 6 percent a year growth rate; the number of graduates of STEM programs as counted here more broadly went from 496,000 to 801,000, an only slightly less impressive rate of growth.) Nor, for all the complaints about the number of foreign students in such programs, can the increase be written off as a matter of foreign enrollment. (Even discounting the "temporary visa" category one sees a fifty percent jump, supporting the position that U.S. citizens really are getting that many more STEM degrees.)
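For anyone who cares to check the arithmetic behind those ratios and growth rates, the following is a minimal sketch (in Python) that simply recomputes them from the rounded figures quoted above--the post's own numbers, not a fresh pull of the underlying degree data:

```python
# Recompute the shares, ratios and growth rates cited in this post from its own
# rounded figures. Illustrative arithmetic only, not an independent data pull.

start, end = "2010-11", "2019-20"
years = 9  # academic years between the two endpoints

total_bas   = {start: 1_720_000, end: 2_040_000}   # all bachelor's degrees
engineering = {start: 76_000,    end: 128_000}     # engineering degrees
stem_broad  = {start: 496_000,   end: 801_000}     # STEM, broadly defined
share = {                                          # shares of all bachelor's degrees
    "arts_hum_socsci": {start: 0.39, end: 0.33},
    "stem_broad":      {start: 0.29, end: 0.39},
    "stem_narrow":     {start: 0.12, end: 0.16},
}

def pct_change(a, b):
    return (b / a - 1) * 100

def annual_rate(a, b, n):
    return ((b / a) ** (1 / n) - 1) * 100

print(f"Total B.A.s: +{pct_change(total_bas[start], total_bas[end]):.0f}% (a 'near fifth')")
print(f"Engineering: +{pct_change(engineering[start], engineering[end]):.0f}% "
      f"(~{annual_rate(engineering[start], engineering[end], years):.0f}%/yr)")
print(f"Broad STEM:  +{pct_change(stem_broad[start], stem_broad[end]):.0f}% "
      f"(~{annual_rate(stem_broad[start], stem_broad[end], years):.1f}%/yr)")

for yr in (start, end):
    ratio = share["arts_hum_socsci"][yr] / share["stem_broad"][yr]
    per_ten = 10 * share["stem_narrow"][yr] / share["arts_hum_socsci"][yr]
    print(f"{yr}: {ratio:.2f} arts/hum/soc-sci B.A.s per broad-STEM B.A.; "
          f"~{per_ten:.0f} narrow-STEM grads per 10 arts/hum/soc-sci grads")
```

Run as-is it reproduces the "near fifth" rise in total degrees, the 68 percent engineering jump at roughly 6 percent a year, and the shift from about three to about five narrow-STEM graduates per ten arts, humanities and social science graduates.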
It has also happened in a context where there has been little serious effort to make STEM majors more attractive (e.g. better pay and conditions for workers, or an easing of the cost of education for those who take this track); and where there does not seem to have been much done to better equip students to follow those majors at the K-12 level.
One would think that this big jump in the number of STEM majors would be a national story which would give some solace to the "WE NEED MORE STEM!" crowd. However, because it would offer some solace to them (and for many, many other reasons) it conflicts with the narratives so many want to push--of the humanities as a threat to the country because of intellectuals' perversity; of young people having only themselves to blame for their own problems when they leave college and can't find work commensurate with their level of schooling, and the country's industrial decline being attributable to their fecklessness rather than anyone else's--and so is totally ignored by those whose job it is to "inform" the public.
* In calculating the figure for Arts/Humanities/Social Sciences I counted in, besides the "Visual and Performing Arts," English and foreign languages, literature and linguistics, "Social Science and History," "Liberal Arts and Sciences, General studies, and Humanities," "Philosophy and Religious Studies," "Area, Ethnic, Cultural, Gender, and Group Studies," also "Psychology," "Family and Consumer Sciences/Human Sciences," "Theology and Religious Vocation Programs," "Communication, Journalism, and Related Programs," and "Multi/Interdisciplinary Studies." In calculating the figure for STEM I counted in, besides "Engineering," "Engineering Technologies," "Computer and Information Sciences and Support Services," "Communications Technology," "Architecture and Related Services," "Biological and Biomedical Sciences," "Physical Sciences and Science Technologies" and "Mathematics and Statistics," also "Precision Production," "Agriculture and Natural Resources," "Transportation and Materials Moving," "Health Professions and Related Programs," and "Parks, Recreation, Fitness, Leisure and Kinesiology."
** For the calculation I used the "Engineering," "Engineering Technologies," "Computer and Information Sciences and Support Services," "Communications Technology," "Architecture and Related Services," "Physical Sciences and Science Technologies," "Mathematics and Statistics," "Precision Production," and "Transportation and Materials Moving" categories.
Higher Education in the America of Berzelius Windrip
Back before literature went over to the Counter-Enlightenment mind, body and soul, devoting itself to epistemological nihilism, Medieval misanthropy, sneering irony toward human suffering and social concern, and the sniveling subjectivism of the "writers" who have made of the world of letters a Modernist-postmodernist Wasteland this past century, it was the case that a great writer could be expected to show us something of the world through their art. In early twentieth century America this was still very much the case with greats such as Jack London, Theodore Dreiser, John Steinbeck and, of course, Sinclair Lewis.
In line with that cultural shift what made these writers great has since been held against them, with Lewis no exception. Indeed, that champion of the New Criticism Mark Schorer personally played a significant part in destroying Lewis' reputation (as he did that of H.G. Wells), and I am not sure it ever recovered. Still, Lewis' works did endure, such that the literate (such as exist) are still likely to not be completely confused when they hear of George F. Babbitt, or an Elmer Gantry--and indeed it seems that there has been some revival of interest in one of his later works, It Can't Happen Here.
Still, if the book was more talked about than it had been in a long time I'm not sure how many actually read it--and how many of those were attentive to the bit about what happened to higher education in America under the dictatorship of America's answer to Mussolini and Hitler, Berzelius Windrip. As Lewis wrote, the country got new model universities that, rather than "merely kick[ing] out all treacherous so-called 'intellectual' teachers who mulishly declined to teach physics, cookery, and geography according to the principles and facts laid down by the political bureaus," were "free from the very first of any taint of 'intellectualism.'" Thus they virtually annihilated the humanities and sciences (dispensing entirely with ancient languages, ancient history, foreign literature and even pre-1800 English literature excepting Shakespeare and Milton, while in science declaring that "too much and too confusing research had already been done" and need not be bothered with anymore), leaving little but political indoctrination (the one history course retained in the curriculum devoted to "show[ing] that, through the centuries, the key to civilization had been the defense of Anglo-Saxon purity against barbarians"), vocational training ("lakeshore-cottage architecture . . . schnauzer-breeding"), and athletics, with a growing tilt toward preparation for what no fascist state can live without, war (with "speed contests in infantry drill, aviation, bombing, and operation of tanks, armored cars, and machine guns" counting for academic credit).
Being a satire from the 1930s, Lewis' book had its over-the-top and in some cases dated touches--but I leave it to you to wonder just how big a stretch this vision is from a great deal of talk about higher education that is conventionally treated as eminently practical and respectable, and what that says about the sort of media we have.
Should Economics Be Classified as a STEM Subject?
In recent years there has been a big push to have economics classified as a STEM field--and while this issue is not exactly high on the public agenda it does seem worth saying that most of the discussion of the issue is in favor of such a change. And at a glance that may seem natural. After all, economics is supposed to be a science, and as currently taught it is certainly intensively mathematical.
However, the issue is more complex than it looks--because of the ambiguities of the STEM category, and the context in which proponents of the idea are trying to make economics a part of it.
Just What is STEM Anyway?
In the past I have admitted to a dislike of the term STEM. This was mainly a reaction to its glib slogan-ness, which it seemed to me made it easier for people to push a bunch of simple-minded and harmful attitudes all the more forcefully--in particular the idea of a hierarchy of intellectual endeavors (and of the minds which pursue them), and the diversion of attention from the real problems of the country's manufacturing base, its college graduates, and much, much else, with a dismissive "Those lazy, stupid young people are studying the wrong things."
Still, there is also the (totally unsurprising) intellectual haziness behind the glib formulation. STEM awkwardly lumps together Science, Technology, Engineering and Math--which is to say, forms of knowledge and activity to be found all across the economy, such that one can define the category so broadly as to include virtually everyone. The number of working people who have absolutely nothing to do with any of those fields in their work is not exactly high, and likely falling all the time, with anyone at a desk likely to have a computer in front of them--while going much less far one finds blurry boundaries. Consider, for example, the elementary school teacher who performs the important function of imparting the rudiments of math and science to very young people who will include the STEM workers of tomorrow. Consider the doctor who is lengthily trained in the sciences and their application. Certainly they merit acknowledgment as STEM workers--but they are not what people usually think of when (again, glibly) tossing about the term. Instead what they seem to have in mind is physical scientists (physicists, chemists, etc.), engineers, and people with advanced training in allied occupations (mathematics, computers) devoted to the work of high-technology physical production--with STEM made a cause célèbre in the name of expanding the work force for high-technology manufacturing, such that what was wanted was not so much more in the way of pure scientists or mathematicians (certainly there is no shortage of, for example, qualified Ph.D.-holding applicants for tenured university positions in those fields) as more of the Technology and Engineering folks for industry.
But of course the fuzziness, and the hype about STEM, STEM, STEM!, gave it the status of a "cool kids club" that everyone wanted into (all the more as educational institutions craved the government and private sector support--read: money--going or likely to go to STEM, and the more so amid the brutal austerity of the general scene), such that the head can spin just trying to keep track of all the initiatives to get this field recognized as STEM, that field recognized as STEM. And economics has been no exception. But if we are to stick with what STEM is really about then, no, economics does not have a claim here. And even if we take the broader view we still find ourselves facing other important questions.
Physics Envy?
Economics as we know it is a product of the Enlightenment and its respect for science--with the same going for its being a specifically mathematical science (where, at least from the standpoint of mainstream, neoclassical economics, Stanley Jevons' work was foundational). Indeed, the aspiration has been reflected in the etymology of the term "economics." Where people previously spoke of "political economy," from the dawn of its neoclassical era on they increasingly cut out the "political"--cut out the idea that economic life and its rules were rooted in specific historical, technological, social, cultural--and political--conditions, and instead presented economic life as "autonomous." Put another way, it bespoke the claim that what (mainstream) economists taught--above all, the optimality of profit-seeking, property-owning private individuals being left to their own devices as completely as physically and morally possible, with any attempt to alter the direction of economic life from the path such individuals put it on not only the worst of oppression, but an exercise in futility certain to produce catastrophe--was as indifferent to particular social conditions, as true in all times and places, as Newton's laws of motion.
Not everyone agreed with that teaching, of course, and that being the case they could still less agree with the grand claim for that teaching as eternal truth--the insistence that "This is science!" seeming an attempt to rationalize what the rich find convenient and hostility to anything they would find inconvenient, with the pretense of it all being "science" suspect down to the use of the math itself. (Indeed, as economist James K. Galbraith--who is far from being the harshest critic contemporary economics has, and certainly no anti-capitalist--remarked, "the clumsy algebra of a typical professional economics article" is there "not to clarify, or to charm, but to intimidate," for ideas "that would come across as simpleminded in English can be made 'impressive looking' with a sufficient string of Greek symbols," and any critic of the argument dismissed as simply too obtuse to follow all the math.)
Moreover, it is notable that those calling for economics to be recognized as a STEM field are not making the same demand on behalf of the other social sciences--indeed, they are often prone to distance economics from the very subjects out of which it grew, in line with common prejudices. Social science is squishy and useless, they say--in contrast with practical, useful, tough-minded economics. Again, there is a political prejudice here--namely, against what such persons tend to see as the leftishness of the social sciences (the biggest reason of all for their contempt toward them), as against an economics field whose mainstream has overwhelmingly and squarely stood with the right for the last century and a half (by cutting those social sciences out of the picture, by bastardizing Keynesianism beyond all recognition when it could not be ignored, by nurturing the intellectual seeds of and propagandizing for the neoliberal revolution)--a significant factor in its comparative respectability.
In considering the drive to get economics recognized as a STEM field one cannot overlook this dimension of the issue--which, to those who object to the claims made for economics, can only seem the more wrong-headed at a moment when the neoliberal project the economics field has so squarely promoted, enabled and helped operate stands in more discredit than at any point since it became a political force a half century ago.
Why Aren't More Young People Attracted to Study of and Careers in STEM Subjects? A Thought
Before proceeding any further I think it necessary to say that there is a considerable body of literature debunking the myth of a (to use the awkward and fuzzy but nonetheless pervasive term) "STEM" skills shortage, and that I generally agree with the case it makes. (Indeed, far from America's young people determinedly refusing to study STEM my own examination of the recent data shows that even when we cut foreign students in the country on a visa out of the picture American colleges' output of B.A.s in engineering surged by 50 percent in the 2010-2020 period.)
It also seems to me worth saying that the endurance of the myth of a STEM skills shortage is entirely political--a matter of scapegoating the humanities, intellectuals and young people for the country's industrial weakness, and the difficult prospects of college graduates, and that the reason this is so successful is because of the numerous interests that find this convenient, and the deference shown them by a news media that never fails to fulfill its designated role as their (to use the most polite word of which I can think) courtiers.
Still, even if there is no shortage of STEM personnel, it does not seem to me inaccurate to say that young people are less attracted to these subjects than to other subject matter which may be less promising career-wise.
It is virtually certain that many factors are operative here--like the extreme variation in American educational outcomes at the K-12 level, which means many never had a shot at the proper foundations (you won't do well in trigonometry if your algebra is shaky), and the snobbery that surrounds the stupid hierarchy some make of intellectual life (the stridency about math being superior to words, science to arts, etc., intimidating some and sending many persons with the required aptitudes in other directions).
But I think one particularly overlooked factor is the absence of something we see in other areas.
Consider, for instance, young people's study of English. Certainly few seem to be very satisfied with the quality of the education imparted in this area, with some justice. (In a country where, on average, people have two years of college--have gone up to "grade 14"--why do they generally read at an eighth grade level?)
But learning here is not limited to what goes on in the classroom. Consider how--not so very long ago--many people read--for fun.
Not everyone was a reader. (Even before the explosion of electronic home entertainment there were those who preferred sports.) And those who did read did not always read the sort of things their parents and teachers would have liked them to be reading. But they read all the same. In the process they practiced their skills, and enlarged their vocabularies, even when they were only entertaining themselves.
Some people even wrote--for fun. They produced diaries, journals, stories--sometimes even whole novels. (A fifteen year old's first novel is not likely to be a great work of literature--but all the same, novels.) And in the process they worked on the relevant skills yet again.
Many, in fact, enjoyed it so much that they aspired to do this kind of thing for a living--so many that, in spite of what was more often than not the extreme discouragement of the people around them, and the extremely long odds against their ever making a living this way, they put a lot of time and effort into trying to do just that. And while this is not a particularly happy part of the story it seems to me that what led up to that did make a difference in the overall level of ability people had, and their willingness to pursue degrees and take up jobs where their English skills are relevant.
There simply was not the same opportunity for people to amuse themselves with numbers that there was with words; to, in the course of pure recreation, improve their mathematical skills the same way; to exercise their imaginations, and express themselves, and play, with numbers the way they could with words; and to do what all this made possible, discover a passion for the activity. Of course, some fell in love with numbers anyway. But the odds, the chances, were far fewer. Save for those few who were exceptionally susceptible to its attraction, or had some special opportunity to get interested (perhaps because they had a knowledgeable parent who knew how to intrigue them, who was able to show them that here, too, there could be imagination and play and much else), math was plain and simple work--something they had to study in a highly structured, punitive, stress-and-fear-filled environment which made many want to have nothing to do with it when they could avoid it. Which diminished the odds of becoming attracted to math yet again.
I do not know that it could necessarily have been any other way. But the fact that it has not been that other way must be accounted a part of the story--while it may factor into how the situation is changing. Again, looking over the B.A.s being awarded I was struck by how the popularity of English as a college major has plummeted in recent years, such that where the award of some 50,000 such bachelor's degrees seemed to be the norm in the 1990s and the twenty-first century down to 2013-2014, the figure stood at under 40,000 in 2018-2020 (a fall from 5 percent of the degrees awarded in 1990-1991 to under 2 percent of them in 2019-2020, and perhaps still falling). One can argue that the endless drum-beating on behalf of STEM, STEM, STEM! (and the denigration of the humanities that has gone with it) has factored into students choosing other studies and other careers. However, it may also be no coincidence that the age cohort getting those later degrees grew up in a more fully digital age in which there was less and less scope for reading to compete with other entertainments--and the smartphone, in putting the whole package of electronic entertainment options in everyone's pocket wherever they go, delivered a Mortal Kombat-like finishing blow. Bluntly put, we have fewer English majors because we have fewer people who found that they liked to read--in what can seem another rebuke to that conventional wisdom which, again, is ever conventional, but rarely wise.
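A similarly quick check of those shares, again a rough Python sketch using only the figures quoted in this post:

english_2019, total_2019 = 40_000, 2_040_000    # "under 40,000" English B.A.s out of ~2.04 million total
print(f"2019-2020 English share: {english_2019 / total_2019:.1%}")   # ~2.0%, consistent with "under 2 percent"

# Conversely, some 50,000 English B.A.s at roughly 5 percent of all degrees
# implies a 1990-1991 total on the order of a million bachelor's degrees:
print(f"Implied 1990-1991 total: {50_000 / 0.05:,.0f}")              # 1,000,000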
Friday, December 16, 2022
Some Thoughts on Reading a List of the Top 500 Keyword Searches
The search engine optimization company PageTraffic has posted on its web site a list of the 500 most popular keywords on Google during 2022 (so far, anyway).
As might be expected many of the top searches were clearly a way of accessing utilities that could be used for a great many different purposes in daily life (like e-mail and social media accounts usable for general communication, or means of online payment). A great many of the others also appeared to be related to the performance of essential daily tasks, like shopping (there were plenty of names of retail outlets here, in the main of a general nature, along with interest in bargains on used cars, used textbooks, etc.), and getting information related to such tasks (as by checking price comparison sites, or finding directions to some place they needed to get to, or the weather they would face going to work or running their errands during the day).
When one got away from that to searches for information that was not necessarily being sought for some sort of immediate use one saw signs of at least a little interest in current events (manifest mainly in people going to favored news sites), and even a few particular news stories (generally associated with the cultural and political "fringe"--what most would call "conspiracy theory"-type stuff, or tabloid-type stuff). However, entertainment predominated, especially if one includes the numerous searches referencing celebrities and pornography under that heading. (Even where the keyword was not the name of some porn site or of some well-known fetish many of the searches were very, very specific, and the intent hard to mistake.)
Most of this seems predictable enough, but there were surprises too. I had a notion that, if most people are not intellectuals and mostly search for information they can make immediate practical use of in their own lives, in looking for the immediately useful they would show themselves to be not merely "buyers" but "doers." I thought there would be more evidence of people seeking out, for example, health information, or wanting to know how to fix something, clean something, cook something. (WebMD did make the list, but it was not very high up that list, and I failed to notice anything else like it. One may imagine that some of those going to YouTube had an interest in its ample supply of "how-to" videos, but again, the keywords failed to make that clear--and I personally suspect people searching for YouTube generally, rather than something specific on YouTube, were looking for amusement rather than to learn something.)
The other surprise was in the celebrities people were looking up. Those they searched for in 2022 were pretty much the same ones I would have expected them to be searching for in 2012 (Kim Kardashian, Taylor Swift, Scarlett Johansson, Anne Hathaway, Emma Watson, Miley Cyrus, Lindsay Lohan, Megan Fox)--and even 2002 (Jessica Alba, Jessica Biel, Jessica Simpson, Jennifer Aniston, Jennifer Love Hewitt, Britney Spears, Trish Stratus, Angelina Jolie, Natalie Portman, Alyssa Milano, Brooke Burke, Pamela Anderson, Carmen Electra, Jenny McCarthy)--as if none of those who have emerged since have captured the "public imagination" in the same way, even now, when the great majority of these personages would seem to be very, very far from the peak of their cachet.
Certainly plenty of people have remarked on the decline of the movie star, but a decline of celebrity more broadly in the way implied here is something else. Could it be a function of the ever-more extreme fragmentation of popular culture? Or is something else going on?
What do you think, readers? (Lest there be any doubt about the matter that's not a rhetorical question, I really am inviting your comment in the thread below. If anyone is out there. The results I'm talking about didn't exactly raise my confidence in anyone being a reader of the kind of thing this blog happens to offer . . .)
Does a Book Release Now Have to be an Event for People to Take an Interest?
I have previously written about how the tougher and tougher market for theatrically released films since the advent of TV has meant that films' backers need their movies to be perceived as events--something the audience will want to see in the theater now rather than wait three months and stream at home--if they are to get very many people to buy tickets.
These days I think the same thing is happening with novels.
Simply put, people used to consume novels casually and frequently in the course of their daily routine--opening one up during a commute, while looking to relax before bed, and so on. In doing so they often followed some genre, some author, some series fairly faithfully, so much so that the most popular authors commonly built careers out of some formula, or the continuing adventures of some character, in a manner reminiscent of catching up with the latest episode of some TV show. (Indeed, Mack Bolan had a dozen adventures a year in the '80s--not unlike what a season of a show might offer--while hardback heroes like Jack Ryan or Dirk Pitt, or later, Alex Cross or Kay Scarpetta, often had new adventures annually or biennially.)
However, just like the old pattern of movie theater-going, that whole model of producing and enjoying books is collapsing due to the convenience of other media (just as it became easier to watch TV than to see a movie, people can now watch a show or play a game just as easily as read a book during their commute or anywhere else, and are more disposed to do so too), with this likely having much to do with the decline of the paperback (the end of The Executioner was a non-story in the media), and the way the bestseller lists look these days. Those hardback authors who crank out, for example, procedural-type thrillers regularly seem to me in the main old stars doing so for the audience--or what remains of the audience--they won long ago, because younger people are simply not becoming casual readers, and heavy consumers of novels in the process, in that same way. Thus do we see James Patterson still making the list--but we don't see new writers of comparable thrillers getting up there in the ranks with him. Indeed, where the names on the covers are concerned the paperback rack at your local convenience store or supermarket (in general, dominated by the paperback editions of last year's bestsellers) is virtually indistinguishable from what it was in the '90s.
Those very few "newer" novelists we see make it really big these days generally seem to offer something more idiosyncratic in content, and are more often aided by he publicity potential of the author themselves--with Della Owens' Where the Crawdads Sing exemplary. The book is admittedly describable as a murder mystery--which makes selling it that much easier--but no basis for some series whose latest entries people will snap up when they hit the market each and every year, with other features of the narrative (its sociobiological perspective, its politics, etc.) critical to the draw, while it probably mattered a good deal that the author was already a famous and not uncontroversial figure. (Again I am reminded of the vile king of the Paris publishing scene Dauriat sneeringly telling young Lucien de Rubempre that what he does is take "distinguished names," "reputations ready-made," and via crass and corrupt means for manufacturing "success"--"the claqueurs hired to applaud"--coin money out of them, rather than giving aspiring authors a chance to become famous by actually writing, and that they are fools to expect anything else from him.)
In this particular case the publisher was especially successful in making the event happen--and in the process made a bestseller not simply by attracting people in the habit of reading, but by creating such an atmosphere as drew in a significant number of that ever-growing portion of the potential audience who do not read so regularly--because they were promised something special, because the author was themselves a source of interest, because they wanted to see what the fuss was about. Afterward, I suspect, many of those readers did not become more regular readers but went back to reading nothing--at least, until another publisher similarly presented its wares as something to which they simply must attend.
Black Panther 2, Avatar and the End of 2022
Black Panther 2 has taken in $11 million in its fifth weekend, lifting its total to $409 million (up from $394 million last weekend), a rate of decline suggesting the movie will finish out below the $450 million mark, and perhaps not much above $430 million, in North America; while globally (where the film now stands at $768 million) it seems certain to finish well below not just the billion-dollar mark, but the $900 million mark, and is, I think, likely to finish below even the $850 million mark (which would give it about half of the original's gross when this is adjusted for 2022 prices).
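For what it's worth, that kind of projection is just the sum of a shrinking geometric series of weekly takes. A toy sketch in Python follows; the figures are rounded from the numbers above, and the week-over-week decline factors are illustrative assumptions rather than anything from the box office data:

def project_final_gross(current_total, last_week_take, weekly_drop):
    # Sum the geometric series of remaining weekly takes, assuming each week
    # the film earns (1 - weekly_drop) times what it earned the week before.
    retained = 1 - weekly_drop
    remaining = last_week_take * retained / (1 - retained)
    return current_total + remaining

# Roughly $409 million banked so far, with about $15 million added in the latest week.
for drop in (0.35, 0.45, 0.55):   # hypothetical week-over-week declines
    print(f"{drop:.0%} weekly drop -> about ${project_final_gross(409, 15, drop):.0f} million")

Any plausible decline rate in that range lands the film in the $420-440 million neighborhood, which is why a finish below $450 million looks safe.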
Of course, this is pretty much in line with the low end of the conventional expectations from the start for the U.S. (with which I did not disagree, and which seemed to me increasingly persuasive from the first weekend on). At the same time I have little more to say about what all this means for a Black Panther 3 or for the continuation of the Marvel Cinematic Universe.
Instead what interests me is that Black Panther 2 has held to the #1 spot at the box office for five straight weeks--a rare feat these days, especially in the normally very competitive holiday season, though alas, this seems at least as much a function of the weakness of the competition as the strength of the film's draw. Had it come out in a "regular" year it might have done even less well--which is to say that, like Top Gun 2, if in lesser degree, it has been a beneficiary of the pandemic's leaving 2022 with a relatively thin slate of releases.
I expect that Avatar 2, cutting into what remains of Black Panther 2's ticket sales from this weekend forward, will similarly benefit from being up against a weaker-than-usual slate of December and January releases. But afterward the market will tighten fast, with 2023, after three years of slim pickings at the box office, looking as packed with potential hits of the big-budget action franchise variety as any pre-pandemic year.
Friday, December 9, 2022
The Need for Movies to be "Events" and the Fate of the Marvel Cinematic Universe
The decline of movie theater-going is a story that has been told many a time--not least by the writer of this blog post.
Short version: in a couple of decades a number of changes, the most important of which was the proliferation of TV, meant that, measured on a per capita basis, Americans' trips to the movie theater fell from about thirty a year to four, and the studios struggled then and continue to struggle now simply to maintain that level. Over that period they have developed and utilized many strategies to accomplish that object (bigger screens, color and 3-D to provide a superior visual experience; "edgier" fare than the more thoroughly censored small screen could offer, etc., etc.), but improvement in TV technology (color, bigger screens and high definition; access to more content via cable, home video, streaming; the relaxation of censorship over television until it became edgier than the movies) progressively narrowed their options. And these days filmmakers have only two options left, namely:
1. Deliver the kind of spectacle TV production companies still can't deliver on their budgets, while extracting every bit of impact from screens bigger than any TV-maker offers, to offer a sensory experience such as viewers still cannot have at home; and
2. Make your movie seem like an "event" that people want to participate in right now, rather than waiting for two or three months and catching the thing on streaming at a much lower price.
Big-budget action-adventure films with salable brand names fit the bill well, and in this century none have more consistently done so than Marvel's, which offered the requisite spectacle while making its movies seem an event even as it put out three of them in a year, in what still stands as the only really successful attempt at a "Shared Universe" (as the fate of such efforts with Star Wars, Warner Brothers' Justice League, Universal's "Dark Universe," and the rest makes all too clear).
Where the matter of making the films seem like an "event" is concerned, Marvel benefited greatly from the fact that where DC's characters had been previously familiar--and new versions of their stories unavoidably and often unfavorably compared to their predecessors--it was putting its not-much-less famous characters on the big screen for the first time, and in the process delivering something new in at least visual terms (like Spider-Man in a really big-budget, big-screen production for the first time ever, swinging his way through New York in Sam Raimi's original 2002 film). As more and more such characters got the treatment there was still the interest of cross-overs, culminating in the gathering together of Iron Man, and Captain America, and Thor--and the Hulk, and Black Widow, and Nick Fury, and Hawkeye--that helped make 2012's The Avengers an event. Subsequently Phase 3's continued milking of the interest of multi-hero cross-overs (from Captain America 3 forward), its mobilization of "identity politics" behind Black Panther and Captain Marvel (which gave those films "event" status claims of their own), and the films' collectively offering an arc which drew that whole "universe" of superheroes together into their biggest cross-over yet as they fought their biggest enemy ever in the climactic battle with Thanos, saw the franchise go from strength to strength.
By contrast nothing of the kind could be claimed for Phase 4, by which time a very great deal had already been done and the franchise could not go bigger, while the law of diminishing returns was kicking in very forcefully. The interest of certain critical figures was already exhausted (Thor), while by this point the figures being put on-screen for the first time tended to be less well-known to the broader audience (The Eternals, Shang-Chi--Spider-Man they are not). And the makers of the films did not exactly rise to the challenge the situation posed. An origin story for Black Widow could seem an anti-climax after her prior adventures, as prequels so often do; Dr. Strange, another of the less well-known figures, had a sequel movie tied in with the WandaVision TV series in a way problematic for the broader audience, which could not all be counted upon to have seen it; and Black Panther 2 was mainly an event to the extent that it was connected with a prior movie that had been an event (while dispensing with the lead character and the actor who played him!).
In the end, only Spider-Man: No Way Home, with its ever-popular hero and its retconning of the two prior Spider-Man franchises into a single multiverse-spanning narrative, could really seem an event in that way. And unsurprisingly it was the only really spectacular, Marvel Cinematic Universe-at-its-peak, billion dollar barrier-bursting success (with the point underlined by the fact that, if one can blame the pandemic for much of the underperformance of the first three Phase 4 movies, one cannot do the same for the three which followed Spider-Man). And right now I have my doubts that there is anywhere left to go in this respect with the Phase 5 slate of films already in the pipeline. (Indeed, Ant-Man 3 will be coming to a theater near you scarcely two months from now.) Some of them might make decent money (with Captain America 4 perhaps the best bet that way, to go by their prior track records). But I see no sign of anything that will bring back the sense of Marvel releases as events that did so much for the franchise's first three phases, with all that implies for the franchise's fortunes in the coming years.
Short version: over the span of a couple of decades a number of changes, the most important of which was the proliferation of TV, cut Americans' per capita trips to the movie theater from roughly thirty a year to four, and the studios struggled then, and continue to struggle now, simply to maintain that level. Over that period they have developed and utilized many strategies to that end (bigger screens, color and 3-D to provide a superior visual experience; "edgier" fare than the more thoroughly censored small screen could offer, etc., etc.), but improvements in TV technology (color, bigger screens and high-definition; access to more content via cable, home video, streaming; the relaxation of censorship over television until it became edgier than the movies) progressively narrowed their options. And these days filmmakers have only two options left, namely:
1. Deliver the kind of spectacle TV production companies still cannot match on their budgets, extracting every bit of impact from screens bigger than any TV-maker offers, so as to provide a sensory experience viewers still cannot have at home; and
2. Make your movie seem like an "event" that people want to participate in right now, rather than waiting for two or three months and catching the thing on streaming at a much lower price.
Big-budget action-adventure films with salable brand names fit the bill well, and in this century none have more consistently done so than Marvel, which offered the requisite spectacle while making its movies seem an event even as it put out three of them in a year, in what still stands as the only really successful attempt at a "Shared Universe" (as the fate of such efforts with Star Wars, Warner Brothers' Justice League, Universal's "Dark Universe" and the rest makes all too clear).
Where the matter of making the films seem like an "event" is concerned, Marvel benefited greatly from the fact that, where DC's characters had been previously familiar--new versions of their stories unavoidably and often unfavorably compared with their predecessors--it was putting its not-much-less famous characters on the big screen for the first time, and in the process delivering something new in at least visual terms (like Spider-Man in a really big-budget, big-screen production for the first time ever, swinging his way through New York in Sam Raimi's original 2002 film). As more and more such characters got the treatment there was still the interest of cross-overs, culminating in the gathering together of Iron Man, and Captain America, and Thor--and the Hulk, and Black Widow, and Nick Fury, and Hawkeye--that helped make 2012's The Avengers an event. Subsequently the franchise went from strength to strength in Phase 3, which continued milking the interest of multi-hero cross-over events (from Captain America 3 forward), mobilized "identity politics" behind Black Panther and Captain Marvel in ways that lent those films "event" status, and collectively offered an arc drawing that whole "universe" of superheroes together into their biggest cross-over yet as they fought their biggest enemy ever in the climactic battle with Thanos.
By contrast nothing of the kind could be claimed for Phase 4, by which time a very great deal had already been done and the franchise could not go bigger, while the law of diminishing returns was kicking in very forcefully. The interest of certain critical figures was already exhausted (Thor), while by this point the figures being put on-screen for the first time tended to be less well-known to the broader audience (The Eternals, Shang-Chi--Spider-Man they are not). And the makers of the films did not exactly rise to the challenge the situation posed. An origin story for Black Widow could seem an anti-climax coming after her prior adventures, as prequels so often do; Dr. Strange, another of the less well-known figures, got a sequel tied in with the WandaVision TV series in a way problematic for the broader audience, which could not all be counted upon to have seen it; and Black Panther 2 was mainly an event to the extent that it was connected with a prior movie that had been an event (while dispensing with the lead character and the actor who played him!).
In the end only Spider-Man: No Way Home, with its ever-popular hero and its retconning of the two prior Spider-Man franchises into a single multiverse-spanning narrative, could really seem an event that way. And unsurprisingly it was the only really spectacular, Marvel-Cinematic-Universe-at-its-peak, billion-dollar-barrier-bursting success--with the fact that, if one can blame the pandemic for much of the underperformance of the three movies that preceded it, one cannot do the same for the three which followed it, underlining the point. And right now I have my doubts that the Phase 5 slate of films already in the pipeline has anywhere left to go in this respect. (Indeed, Ant-Man 3 will be coming to a theater near you scarcely two months from now.) Some of them might make decent money (with Captain America 4 perhaps the best bet that way, to go by the prior track records). But I see no sign of anything that will bring back the sense of Marvel releases as events that did so much for the franchise's first three phases, with all that implies for the franchise's fortunes in the coming years.
Martin Eden, Radical Rightist
In Jack London's Martin Eden the eponymous protagonist, discovering the world of books and the ideas in them, early on happens on the work of Herbert Spencer, which--along with his later discovery of the work of Friedrich Nietzsche--becomes a profound if ultimately destructive influence on him.
In that, as in so many other ways, the book had a very contemporary ring.
That may seem odd, as Spencer and Nietzsche hardly enjoy the same vogue today that they did in London's time. But they stand in a tradition that endures--the radical right's hostility to egalitarianism, with Social Darwinism, nihilism and the rest invoked in opposition to liberalism and the left. And Eden's discovering those ideas so early on, and being influenced by them, should seem obviously contemporary to anyone looking at the "alt-right" and related phenomena today.
I cannot credit London with saying much about why Eden came across those thinkers and ideas that held him so spellbound--but the question of why this happened seems easy enough to answer. Simply put, the ideas of even the extreme right are given a publicity that the ideas of the left are not. Often they are lent a fair amount of prestige through their having rich, powerful champions. Even when the publicity is not wholly positive it tends to be somewhat respectful, and even when less than respectful it still creates a wider awareness of them, and may promote interest in them, in a way that it does not with leftist ideas. And so a young person who starts looking for answers is far more likely to encounter them than the ideas of the left (certainly if we are speaking of the actual left, and not the conservative centrists whom our lobotomized political discourse refers to as the left), as endlessly demonstrated by how much more easily a young person finds their way to, for example, Ayn Rand or Jordan Peterson than to socialism.
It may also be that for a young person looking for answers severe, elitist ideas like the ones discussed here have some attraction. The fact that they are looking for answers, after all, is likely to bespeak a measure of dissatisfaction--with the society in which they find themselves, with the people around them . . . with all that means in a society which speaks endlessly of equality, democracy and so forth. (That society may in fact be extremely unequal, its democratic pretensions threadbare at best, but if, as is often the case, they take the rhetoric seriously they are easily persuaded that this is part of the problem.) The idea that there is rightfully an elite may not be unappealing, even when they are clearly not part of it--telling themselves that they really belong in such an elite, that they have somehow been deprived of their rightful place by those from whom they feel alienated, the "herd" somehow responsible for their being down here rather than up there where they ought to be. Indeed, the very severity of the views can fit in well with all of this, their adopting "tough-minded" ideas seeming to them proof that they are themselves "tough," unlike all those others spouting what strikes them as namby-pamby claptrap--and therefore that they belong to that circle above and better than all the others, because they see life as it is--even as, their situation being none too estimable at the moment, they entertain hopes for what they might become, attaining to their rightful place. (This may go especially for those who have found delusions of grandeur a necessary coping mechanism for their hard actual circumstances, or for the challenge of going out into a world turning a very hard and ugly face toward a young person of modest background--at least until they learn just how rough the going will actually be, how irrelevant their own estimation of themselves.)
Thus did it go with the would-be superman Eden, disdainful of the "herd," and intent on winning his way to some place higher and better--who only very late, too late, realized that for all his very many and real gifts he was nothing of the kind, and just when he should have reaped the rewards of his struggles, threw everything away.