Thursday, April 25, 2024

What's Happening with Boxoffice Pro?

For quite some time I have found Boxoffice Pro's comprehensive forecasts of upcoming and recently released films' grosses and their systematic updating of great interest, especially in these times in which so many movies surprise us amid what may be historic changes in the business of film--big franchise films crashing and burning (like The Flash), or unlikely-seeming movies becoming colossal hits (like Oppenheimer), as Hollywood confronts the possible end of the high concept-dominated post-New Hollywood era.

Alas, for some weeks now the forecasting section of the site has been much reduced, with just a handful of long-range (or even weekend) forecasts appearing this past month. (Thus the forecast for The Fall Guy that I cited has not had a single update.)

I imagine that the impoverishment of the forecasting section is a temporary situation. (After all, Boxoffice Pro is a major film industry trade publication in circulation since 1920, before "movies" became "talkies.") Still, I have no idea when things will get back to normal that way--and can only look forward to when they do so.

Daniel Bessner's New Article in Harper's

Daniel Bessner's "The Life and Death of Hollywood" is exactly what journalism ought to be but all too rarely is--a deeply informed piece of reporting that goes beyond mentioning some facts or passing on some impressions to putting together a picture of its subject and explaining how things stand to the reader, in this case with an analysis of Hollywood's development in recent decades, especially as seen from the standpoint of its writers, benefiting greatly from its combination of critical perspective, historical background and attentiveness to contemporary economic fact.* In this piece Bessner quite rightly explains Hollywood as undergoing its own piece of the broader neoliberal turn, which has affected it in the same way that it has affected the rest of America's economy and society. Just as has happened elsewhere, the convergence of deregulation (in this case, particularly the collapse of antitrust enforcement under the Reagan administration, and the Telecommunications Act of 1996 under Clinton) with creditist monetary policy produced an extraordinary concentration of ownership, production and market power, and a triumph of speculation and short-termism, that has had catastrophic effects for the workers involved and society at large as the game became one of "winner take all." Thus did the early '00s see a half dozen vertically integrated giants (Disney, Time Warner and the other usual suspects) "raking in more than 85 percent of all film revenue and producing more than 80 percent of American prime-time television," while the ultra-loose monetary policy Ben Bernanke ushered in enabled a mere three asset-management companies (BlackRock, Vanguard, State Street) to concentrate in their hands the ownership of, well, nearly everything ("becoming the largest shareholders of 88 percent of the S&P 500"), with the media giants no exception to the pattern. (As of the end of 2023 Vanguard "owned the largest stake in Disney, Netflix, Comcast, Apple, and Warner Bros. Discovery," while also having "substantial share[s] of Amazon and Paramount Global.") Indeed, even the talent agencies supposed to represent the writers to those companies similarly became an oligopoly of a few giants in turn owned by asset management firms.

In the wake of the reduction of film and TV production to an oligopoly owned by a few financial firms, and the inevitable ascent of a pseudo-efficiency-minded short-termist control freak mentality among studio executives, it was no accident that, in line with his citation of Shawna Kidman, the share of "franchise movies" in "studios' wide-release features" surged from 25 percent in 2000 to 64 percent in 2017--with the "multistep deals" and "spec scripts" of an earlier age increasingly things of the past, as exemplified by the crassness of Disney, and perhaps especially its Marvel operation, which "pioneered a production apparatus in which writers were often separated from the conception and creation of a movie's overall story." (Thus did the writers have to generate a whole season of scripts for WandaVision without knowing how the season was supposed to end, because the "executives had not yet decided what other stories they might spin off from the show." Thus did Marvel bring the TV-style "writer's room" into filmmaking--if one calls the result filmmaking--even as the TV networks replaced their writer's rooms with "mini-writer's rooms" offering those who work in them still less.) Amid the declining readiness to support the development of many ideas in the hope that one would pay off (thirty scripts to get one which might be made into a movie in the old days), amid the declining creative control and bargaining power of writers ever at a disadvantage in relation to the executives whose media courtiers celebrate them as "geniuses" (as with a Kevin Feige), amid the studios playing the old game of exploiting technological changes to contractually cut writers out of any share in the profits (in this case, streaming and its revenues, as Bessner shows through the case of Dickinson creator Alena Smith), writers lost in just about every way. Moreover, the post-pandemic economic shock has dealt them another blow as the media companies that had, however exploitatively, been funding a lot of production retrenched, and all too predictably won the labor battle of 2023--at the end of which they may be said to have imposed an at best thinly veiled Carthaginian peace, with what is to come likely to be worse.

Given all that, Bessner naturally declined to end his story on a note of false optimism, instead going from the "Life" to the "Death" part of his title. As he notes, one could picture various palliatives helping, like government regulation of asset ownership (or indeed, the mere application of antitrust law here), but all this seems unthinkable--precisely because what is happening here is so much of a piece with what is happening in all the rest of the economy, and it is all too clear how that is going. Indeed, one can picture things becoming worse still in the fairly near term if the Hollywood executive class manages to realize its fantasies of using artificial intelligence to dispense with its workers entirely in even a small degree--or, even in spite of the technology's inadequacies, simply tries and fails to do so, but destroys a lot of careers and "disrupts" the industry in all the wrong ways in the process.

* Reading Dr. Bessner's biography I found that he is not a reporter but a professor of International Studies--and cannot help wondering if that did not make all the difference for the quality of the item, especially where the avoidance of the hypocrisy of "objectivity" and the contempt for context that characterizes so much contemporary journalism are concerned.

Upton Sinclair on the Greeks

Discussing the literary legacy of the ancient Greeks in Mammonart Upton Sinclair asks just "how much do we really admire Greek literature and Greek art, and how much do we just pretend to admire it?" Recalling Samuel Johnson's quip about a dog walking on two legs, and how "it is not well done, but we are surprised that it is done at all," he suggested that the conventional exaltation of the Greeks is based not on a clear-headed judgment of the poetry and drama by modern standards, next to which these works are technically crude and representative of a world-view so unacceptable to modern people that as a matter of course we can seem "in denial" about it (far from Greek art being an art "of joy and freedom," their theme was invariably "the helplessness of the human spirit in the grip of fate," and at that, a fate which destroyed them), but rather a matter of "superstition" that has been "maintained by gentlemen who have acquired honorific university degrees, which represent to them a meal ticket," and the snobbery to which they cater (Classicism itself having become so "leisure class" by his time as to make "Homer . . . to the British world of culture what the top-hat is to the British sartorial"). Indeed, in Sinclair's view those who (like a Matthew Arnold or William Gladstone) "write volumes of rhapsody about Homer" testify less to Homer's superlativeness as a poet than to his works' accordance with their prejudices--his "ha[ving] the aristocratic point of view, and giv[ing] the aristocratic mind what it craves," namely a vision of life in which aristocrats "unrestrained in their emotions and limitless in their desires" act out and are flattered in their self-importance by seeing the gods themselves take an interest in their personal tales, all as, one might add, they generally sing conservatives' militarist, patriotic values.

This is not to say that Sinclair contemptuously dismissed it all (let alone was in any way a promoter of the kind of "cancel culture" that today seeks to "cancel" the Classics along with so much else). If Sinclair stresses the Greek pessimism he thinks too much overlooked, he is historically-minded enough to acknowledge, as others of like ideology have done, that to the ancient mind the world easily seemed "a place of blind cruelty, the battle-ground of forces which he did not understand," and that out of this it "made . . . a philosophy of stern resignation, and an art of beautiful but mournful despair"--and not to confuse all this with the "dispensation of official pessimism" in more enlightened modern times (as with the apparent "'Classical' attitude" of "pathetic and heroic" "resignation to the pitiful fate of mankind on earth" of the aforementioned Arnold), which he saw as coming from quite other sources. Indeed, for all the limitations of that ancient mind Sinclair owns to pleasure in being able to see for ourselves "the beginnings of real thinking, of mature attitudes toward life" in their early writings, and much else besides. If Sinclair finds much in Homer ridiculous and repellent, it seemed to him that one still did find "beautiful emotion" there--albeit not that which moved lovers of the "heroism" of an Achilles, but that of "the mothers and fathers, the wives and children of those heroes [who] express for them an affection of which they are unworthy." He finds much to admire in the satire of Euripides, "jeering at militarism and false patriotism, denouncing slavery and the subjection of women in the home, rebuking religious bigotry, undermining the noble and wealthy classes." Yet, if it seemed to Sinclair appropriate to qualify one's criticisms, and to grant that the works have their worthwhile aspects, his judgment stands: the exaltation of the Classics in the familiar way is above all else a matter of superstition and snobbery.

Reading Sinclair it seems to me that there is an enormous amount of truth in this statement--which, indeed, ought to be evident to anyone who actually tries looking at those writings with clear eyes, as others have done, not all of them inclined to see things his way on most issues. (The critic John Crowe Ransom, who as one of the twelve writers of the Southern Agrarian Manifesto was far, far removed from Sinclair politically, seems to me to have grasped very well, and put even more poetically and succinctly, just how remote the Greek view of life, or at least the early Greek view, was from one of "joy and freedom.") Indeed, present a professor of literature today with the hard facts of just what is in "this stuff" and you are likely to get as their answer a sort of embarrassed agreement that reminds you just how much the Canon, and indeed what we say about the Canon, is the product of timid deference to received judgment, which one is expected to, in the words of Oscar Wilde, "endure . . . as the inevitable." And so the superstition and snobbery of which Sinclair wrote significantly prevail a century on--qualified, of course, by the decreasing extent to which anyone is paying attention to the Classics or to the humanities or to culture at all.

Alex Garland's Civil War: Some of the Critics' Views

Recently remarking on Alex Garland's Civil War, the movie theater trade publication Boxoffice Pro characterized the film's promotion as a "bait-and-switch marketing campaign that sold opening weekend audiences on the promise of an action movie" while "delivering a bleak drama that focuses a lot more on journalism instead."

It seems to me that many would regard this as not the only piece of bait-and-switch at work in the film, or even the most important one. As Forbes' Erik Kain remarked, the movie Civil War, in the publicity for which even the "action movie" aspect was arguably less important than the prospect of a drama about American political divisions (there are lots of action movies out there, not so many political dramas with a "high concept" draw like that one), ended up not really being about those divisions, Garland studiously "avoid[ing] the politics of the day in order to tell a more universal story." Kain's view of this approach was favorable, but others have been less impressed (finding in Garland's approach less universalism or "neutrality" than thoughtlessness and evasion and maybe worse in a piece of what one critic has described as a "gutless" cinematic "both sidesism").

However, whatever one makes of it, audiences would seem to have been pulled in by the offer of one thing, and given another. This weekend's gross will likely say something about the audience's ultimate reaction to that.

Sunday, April 21, 2024

The New York Times, Right-Wing Publication

Recently reading the Columbia Journalism Review's analysis of pre-2022 midterm coverage of domestic affairs by the New York Times and Washington Post I was unsurprised (indeed, confirmed in my views) when they quantitatively demonstrated those papers' preference for politics to policy, and the rarity with which anything published in them actually attempted to explain the affairs of the day to the general public.

I was more surprised by the editorial contrast between the Times and its Washington-based counterpart. The Review article found that, if both papers were more prone to attend to the issues that Republican voters care most about than to those Democratic voters care most about, the Times was significantly more prone to do so than the Post. By the analysts' count the Times favored the Republicans' concerns by a margin of over 5 to 1 (37 to 7 pieces, respectively), as against the 4 to 3 margin seen with the Post. It is an astonishing predominance, even in comparison with that other newspaper, one that can seem to bespeak the political shift of the newspaper that publishes Ross Douthat, Bret Stephens and Christopher Caldwell (to say nothing of figures like Thomas Friedman).

Media Biases: A Quantitative Analysis from the Columbia Journalism Review

Some time ago I extended my analysis of political centrism to the mainstream news media and argued that much of its conduct--and in particular much of the conduct that many view as problematic--is explicable in terms of centrist ideology. This particularly included the matters of the media's preference for politics to policy, and its declining to attempt to explain complex issues to the public.

As it happens a November 2023 piece in the Columbia Journalism Review setting out the results of a quantitative examination of the pre-2022 midterm coverage of domestic affairs by the New York Times and Washington Post (from September 2022 to Election Day) was consistent with exactly that reading. The analysts found that, apart from their both emphasizing campaign "horse race and . . . palace intrigue," only a very small proportion of the items had any policy content to speak of, particularly of the explicatory type. Of 219 front page stories about domestic politics during this period in the Times, "just ten . . . explained domestic policy in any detail," while of 215 stories in the Post just four "discussed any form of policy"--working out to less than 5 percent in the former case, less than 2 percent in the latter case. Indeed, the Post did not have a single front-page story about "policies that candidates aimed to bring to the fore or legislation they intended to pursue" during that whole critical period, opting instead for speculative pieces about the candidates and "where voter bases were leaning."

Remember, these are the most respected mainstream publications--the "papers of record"--the kind of publications that many of those complaining about "fake news" tell us to attend to instead of the rest of what we are apt to see online. Considering that fact, one can only judge these publications to be themselves a considerable part of the problem with regard to the deficiencies of the public's understanding of current events--with, indeed, the preference for politics over policy and the extreme paucity of explanation just the tip of the iceberg.

What is a Superhero? An Attempt at a Definition

What is a superhero?

It does not seem unreasonable to define the term. Granted, a certain sort of person dismisses such attempts as bound to fail given that exceptions will always exist to any definition--but the existence of ambiguities does not, as such, keep definition from being a useful, in fact essential, tool for making sense of reality. (The map may not be the territory, but that does not make maps useless, no matter how much contrarians addicted to stupid epistemological nihilism natter on.)

A good starting point for such a definition is an examination of those criteria that may seem least dispensable to the conception of the superhero as we know them today--the superhero tradition as established in the comic book medium, especially from the advent of Superman forward.

1. They Must Be Both a Hero--and Super.
A superhero must obviously be a hero--a figure that does something both positive and extraordinary such as is worthy of admiration by others. Where the superhero's sort of heroism is concerned there is an expectation of service to the community of some physical kind, usually entailing courage in the face of physical danger, to save the lives of others because it is the right thing to do rather than a matter of self-interest; with such action a vocation rather than an exceptional incident. Of course, people pursue vocations that have them performing such acts--firefighters are the go-to example--but superheroes are distinguished from them in that they are super, their abilities exceeding the human norm in a way which is quantitative or qualitative or both. (Conventionally the firefighter running into a burning building must make do with "merely" human physical and mental capacities, but a superhero might be able to put out the fire with a blast of icy breath, as Superman can.)

2. They Have a Very Distinct and Very Public Persona, of which Distinct Powers, Codenames and Physical Appearances (Usually but Not Always a Function of Costuming) Tend to be a Part.
The above seems to me mostly self-explanatory. In contrast with, for example, heroes who are secret agents, who are likely to conceal their "heroic" aspect behind cover identities as they set about their work, the superhero's cover identity, their public persona, is itself heroic--and indeed advertised to the world at large, the more in as its elements include the aforementioned matters of a power, or powers, that tend to be very individualized; a similarly unique codename; and a unique appearance, usually but not always derived from a costume. (Barry Allen is a hero made a superhero by his application of "speedster" capabilities such as even other DC superheroes tend not to have to fighting evil; is, appropriately, known as the Flash, after his speed; and wears a red costume with lightning bolt insignia, also evoking that speed. Other figures such as Marvel's The Hulk and the Thing have distinct powers and codenames, and their distinct appearance--albeit less reliant on any costume in their case.)

3. The Superhero's Activity is Highly Individualistic.
As implied by the exceptional character of the hero's capacity, and their conspicuous public persona, the conception of the superhero is individualistic in action. Consider, for example, the firefighter and secret agent both. Both figures are conventionally employees of a public agency, which pays them money to do a job in line with orders--which does not make a sense of duty irrelevant by any means, but still raises the question of the paycheck, career, etc. in ways that unavoidably give their activity a different texture. Both also rely on their organizations for their ability to do their jobs--unlikely to have equipment or other supports necessary to their hero task if they were not so employed. By contrast the superhero is ordinarily alone, answering to no one--taking no orders, and often having a complex and fraught relationship with authority (epitomized by stories in which Batman falls afoul of the law), all as what equipment they need is their own (one reason why superheroes are so often independently wealthy, as Batman or Iron Man is).

Moreover, when we see superheroes "team up," teaming up is exactly what they do--these individuals cooperating in a joint effort rather than relinquishing their identities to be good "organization men" and women. (Iron Man is first and foremost Iron Man, and never "just" an Avenger.)

Of course, much else tends to go with this. The public persona is often a way of concealing a private identity, and at least attempting to protect a private life. And of course there is apt to be the unusual origin story--for superhuman abilities, and the decision to put on a costume and fight a private war against crime or some other such evil, are the kinds of things for which most people expect an explanation, which is likely to be extraordinary because of what has to be explained. (Thus is Superman literally from another planet, while Batman has been motivated by childhood trauma and equipped by a lifetime of preparation for his vigilante mission, aided by vast wealth as well as extraordinary talent.) One may add that the superhero almost always faces a supervillain at some point--because any other sort of villain is a less than worthy adversary. (How long would Superman remain interesting just catching small-time purse-snatchers?) However, those three items seem to me to constitute the indispensable minimum--with any character not meeting all three falling outside the category. Those criteria, I think, enable us to distinguish between superheroes and non-superheroes of various kinds without bounding the category so narrowly as to deprive it of analytical usefulness, and permit distillation as the following:
A superhero is a figure who, acting on their individual initiative and resources, and through a distinct public persona apt to entail codename and (usually costume-based) appearance, makes a vocation of defending the public from physical dangers such as accident, crime and "supervillainy," usually in a way requiring physical action and courage on their part, and drawing on abilities and/or equipment endowing them with more than ordinary human physical and/or mental capacities.

'90s Nostalgia-Mining and the Scandals of that Era

Amid the exploitation of memories of the '90s by the pop culture industry these past many years some have seized on the scandals of the era for material. Thus did we get a feature film about the Tonya Harding scandal, and FX miniseries about the Bill Clinton impeachment and the O.J. Simpson trial.

I have no idea how much of a public response they really drew. What I can say is that these scandals absolutely do not make me nostalgic for the '90s. Quite the contrary, unlike, for example, 16-bit-era console gaming, the sci-fi shows of the era, the golden age of The Simpsons, or Baywatch, or any number of other things which really do make me feel nostalgic, what they bring to mind is the horror and disgust I felt when I looked away from the day's more amusing pop cultural products at the state of the real world, and the news media that brought it to us, the vileness of which played its part in the world's going from one catastrophe to the next.

Hiram Lee on O.J. Simpson

In the wake of the media response to O.J. Simpson's arrest back in 2007 Hiram Lee published a piece titled "The Media's Obsession with O.J. Simpson" (emphasis added).

As the title of Lee's item indicates the piece was about how the media, not the public, was obsessed with Simpson, even as with extreme stupidity and sanctimoniousness the media's talking heads relentlessly insisted that it was the public that was obsessed, and that had forced the media against its will to attend to the matter to the neglect of all the rest of what was happening in the world. (As Lee wrote, the "anchors and pundits . . . occasionally pose the question: 'Why are we so interested in O.J. Simpson?' . . . lament the drawn out and salacious" coverage, and "[t]hen, with feigned regret . . . return to the tawdry story at hand.")

As Lee remarked, those in the media who did so "attribute[d] their own shameful behavior to the supposed demands of a coarsened, celebrity-obsessed audience," while totally eliding "their own role in cultivating and directing such attitudes toward celebrity culture."

An obscenity in 1995, their behavior was more obscene still in 2007. Contrary to the celebration of the era as one of peace and prosperity, the actual '90s were very troubled years, and anyone of even slight intelligence knew it. (Indeed, as one of Mr. Lee's colleagues put it, "Despite the official triumphalism, America was coming apart at the seams.") However, the situation in 2007 was graver still amid the war in Iraq, and the first signs of a historic financial crisis, in the shadow of both of which we have lived ever since.

Talking about O.J. was a way of diverting public attention away from that, to say nothing of more broadly stultifying the public mind--and if some of the public went along with it that did not change the fact that it was the media, not the public, driving this particular piece of idiocy.

Remember that as the remembrances of the trial in which the media is now awash speak of us all having been "captivated" by the trial.

Saturday, April 20, 2024

Harry Turtledove's Ruled Britannia: A Few Thoughts

For quite a number of years I was an avid reader of Harry Turtledove's alternate histories--running out and grabbing the latest installment in his Timeline-191 series when it became available year in, year out. I enjoyed his rigorous working out of his scenarios, which made the basis of his story a genuinely compelling counterfactual, in contrast with the flimsy "What ifs?" on which so many of his colleagues have relied. I also enjoyed the "big picture" emphasis of his narratives, presented through his large but still manageable and strategically arranged casts of viewpoint characters, and the briskness of those narratives, which seemed to have been worked out mathematically but effectually (Turtledove cutting among twenty viewpoint characters, each of whom got six four-page scenes by the end of the volume).

Turtledove's Ruled Britannia was a very differently structured book, less oriented to the big picture, and more narrowly focused on a mere two characters rather than twenty--in this case, William Shakespeare and Spanish Golden Age playwright Lope de Vega, brought face to face by a successful Spanish conquest of Britain in 1588 that has made de Vega a soldier of the occupying Spanish army who, because of his predilection for the theater, is tasked with keeping an eye on a Shakespeare who has been drawn into a plot by old Elizabethan loyalists to stage a rebellion that will drive the Spanish out. The comparatively original premise (neither Civil War nor World War II!) intrigued me, as did the promise to depict these actual historical figures in this altered timeline, an approach which has always appealed to me. (Of the Timeline-191 novels my favorite was and remains How Few Remain, precisely because of its stress on actual personages.) Still, I wondered if Turtledove would manage to hold my interest with this approach through the nearly five-hundred-page narrative.

For my part I thought the book longer than it had to be, and I could have done without a good many bits. (I think we saw more of de Vega the self-satisfied womanizer than we needed to, for instance, and did not care for the use of Christopher Marlowe either, or for that matter the final confrontation between these two figures, which seemed to me entirely pointless.) Still, in spite of the unnecessary or bothersome patches Turtledove pleasantly surprised me by carrying this more focused narrative. In doing that it helped not only that Turtledove displayed some adroitness in developing the cat and mouse game between Shakespeare and the Spanish occupiers, but that Turtledove presents both Shakespeare and de Vega as human beings rather than pedestal-placed literary titans--Shakespeare in particular a man whose talent may have marked him out for greater things but who is, in the here and now, a jobbing actor and writer trying to make a living, thrust less than willingly into high intrigue. Particularly commendable was Turtledove's not shying away from the difficult task his own plot presented him--taking seriously Shakespeare's enlistment to produce a propaganda play in aid of the rebellion (about the life of Iceni queen Boudicca), and letting us see just enough of it dramatized at the climax to give the project on which the whole plot rests some solidity. The closing lines of the play ("No epilogue here, unless you make it/If you want freedom go and take it") struck me as a bit more Brecht than Bard, but on the whole the pastiche worked, with that close entirely logical in the circumstances, and all this to the good of a climax, and denouement, that drew all the narrative strands together in very satisfying fashion.

On Cryptohistory, the Paranormal and William Shakespeare

One of the oddities of contemporary culture is how persons who ordinarily find history dull suddenly become attentive when someone mentions "aliens"--in the sense of extraterrestrials having been a part of it.

The explanation of this seems to me to be that those persons' interest is still not in history, but in the possibility that the claims of extraterrestrials having visited Earth have been validated.

But that raises the question of why precisely they should care to prove that such visitations have happened. Why should so many people be invested in this?

About that I am not at all clear, but I have noticed a similar interest in much more down-to-Earth, less world-shaking, subjects, such as the possibility of the authorship of William Shakespeare's plays by someone other than William Shakespeare. It seems that, just as with history, people who are not normally interested in Shakespeare, and not knowledgeable about Shakespeare, get interested when, for example, Edward de Vere or Francis Bacon or Walter Raleigh or some such figure is mentioned as the real author of his works. The result is that if some are quite invested in the controversy and make detailed claims on the basis of Shakespeare's plays (all the evidence is indirect in nature), a significant number of people far from capable of making or appreciating such arguments still find interest in the essential claim--which comes down to people being fascinated by the thought that productions they have not read or seen or cared about, long attributed to one historical figure of whom they know next to nothing, are actually attributable to another figure of whom they likely know even less.

Considering both obsessions it can seem symbolic that Roland Emmerich, who made the aliens-visited-Earth-in-the-past movies Stargate (1994) and Independence Day (1996), later directed a movie dramatizing Edward de Vere's writing the plays and using Shakespeare as a front, Anonymous (2011). Didn't see it? That's okay, pretty much no one did, and it would not seem that you or they or anyone else missed much in not doing so--certainly to go by David Walsh's take on the film some years ago (which, as may be expected by those who have read Walsh at his best, is most certainly attentive toward and insightful into the pseudo-controversy over the authorship of Shakespeare's works, which he went so far as to follow up in a second item on the matter).

"All Art is Propaganda," Somebody Said, Somewhere

Looking back it seems that many an early twentieth century literary great claimed that "All art is propaganda." George Orwell seems to be the one most associated with the phrase, which he states in his 1940 essay on Charles Dickens. However, Upton Sinclair said exactly the same thing in exactly those words in his epic 1925 history of art, Mammonart, with which Orwell might have been familiar given Orwell's praise of Sinclair's work.

That far more people seem to associate the phrase with Orwell than with Sinclair, I suppose, reflects the greater respectability of the former than the latter these days. Like Orwell Sinclair shifted away from his earlier political stances, but Sinclair's greatest work, fiction and nonfiction, is generally associated with his time as a committed socialist, whereas Orwell is best known for, and celebrated for, producing a work that, in spite of his much more complex attitudes and intentions, came to be regarded as the supreme piece of Cold Warrior literature, overshadowing all the rest of his work--which has been all the better for his memory given the prejudices of the tastemakers.

Jane Austen and Brazilian Jiu-Jitsu

As I remarked once, for me the most worthwhile passage in Jane Austen's Pride and Prejudice is the satirical dialogue about the idea of the "accomplished young lady."

Such accomplishment--if one goes by Thorstein Veblen's theory of the leisure class--is merely the leisured showing off their greater resources, and the supposed greater prowess which affords them that leisure, by having their young ladies display costly and time-consuming training in skills of no practical value in their lives in a supposedly masterful fashion, announcing to the world that "This is why we are here and you are there, churls!"

Just as much as ever we are today bombarded with claims of such upper-class accomplishment (not least by way of pop culture). Still, the precise skills in which today's young lady, and gentleman, are expected to display "accomplishment" are different, with one example being the martial arts. I do not in the slightest deny the validity of training in those arts for purposes of exercise, sport, self-defense and much else--but if we are to be honest the reason we are so often (falsely and stupidly) given the impression that "everyone" is a black belt holder of some kind (preferably something fashionable at the moment) is because in the United States such training has acquired an association with upper classness, and this because of the sustained commitment of significant disposable time and money required to complete such training, which working persons are unlikely to have, with this reinforced by the way that unthinking conformists imitate what better-off people do, and of course, many, many, many more lie about doing so, and about their accomplishments and abilities in the course of that. (If "everyone" is a black belt, then who wears all the other belts?)

Otherwise we would not see thriller writers get away with such nonsense as having their heroes fight off a foe with a "judo kick."

In What Does Middle Classness Consist?

We hear about middle classness all the time, usually from commentators desirous of dissolving the reality of inequality in an image of generalized middleness. One approach to such dissolution is judging the matter not on the basis of material criteria actually having to do with the terms on which people live, but instead on educational levels, or the "values" people purport to espouse, or even self-identification (i.e. "If you think you're middle class, then you are!"). When they do acknowledge material criteria they often equate middle classness not with the income requirements permitting life at a middle class standard demarcated in some fashion (never mind the standard most seem to actually identify with middle classness), but with the middle of the range of the income distribution, a very different thing that generally constitutes a far lower bar (depending on the context, one can be mid-income while being much less than middle class in any meaningful sense), in yet another shabby evasion of the sharp edges of social reality.

I tried to do better than that in my working papers on the subject. You can find these here.

The Obscurantism of "Genius"

I recall encountering a lengthy discussion of the concept of "genius" in a popular news magazine a long time ago (TIME, perhaps).

The author of the piece chalked up the desire to believe in "genius" to a romantic desire for transcendence.

One may well grant an element of that existing within the contemporary cult of genius, but it seems to me far from being the whole of it. More important, I think, is the contradiction between the complexity and scale of modern life, and the prevailing individualistic intellectual and emotional attitudes.

Consider, for instance, scientific life. The scientific endeavor is old, vast, collective in spirit--sociologist Robert Merton, indeed, seeing the collectivist attitude toward scientific knowledge as in fact one of the key elements of the scientific ethos. Scientists build on the work of predecessors in conjunction with their colleagues, such that the individual efforts all flow into a common stream so much larger than any of them that it can seem foolishness to worry too much about whether this or that drop came from that particular tributary of the great river--with the depth and intricacy of the collective, collaborative, aspect getting only the more conspicuous to go by the sheer number of authors on single scientific papers today (a function of just how painfully specialized the work has become). However, a culture accustomed to individualism, indeed vehement about explaining results in terms of individual achievement, individual choice, individual contribution, is more likely to stress single, towering figures who, because so much more is credited to them than any one person ever actually did, or for that matter probably could have done (especially insofar as all the others who helped lead up to them fall by the wayside), can only seem superhuman, magical, in a word, "geniuses." Thus in a common view physics had that Isaac Newton guy who did it all pretty much by himself--and never mind those giants on whose shoulders he supposedly stood. And then not much happened until that Albert Einstein guy, who singlehandedly vaulted us into the relativistic era with a few papers. And so forth.

The tendency obscures rather than illuminates--which is plausibly just fine with those who prefer to see humanity as consisting of a tiny elite of superhumans who accomplish everything and a vast mass of dross who owe that elite everything and should accordingly be groveling before them in the dirt lest Atlas decide they are not worth the trouble, and shrug. Naturally the word is bandied about much in our time--and never more than in the case of persons who, one way or another, seem to amass a lot of money (in a reminder of what, infinitely more than knowledge, is really valued in this society).

Thus was Ken Lay a genius. And Jeffrey Epstein. And Sam Bankman-Fried. And Elizabeth Holmes. And many, many others just like them. In the haste to acclaim such persons the real reasons for the desperate attachment of persons of conventional mind to the concept become all too apparent.

Tom Clancy's Debt of Honor and Japanese Work Culture

Like the other major '90s-era American thriller writers who took up the then highly topical theme of Japanese-American relations, Tom Clancy presumed to "explain" Japan to his American audience in Debt of Honor, retailing the clichés of the "experts," though perhaps less admiringly than others. In his account Japan's culture "demanded much of its citizens" in a way supposedly alien to even the Japanese-American character (Chet Nomuri) through whose eyes Clancy showed his readers the country, with "[t]he boss . . . always right," the "good employee" the one "who did as he was told," and getting ahead requiring one to "kiss a lot of ass" and go the extra mile in regard to the display of loyalty to the employer ("sing the company song," come in "an hour early to show how sincere you were").

As is often the case with those who make much of "difference," all this was really much less "different" than Clancy made it seem--in America, too, the rules of getting ahead not so different. No more in America than anywhere else do bosses take kindly to being wrong, or to employees who do not do what they are told. And no less in America than anywhere else do "ass-kissers" get ahead, and those who refuse to play the game suffer for fighting to retain their dignity in the inherently degrading situation that is the personal subordination seen in the modern workplace.

Indeed, those familiar with what it takes to enter the more rarefied territories within the professions (like completing a prestigious medical residency), or the more prestigious firms of "Greed is good"-singing Wall Street and "Move Fast and Break Things" Silicon Valley--where many an employer is an insufferably smug idiot bully who considers themselves entitled to put anyone desirous of a position through sheer hell for the privilege of making money for them, or simply of being able to put an internship with their firm on their resume--will be less impressed than Clancy was by what he thought he knew about how different Japan was with regard to its exploitativeness and destructiveness of the minds and bodies of the ambitious "go-getter" that everyone is expected to take for their ideal.

Compounding the irony, this is generally to the approval of those who see things Clancy's way.

Are Bloggers Wasting Their Time Tinkering with Their Blogs in the Hopes of Getting More Readers?

I suspect at this stage of things that most of those who started blogging have been less than thrilled with their results--certainly to go by the vast number of bloggers who quit the endeavor, often after just a little while. They were given the impression that what goes online is necessarily seen by vast numbers of people, that they would not be crazy to expect the things they put up to "go viral," that they could end up web celebrities.

Instead they mostly found themselves ignored--and when not ignored often insulted. Indeed, many may have been disappointed to find that rather than persistence paying off they have got less and less result with time--finding the search engines less likely to index their posts than before, seeing less and less evidence of actual humans reading their blog by leaving a comment or posting a link, as they suspect that what page views they get are spam, bots, and the rest.

There is no great mystery in this. The reality is that online the ratio of people looking for attention is extremely high relative to those ready to pay attention. The reality is that the great majority of Internet usage is passive, with any kind of engagement a rarity (very, very few of those who do find anything likely to share it with others). The reality is that, where the blogger especially is concerned, the Internet has been moving away from full-bodied text toward shorter-form content (like microblogging) and audiovisual experience (the vlogger rather than even the microblogger). And as if all this did not suffice to mean that things were going from bad to worse the game is entirely rigged at every level in favor of big names, big money, those with legacy media connections, and the other advantages.

Alas, these obvious realities are something few are ready to admit--instead what prevails is a stupid meritocratic-aspirationalist view that treats every outcome as the result of a tough but fair test of individual talent and effort. And even those who can see through that stupidity often do not simply shrug their shoulders, but strive instead for a better outcome. One way is by trying to get search engines to treat them more favorably (indexing more of their items, ranking them more highly in search results, etc.), the more in as many a would-be advice-giver claims to know what will do the trick.

Should they post more frequently, or less frequently? Should they go for longer posts, or shorter ones? Are there too many links in their posts, or not enough? Is it a mistake to have so many internal links--for instance, posts that provide handy collections of links to past posts? Should they be weeding out old posts that perhaps did not get looked at as much as others? And so on and so forth. They will hear a lot of advice (typically vague yet completely contradictory advice, supported by no evidence whatsoever)--and if acting on it, spend a lot of time on things that have nothing to do with actual blogging.

My experience is that all that hassle will not accomplish much for most of them, for many reasons. My personal suspicion is that the infrequency with which search engines "crawl" your blog means that it will be a long time before you get any positive result, even if you really do achieve one, which seems to me doubtful. I suspect that the search engines are simply overwhelmed by the amount of content out there, and the efforts to manipulate it--to the extent that they are not catering to the sponsors, demands for censorship, etc. that doubtless tie up much in knots even beyond the direct effect of such corruption of search results (and that half-baked experiments with artificial intelligence are likely lousing things up further). The result is that they probably ought to spare themselves the trouble--and, if they really think their blogging has any meaning at all, get on with that instead.

Never Underestimate the Suckerdom of Bosses

A while back Cory Doctorow rather nicely summed up what I suspect is (contrary to the past decade's Frey-Osborne study-kickstarted hype, and the post-OpenAI GPT-3.5 hype too) the real state of progress in applying artificial intelligence to the vast majority of tasks workers actually perform: "we're nowhere near a place where bots can steal your job," but "we're certainly at the point where your boss can be suckered into firing you and replacing you with a bot that fails at doing your job."

For the worker who gets fired, alas, that the bot they were replaced with fails to do their job is cold comfort. They remain fired, after all--and the odds of their getting a phone call asking them to come back (never mind come back and get compensated for what they were put through) seem to me very slim indeed.

Such is the reality of the relationship between power and responsibility, above all in the Market.

The Unusability of the Internet and the Taming of the Cyber-Space Frontier

Back in the 1990s there was a good deal of cyber-utopianism. I think of most of it as having been hype for the web and the tech sector, and for Silicon Valley, as well as broader "market populist" justifications for the neoliberal age by way of their fantasies about the information age--but all the same the idea was there that those who had been shut out, marginalized, denied a chance to speak might have a chance to make themselves heard here.

I suspect that most of those who acted on such premises eventually discovered the cruel lie for themselves: that the Internet was less radical in these ways than advertised, and became less and less so as time went on. That it is a medium whose usership is essentially passive, and carefully gatekept, so that Big Money, legacy media help and well-resourced supporters make all the difference in whether one's message is noticed or not, all to the advantage of the rich and powerful, and the disadvantage of the poor and weak. Now, with the Internet the mess that it is, one may imagine that, even more than before, the only way to get a message out through it is to put ever more resources behind it--while those who cannot, whatever they have to say, are seen and heard by virtually no one but themselves.

The End of the Cyber-Frontier

Back in the 1990s the cyber-utopians, reflecting their ideological background, seemed inclined to portray the Internet as a new frontier for the taking. (Indeed, Thomas Friedman, in the midst of lionizing Enron for its business model that he was sure was a perfect symbol of how the twenty-first century would belong to America, exalted its activity as part of a great "cyberspace land grab.")

Today there is no sense of anything like that online, the cyber-frontier long since closed--and, just as with the frontier in American history rather than the version of it the mythmakers continue to promulgate to this day, the result has been less a mass enfranchisement of the many than the further enrichment of an already rich few, as we are reminded by how much, regardless of how Jim Cramer rearranges the letters in his acronyms, a handful of giant firms, the same ones for rather a long time now, have turned what (falsely) seemed an unclaimed continent ripe for the taking into their private domains, and that ever more fully as such of the small freeholders as had made a place for themselves find their situation less and less tenable.

Nostalgia as Protest--and its Limitations

I suppose that where nostalgia is concerned we can speak of "pull" and "push." We may be pulled toward some aspect of the past by its apparent intrinsic fascination--but we may also be pushed away from the present by its repulsions, and this exactly why some frown upon nostalgia as they do.

Preferring the past to the present has historically been the preserve of the conservative and especially the reactionary, who really do want to bring back something of the past. But in a reactionary era the liberal might find comfort in looking backward. So have many liberals, faced with the '80s and all that came after, been nostalgic for the '60s.

It is a more awkward fit in their case. Those who believe in progress, if sincere about their belief, should expect that the best days lie ahead rather than behind them. Thinking otherwise is a reflection of the profound demoralization on their part that has played a far from inconsiderable part in moving the world further and further away from what they said they wanted it to be--while this nostalgia for the past, alas, has been far from the only expression of that sad state.

Of "Optimism"

"Optimism" has always struck me as a slippery word whose usage often betrays a lightness of mind. To be optimistic, after all, always means being optimistic in relation to a particular thing--and unavoidably, being pessimistic about another thing. (I am optimistic about how "our team" will do, which means I am pessimistic about how "their team" will do when we go up against them in the competition, for instance.)

So which do we focus on, the optimism or the pessimism?

As is the case with so much else in life, Authority tends to decide what counts as "optimistic," and of course they identify it with unquestioning faith in the status quo and unswerving conformism, pessimism with the opposite attitude--even though optimism about what exists tends to go with pessimism about much else. (Thus the classical conservative is a believer in the wisdom of traditional social arrangements, and optimistic about adherence to them--but pessimistic about human nature, reason, the prospects for a more just social order. Whether they are called optimistic or pessimistic, again, depends on who is doing the talking.)

Confusing matters further, in doing the above those in charge often promote what they like to call "optimism" as a broad mental attitude and a positive character trait, while treating the lack of such optimism--pessimism--as likewise a broad mental attitude, and a negative character trait of which one ought to be ashamed.

As if the way in which optimism and pessimism tend to be bound up, and one or the other emphasized in a shabbily manipulative way, were not enough, it is worth noting another great ambiguity within the term and its usage. There is, on the one hand, the optimism of the person who looks at a problem and says "I see a problem, but I think I can fix it." At the same time there is the person whose response is, instead, "I see a problem but I'm sure it will all work itself out somehow."

The former is the optimism of the problem-solver ready to work for a solution. The latter is the optimism of the person who thinks that they do not have to solve a problem--a response that can be genuine, but often bespeaks complacency, irresponsibility, callousness, in which case optimism is nothing but a lazy or cowardly dodge, with this still more blatant in the case of those who see a problem and then deny that they see any problem at all. These days, this, too, accounts for a very large part of what is said and done under the banner of "optimism."

Naturally we ought to be a lot more careful about how we use the word--and how we respond when others use it, not being intimidated or ashamed when they chastise us for failing in "optimism" according to whatever standard they presume to apply to us for their own convenience.

The Supposed End of Work, and the Drivel Spoken About It

A few years back, when the hype about automation anticipated more than glorified autocompletes, fashionable "thinkers" proclaimed the imminence of radical changes in day-to-day living. Completely misunderstanding the 2013 Frey-Osborne study of automation they thought we were looking at "the end of work" within a few years.

Thus did we have a little, mostly unserious, talk about how society could adapt to half of the work force being put out of a job, like what the relation of work to income would be.

Amid it all, of course, were those who assured everyone that work would still be part of people's lives--by which they usually seemed to mean alienated labor done in the market for an employer for wages, a presumably eternal arrangement whose psychological and "spiritual" value they insisted was salutary, claims the media generally treated with the utmost respect.

Of course, few had anything to say about the fact that the media was usually quoting elite commentators whose ideas about what "work" is were remote from the lives of 99 percent of the public, whose experience and thoughts were, as usual, of no account. Many of these--those for whom "work" has never ceased to mean the hell on Earth that we see in Emile Zola's Germinal, or Upton Sinclair's The Jungle, or Takiji Kobayashi's Kanikosen, and the many more for whom work means far, far less but still considerable suffering--would disagree with those "tech" company vice-presidents who think that their sipping a cappuccino on the G6 as they fly home from Davos is "hard work" about whether work is the uplifting and necessary thing they say we cannot do without, instead hopefully looking ahead to the possibility of a different world.

The Beginning of the End of the Cult of the Good School?

By "cult of the Good School" I refer to the profoundly irrational hierarchy that exists among institutions of higher learning in the minds of the conventional wisdom-abiding.

In speaking of that hierarchy as irrational I am not denying that schools differ with regard to the selectivity of their student intake, and the resources available to them for the fulfillment of their educational mission; or denying that that selectivity and those resources may make a difference in the quality of the education and/or training they impart to their graduates. Nor do I deny that these practical differences--or for that matter, the less rational reliance on institutional "reputation"--may make a difference to employers, with implications for their graduates' outcomes with regard to employment and income.

However, the truth is that very few making distinctions among those schools actually know anything about how the schools in question stack up against each other in even the more measurable, practical ways (for instance, the average quality of undergraduate instruction); or actually know anything about exactly what difference the school a student graduated from makes in the job market. Rather they just "know" that one school is "better" than others, with most carrying around an image of various strata in their minds--with the Ivy League "better" than others, and within that stratum Harvard better than Yale, all as private schools generally come in ahead of public schools, etcetera, etcetera, albeit with many exceptions all along the line. (Much as many desire to park their car in Harvard Yard, the science major will not be looked down upon for choosing the Massachusetts Institute of Technology instead--while Berkeley, UCLA, Chapel Hill, among others, in spite of being state institutions, are quite well respected indeed, more so than most private institutions.)

Those who think this way imagine that any young person of intelligence and ambition ought to aim as high as they can in the hierarchy--and if they do not, their elders, and conforming peers, think there is something wrong with them. Should they question those elders' and those peers' presumption they will not get an intelligent answer--this being "Just what people do."

The insistence on all this ultimately comes down to simple-minded snobbery, especially insofar as the data that even the more rigorous sift tend to be flawed and even corrupt. Meanwhile a good deal of longstanding statistical data, confirmed again and again by more recent studies, calls into question the idea of some great wage premium deriving from going to the elite institutions, or even of any inherent edge in employability. The "power elite" may disproportionately be associated with such colleges, but, even setting aside the fact that the advantages most of them had did not begin or end with the college they attended (where they are, after all, less likely to be going because they were exceptionally bright than because they were exceptionally privileged), the average is a different matter. Stupid writers for trash publications may make claims such as that one can major in somersaults at Harvard and still get a "good job," but as was reconfirmed by a recent study, your major is probably going to be a bigger determinant of whether you get a job in your line, and what it will pay, than the school to which you went.

However, this foolishness is now up against an era of tighter means, diminished prospects, and greater skepticism about the value of a college education generally, and about particular avenues to getting one--with more and more people weighing the actual dollars-and-cents return on their investment in a college degree, and finding that they will very probably be better off commuting to the nearest campus of their state's university system (maybe after racking up as many credits as they can at the still closer campus of their community college) than selling themselves into lifetime debt slavery for "the college experience" of which so many college presidents pompously speak at a Brand Name School. Of course, it is worth acknowledging that the people whose ideas are the mainstream ideas are the ones most remote from that situation of tightened means and diminished prospects, and have absolutely zero respect for the views of the majority. Still, the public's changing attitude seems likely, over time, to increasingly expose the crass stupidities of the Cult of the Good School for exactly what they are.

Does it Make Sense to Self-Publish a Book in 2024 (if Ever it Did)?

Considering the question that is the title of this post I think it worth starting with a bit of background (covered before in some degree, but worth revisiting). Specifically, circa 2010, while self-publishing was not new, the proliferation of the e-book reader and print-on-demand publishing, significantly by way of Amazon, made a good part of the self-publishing process much easier. Writing, editing, copyediting were as costly as ever. But the cost of physically reproducing and distributing the book fell dramatically. For nothing up front anyone could take a manuscript, upload it, and in days have it on sale at retailers all around the world. Meanwhile, if there was still the problem of promotion, there was optimism that the Internet could facilitate that at similarly low cost--and that things generally were looking up. It seemed easier to get people to take chances on 99 cent e-books than on more expensive print works, and e-books were supposed to be moving toward dominance of the market; while it was to be hoped that the self-published would develop ways and means to publicize themselves, and win their way toward acceptance. There were, for instance, notions of book blogs providing a promotional ecosystem for the self-published writer.

Alas, the traditional publishers, faced with the disruption of their business by the e-book, did what companies usually do in the face of such a challenge--fight back against disruption, not by becoming "more competitive" but rather any way they could, fair or unfair--and as is usually the case when established firms fight back from their position of advantage with every means at their disposal, they succeeded. The containment of the e-book (I suspect fewer people have e-readers now than did in 2015) meant the containment of the self-published to a significant degree, all as the Internet, the utility of which for low-cost promotional efforts was probably always far lower than cyber-utopians would have had the public believe, became a lot less conducive to such efforts. The web grew increasingly crowded, and search engines, social media and the rest increasingly "enshittified," fragmenting online life so that only big, well-funded operations had much chance of reaching a wider public--to the disadvantage of the self-published, including not only book writers but bloggers, who found their audiences disappearing.

Likely significant in that has been the evolution of the Internet in the broadband era, away from the written word, and toward audiovisual content, often taken in over small screens carried by people on the go. The vlog replaced the blog, while people gravitated away from Facebook to TikTok--as reading generally collapsed. Consider, for instance, how the young adult book boom went bust back in 2015--not coincidentally, about the time that a majority of young people had acquired smart phones. Consider, too, how even as the e-book was contained, the sale of the mass market paperback collapsed (which is why, very likely, you have noticed the disappearance of the paperback racks at the supermarkets, convenience stores and other retail outlets you frequent).

The result is that I suspect that, all other things being equal, getting any self-published work to an audience requires doing more for less than was the case a decade ago--maybe much more for much less.

Of course, all things are not equal. And therein lie the questions that you probably should ask when considering the endeavor, specifically "Is there anything that will give this book a chance in the face of all that?" And if the answer to that first question is "No," "Is what I have to get out there so important that it is worth my while to do it anyway?"

I admit that it can be very unsatisfying to ask a question and get in return another question. However, it is an honest answer--which it seems to me is far more than you are likely to find in most places on the web, where all we hear from are the elitists of traditional publishing who think the self-published author has no right to be in the same market as they are, and the services looking to bully and frighten the self-published author into handing over their money so that they profit whether or not the author has any chance of achieving their own goals.

Politics, Art, Hypocrisy

One of the great hypocrisies of art criticism is the sneer at art for being "political."

After all, in treating any subject--or refusing to treat that subject--one makes choices in which there is a politics, even if only at an implicit or unconscious level.

The result is that the condemnation is not of political art but of art the critic does not like--with, given the tendency toward Establishment politics of any critic likely to have a major platform, what they do not like usually being dissent.

Of course, they are free to not like something for taking a view different from their own. Everybody likes having their view of the world validated, and dislikes having it challenged, and one or the other may be part of their experience of an art work. The problem is that rather than forthrightly owning up to their prejudices they promulgate that double standard--because they are lacking in sufficient self-awareness to recognize it (as they will not if they are people of conventional outlook, as most people are), because they are too cowardly to admit to what they do recognize in the work and in themselves (specifically that they are not broad-minded enough to stand the disagreement), or simply because the admission would make more difficult the hatchet job they so delight in doing on a work that displeases them (or which, their feelings apart, may simply be required of them by peers, editors or anyone else).

As all this indicates, looking at a work with which one disagrees, appraising its good qualities and bad, and then discussing that openly and freely, is much, much more difficult than most realize--and frankly beyond many of those who strut about calling themselves professional critics.

Tom Clancy: Another View

Some time ago I came across Keith Pille's analysis of Tom Clancy's novels.

The approach he takes differs greatly from mine. Where in considering the sensibility of the books what was uppermost in my mind were the events of the day, and the larger tendency of America's political life (the ascent of "post-Vietnam" right-wing backlash, the neoliberal and neoconservative ascendancies that benefited so much from them, the "Second Cold War"), Pille's stress is more cultural and generational--examining Clancy's world view as that of a "boomer dad," for whom football and lawn-mowing are hugely important signifiers. Some of it works well, some less so. (Pille is entirely right when he remarks how wrong Clancy was about the effects that foreign attacks on U.S. soil would have on American political life, but has little to say about why Clancy was so wrong--and I think the cultural politics-minded use of the "boomer dad" world-view as a lens is simply less conducive to that.) Still, it is on the whole an interesting and worthwhile read for anyone up for a critical take on the books.

On Irony

Recently writing about "the mood of the 1990s" I found it entirely unironic that the matter of irony came up in my research and my writing--and so too the question of why so many were drawn to it then and after.

The plain and simple fact of the matter is that in being ironic toward something or someone people get to feel superior, without that superiority conferring on them any responsibility whatsoever.

They get to stand back and watch as other people go over a cliff and be very pleased with themselves about how much smarter they are in not going over cliffs, and how they have no obligation to care about those going over the cliff.

Naturally the self-absorbed, the vain, the selfish, the mean, and of course the snobbish, find it irresistible--and in the process reveal themselves for what they really are.

I suppose there is irony in that too, albeit one they would vehemently deny even if it did not go over their heads.

"Is This Blog Written by an AI?"

In answer to the question that is the title of this post, "No."

It may well be that others are handing over the job of generating "content" to glorified autocompletes, but I have not done so. What you see here is all mine, and that is how you can expect it to stay, not because I have anything against Artificial Intelligence (quite the contrary), but because the whole reason for this blog's existence is to present my thoughts, my ideas, my posts, not someone or something else's. And so will it remain for the foreseeable future.

"Are You a Robot?"

Recently revisiting the theorizing about the "information age" of Alvin Toffler and others I have mainly focused on what has not happened. Our economic life did not "dematerialize" through the radically intensive substitution of information for material inputs in the way he described, producing a different form of civilization combining sustainability with abundance. This is exemplified by how, where he thought the whole world would be running on renewables by 2025, well into 2024 we remain a long, long way away from any such future according to any conventional reckoning.

Rather than an information age civilization we have at best scraps of one--just as, contrary to the stupid technological hype ever-prevailing, we have only scraps of the imaginings of the past about how far technology was to have progressed by the early twenty-first century. Thus is it the case that, even though we are a long way from Blade Runner-style androids, we find ourselves constantly subjected to a dumbed-down Voight-Kampff test as we roam the web, asked "Are You a Robot?" everywhere we go, precisely because a far, far more basic type of "bot" has proven enough to cause a very great deal of confusion within the very modest and ever more decrepit version of an Internet this civilization has managed to build.

The End of Fan Fiction?

The question of whether "fan fiction" is in decline has at this stage of things been debated for decades.

It has long seemed to me quite logical that it should be in decline.

One reason is that fan fiction thrives on fans being able to sink their teeth into an exciting world--and for many, many years now we have seen the media, instead of offering new worlds, serve up the same old thing again and again, to diminishing returns. (Consider how at FanFiction.net Harry Potter is far and away the biggest fandom. When was the last time we had a really new thing become a hit on that scale?)

There has also been the extreme fragmentation of pop culture--which means that fewer and fewer people are likely to be looking at any one thing at the same time, of whom only a very limited percentage is likely to be sufficiently moved to write stories about it (which, again, works against any Harry Potter-like phenomenon ever occurring), with all that means for the proportions any fandom is likely to achieve.

Let us also acknowledge that the base of fan fiction has always been the young--and that almost two decades into the age of the smart phone the young are probably less likely to read and write recreationally than their predecessors, with all that implies for their writing fan fiction. (It probably matters that the Internet was just taking off when Harry Potter arrived, and that through many of the early years of the fandom, when people did experience the Internet, it was through a desktop--which left more time for books, for example, as compared with the life of someone young enough never to have known life without a smart phone in their hand.)

Apart from affecting the inclination to write fan fiction, all this also affects the inclination to read it--translating to the lack of an audience for those who do get something out there, which is no encouragement to keep writing and sharing such stories. Indeed, considering all this I find myself thinking of the reality that so many of those who write fan fiction are responding not to books, but to movies and TV shows. Even when employing the written word they have audiovisual media on their minds--and I suspect that for them a fan fiction story is a poor and distant second to what they would really like to produce, fan movies and shows of the kind only a few can make for lack of the financial and technical resources, which may well dampen their enthusiasm. In fact, it may well be that, just as the vlogger probably does their part in drawing attention away from the old-fashioned blog, the fan works that those who can produce them do generate are likely claiming the attention of the audience that would once have whiled away its time in the old fan fiction repositories.

"Paying Your Dues"

Where the right to a position or some other such good is concerned people often speak of having "paid their dues."

One might take this to mean their having earned the position through evident preparations for that position's tasks and demonstrations of the relevant competences gained thereby--as with an artisan who has completed apprenticeship and wander year, produced a "masterpiece," and thus shown their worthiness to stand with the other masters in a guild, with all the rights (and responsibilities) pertaining to that.

However, the demonstration of actual merit is not what the phrase "paying one's dues" calls to mind. Rather it makes one think of membership in an exclusive social club--though those who use the phrase are, of course, not usually looking to get into a social club per se, or paying money to do it. Still, that word "pay" seems relevant, calling to mind how people used to buy offices, and why they were prepared to do so, namely to make money out of that office, commonly by taking bribes.

As in that case, those who speak of having "paid their dues" have let others exploit them in the past so that they can be in the privileged position where they get to exploit others later in their turn.

It is an essentially nasty concept--the kind of nastiness quite natural to the mediocre conformist mind that believes in "playing the game."

When Should a Blogger Call it Quits?

As I have remarked in the past on this blog (and elsewhere), blogging seems to me an activity in decline as the way in which people use the Internet moves on--and arguably also as the Internet decays into a state we can hardly predict now. Those who have tried blogging, and not got the results they hoped for (for instance, with regard to building an audience online), have to know that the going is probably just going to get tougher as all of this continues.

Might it then be that the rational thing is to quit now?

I won't say that--or even pretend that I can make figuring out the answer easy. But I do think that it is possible to at least suggest a starting point for figuring it out with this question: Does what I am doing here have intrinsic value to others or myself? In other words, does what you are putting out there deserve to be out there for its own sake, even if in the end it is possible that the world will not notice? Are you, for example, saying something that others need to hear even if they are not hearing it--because it is that important, and others are not saying it? Absent that, do you find that it at least satisfies your need to speak--even if others do not seem to hear?

If a blog does that then I think one can justify continued plugging away even in the face of an indifferent world. If it does not, then it might be time to at least think about cutting back on the time and attention you put into your blog, if only for the sake of your own well-being.

Peer Review in the Age of Automated Scholarship

I have personally been on both sides of the peer review process, in more than one field. I thus know something of not just its virtues but its limitations and vulnerabilities firsthand. Altogether it seems to me rather a fragile thing, belonging within an earlier, slower-paced era in which the academic world was smaller and "clubbier." (Indeed, on more than one occasion reading my peer reviewer's comments I made guesses about who wrote the feedback--on one occasion having the guess confirmed as correct.)

In an era of chatbot-powered paper mills supplying a much more thoroughly global community of scholars, living by "publish or perish" rules and content to submit to an exploding number of increasingly fee-charging journals, that system seems too fragile to survive--as, indeed, absurdities break through its barriers to deluge us all in even worse nonsense than before. Much as elitists whine about the openness of the Internet (which I think a good thing on the whole, and of which we have unfortunately had less these many years), the fact remains that even at its worst what we had on that open Internet was at least humans putting up content, rather than very incomplete artificial intelligences slapping words together; and those who put together an academic journal adhered to some minimum standard, especially because they expected subscribers, not authors, to pay for that journal's operation out of finite resources. That meant some filtration--far from perfect, but not wholly unhelpful--and its loss tarnishes even the efforts of those journals where peer review remains effective.

I suspect the only real solution will mean effectively automating the examination of all this material--helping us weed out what is illegitimate--in step with the automation of its production. However, because the effectual editing of content seems a much higher-level skill than just churning stuff out, any such adaptation lies a long way away, leaving us making do as best we can with our old devices in the meantime.

The Crisis of Academic Publishing

These days we are hearing quite a bit about the turmoil within academic publishing--the explosion of dubious journals, the article processing fees (running into the thousands of dollars) scholars and scientists are now expected to pay for the privilege of presenting their work through journals old and new, the inundation of even the most reputable journals with "paper mill" and now artificial intelligence-generated content, much of it of risible quality, some of which makes it into print, on top of the abundance of problems the scholarly and scientific world already faces (like the ongoing crises of particular fields, such as medicine's "crisis of reproducibility" of research results).

To be frank, it leaves me that much more deeply disinclined to go back to submitting to academic journals, and that much more inclined to simply publish my working papers through the Social Science Research Network and leave it at that (especially given how having the official stamp of passage through a peer-reviewed publication now means less than it used to for those counting on being "led by Authority" to make up for their inability to judge a piece of scholarship on its own merits for themselves). Still, while this is enough for my purposes--indeed, very handy for my purposes (I no longer have to worry about items being too long, or learn they were "not quite right" for a particular journal's range of concerns after many months of waiting, etc.)--I know that it does not answer all needs by a long way, and I can only wonder how, or even if, the system will come to cope with the new reality.

Of the Term "NIMBY"

I have never cared for the use of the term "NIMBY" (an acronym for "Not In My Back Yard"), which is typically used as a sneer at those who object to some large piece of construction that might interfere with their lives. What it comes down to, after all (and this is the clearer when one considers the bad faith of the kind of people who like to attack others for not wanting things in their backyard), is sanctimonious contempt for people who think they have a right to a say in what goes on in their neighborhoods and communities, and for the idea that those who want to build something in a particular place may not unreasonably be expected to go partway in making concessions to the people into whose lives they mean to bring what may well be large changes.

All of this has very pointed political implications--and as is usually the case when people gloss over them in this way, not pleasant ones.

Of the Word "Pundit"

The word "pundit" is ultimately derived from the Sanskrit word "pandita" meaning "learned."

Today, however, in English the word is usually used in reference to the members of the media commentariat, particularly its more prominent members known to the mainstream.

Given the quality of the comment that we get from the mainstream media it seems fair to say that the word has come to denote the extreme opposite of its original meaning.

For nothing can be further from "learned" than those whom the conventional hail as "pundits" today.

Writers Write About Writing

It is notorious that fiction, not least that fiction produced by Hollywood, rarely depicts work in a convincing way, even when the depiction is at the level of mere reference, never mind a much more detail-intensive dramatization. Whether the person we see is a lawyer or police officer, a journalist or a scientist--a tinker, tailor, soldier, spy--we can usually count on their job being depicted in a risibly false manner, and it is an occasion for impressed comment when it is not.

But it has always seemed to me that their false depiction of the writing life has been particularly egregious--because where most writers know nothing of those other jobs, they should know about the one they actually do, else their name would not ordinarily be in the credits of the movie or show we are looking at. Indeed, because they are themselves writers one would think that, like most other people, they would be irritated by how movies and TV get their job wrong--with this carrying over to how they present writers of print fiction, given that so many of those who write for TV and film have at least some experience of that endeavor.

However, they instead flog the same stale and profoundly misleading clichés.

There is the extreme simple-mindedness of their handling of the creative process--reflected in the trite references to "writer's block."

There is the way they gloss over the sheer hell that is the effort to publish what one has written.

There is the absence of any reference to revision and editing, and the wretchedness to which they so often reduce writers.

And of course, there are the endless scenes of "successful" authors smugly signing copies of their book for adoring fans.

Getting it right should be all the easier because so often the character who is a writer is one not because the plot really needs them to be, but because writers rather lazily settle on that occupation for protagonists as simply "what they know"--which should allow them to easily eschew the clichés, maybe provide a reference to the reality here and there. Alas, all this has proven far, far beyond them.

Those Lives in Which Chance Plays No Part

In Lost Illusions Balzac remarks that "there are lives in which chance plays no part"--such that those "despair[ing] of a life become stale and unprofitable in the present" find no relief in "outlook for the future," having as they do "nothing to look for, nothing to expect from chance."

In his writing of one particular character the matter of chance comes up again and again, the specific word "chance" appearing at least three dozen times in the text. Certainly as discussed here, again and again chance means the "lucky break," the "big break," that makes for the high positions and glorious careers of the kind that have the conventional conformist idiots that most people with any sort of public platform seem to be taking mediocrities for superhumans of Renaissance Man versatility--even as people who might prove themselves actual geniuses go through their lives thought idiots and treated as worthless because they were among those in whose lives chance played no part.

As Balzac made clear here, chances are not at all equally distributed--with chance overwhelmingly enjoyed by the privileged, the connected, the kind of people for whom the activity of "networking" actually has meaning, because they are in a position to network with people who have the power to do something for them, and could be induced to use it, because they in turn can do something for them.

For the rest of the world, alas, "chance" is all too likely to simply mean "mischance"--to mean disaster striking, and their "becoming a statistic."

Considering their lot I find myself thinking of the self-help drivel about "stepping outside your comfort zone," and "being open to new experiences." As things already are most people are pretty uncomfortable--and unprotected from life's shocks and their terrors, rather than failing to be "open to new experience." What they want, for perfectly good reason, is some comfort, and some protection; some security and some control of the kind of which they have too little, a reality to which the outside-their-comfort-zone-stepping, open-to-experience advice-purveying privileged nitwits are profoundly oblivious.

Review: Harold Coyle's Trial by Fire

Like the others who became really Big Names in the techno-thriller field, Harold Coyle made his name with Cold War stories, starting with 1987's Team Yankee, which depicts a NATO-Soviet clash in Central Europe as seen by an American tank crew. However, the end of the Cold War compelled him to shift to other themes, with his first post-Cold War book, Trial by Fire, presenting a scenario based on U.S. relations with Mexico. Here, in the wake of a takeover of Mexico by a new military junta--the "Council of Thirteen," intent on saving the country from corruption, crime and poverty--a Mexican crime boss driven into flight and exile by the Council's crackdown (Hector Alaman) manufactures a series of violent border incidents he intends the U.S. to blame on the new Mexican regime, compelling the U.S. government to drive it from power and permitting him to resume his old position within the country.

As might be guessed from that scenario--and for that matter, from experience of post-Cold War techno-thriller writing generally--Coyle had to make a good deal more effort than before to lay down the scenario that leads to the fighting (in comparison with his borrowing John Hackett's scenario for the armored warfare in 1987's Team Yankee, for example, or his only slightly developed background to the Soviet invasion of Iran in 1988's Sword Point). As it happened, Coyle displayed some adroitness in both choosing his scenario and developing it. In envisioning a crisis of this kind with Mexico he set up a situation in which the U.S. may be overwhelmingly more powerful with regard to conventional military capability, such that it can destroy Mexico's armed forces with ease, but at the same time cannot get what it actually wants that way--security along its two thousand mile border with the vast and populous neighbor to its south, a task requiring levels of military manpower far, far beyond what the U.S. has at its disposal. (In one of the more memorable scenes Coyle's military officer protagonists explain the numbers to a stupid, sarcastic politician with little apparent ability or willingness to process the obvious.) Moreover, Coyle displays some nuance in depicting how the situation unfolds--not least in an alertness to the vulnerability of governments to being manipulated by parties, foreign and domestic, for their own ends; to how prejudice, media sensationalism and political hucksterism can easily combine with the demand of some to "do something," and the plain and simple happenstance that throws sparks into a tinderbox, to push policymakers who are supposed to "know better" into disastrous courses (one aspect of note being how the politics of border states like Texas color national policy in this area); and to how, as crises escalate and missions "creep," political objects become ambitious beyond anyone's actual ability to realize them, particularly when the intervening power wins a conventional battle with ease only to find itself trying to hold territory and impose its political will in an era of what Rupert Smith called "war amongst the people," and finding the task nowhere near so easy. (Indeed, much more than is the case with most techno-thrillers, the policymakers and "pundits" in Washington would have had a chance to learn something if they picked it up--to the extent that they have the faculty at all.)

Coyle's strengths here also extend to the depiction of the fighting itself, which has the kind of grit one is unlikely to see in the more high-tech stories of aviators in super-planes--or in the work of those writers of ground combat inclined to present perfectly competent super-people rather than real human beings. When America and Mexico do go to war American officers do not always prove wise, strong or noble in making their decisions, or dealing with their consequences, which can and do include subordinates losing their lives (in contrast with one techno-thriller writer whose prose on this level was rightly compared with that of a "Victorian boy's book"). At the same time Coyle treats the Mexican characters on the opposite side of the battle from those Americans with a level of respect far from standard in the genre, the military officers of the Council sincerely acting as patriots, and intelligent ones, who have good reason to bristle at how their neighbor to the north sees them. (Where this rather nationalistic genre tends to treat foreigners' criticisms of the U.S. and its treatment of them as sanctimoniousness and worse, when Colonel Alfredo Guajardo complains that American representatives to his country lecture, threaten and dictate rather than talk to them, we are getting his honest and not necessarily invalid appraisal of the relationship--while in the Council's making Guajardo its "face" in America, on the view that his being of visibly European rather than indigenous descent would be advantageous for public relations with the United States government and the American public, there is no sense that the implied charge of American racism is at all unfair.)

Of course, for all that the tale has its implausibilities--and its limitations. Even as Mexico's domestic troubles--not least, its poverty--loom large as a theme within the book, Coyle does not go into any great depth in discussing these, and his vision of the Council's attempt to redress the problems can seem a bit muddled. (The Council does not breathe a word about nationalizations and the renunciation of foreign debt the way a left-wing government might, while instead planning painful austerity measures the way a right-wing government might; but it also uses price controls and backs them up with drumhead justice in a manner not easily reconciled with the demands of the "Washington Consensus" to which it seems the Council means to accommodate the country.) There is, too, the way that Latin America and the Caribbean (including close, longtime U.S. allies such as Panama) line up behind Mexico against the U.S. in the crisis (a turn critical to even the limited extent to which Mexico is able to throw up a conventional defense against the U.S. intervention).

Meanwhile, getting away from the construction of the scenario that is on the whole one of the book's strengths, there is much that is less than ideal. As with many an author a few books on, an initially lean (or thin) style gives way to prolixity--and not always to good effect. To his advantage over a Clancy, Coyle, if hardly likely to offer a warts-and-all view of life in uniform, is still at least prepared to acknowledge its more human side, with its moments of silliness and pettiness--an officer's awkward fumbling after a dropped uniform clip, arguments among units about the order of precedence at a ceremonial parade, an officer's foul mood after he is bested by a colleague in a training exercise, the disrespect that personnel of different ranks and specialties often feel for each other. Still, Coyle is only so successful in mining the quotidian for interest (a feat few writers ever really pull off), the more so as there is so much of it this time around--extended considerably by the prominent subplot about the then-fashionable topic of whether women should be allowed to serve in the armed forces' combat units, which makes a major character the first woman assigned to command an infantry platoon as part of an experimental program, and which proves plodding and predictable. (Did anyone picking this up in the '90s ever doubt that Lieutenant Nancy Kozak would prove herself as a combat officer and show up the naysayers?) Indeed, the only real spark of human interest this side of the book had to offer was in a very minor character--a harassed trucker called back to National Guard duty whose time in the story is concluded within a mere forty pages of his first appearance. Meanwhile what we see outside of uniform--the depictions of the media's coverage of the crisis by way of the adventures of Dixon's lover, reporter Jan Fields, and the inevitable inside-the-Beltway stuff--falls as flat as these things usually do in books like this one. (Fields seems to operate in a very different media business than the one we know, where reporters are free to follow stories without regard for pressures or interference from commercially- and politically-minded higher-ups, with the sensationalization of the crisis that helps escalate it somehow having nothing to do with her, never confronting her with a single dilemma; while the D.C. goings-on have the same sanitized quality.)

All of this contributed to the book's feeling cluttered and overlong, in that way that gives the impression that the author was not "cutting to the chase" because there really was just not much "chase"--the political scenario, after all, still used as the basis not of a political thriller but a military techno-thriller, which ends up with few military techno-thrills. The 446 page book, in which not the Mexican Council but the criminal Alaman is the real villain, gives the heroes what is in military terms a very small-time opponent indeed by the genre's standards, while the action is also "small-time," consisting mainly of a few episodes of small unit-level violence on the border functioning mainly as dramatizations of Alaman's plot, and a few glimpses of the principal "set piece" of the novel, the "Battle of Monterey" following from U.S. forces' attempt to establish a security zone along the southern side of the Mexican border (the mortaring of a truck convoy, an anti-armor ambush against a tank battalion)--a Lilliputian affair next to the battles in Coyle's earlier Sword Point--with a couple of limited-scale air assaults bookending the violent action. Meanwhile, just as Jack Ryan quickly became too senior to get into the action very much, his "Coyleverse" counterpart, Army officer Scott Dixon, now a Lieutenant Colonel on a divisional staff, is a non-participant (and indeed, rather realistically, denied permission when personal motives impel him to try and play action hero near the end), denying the fighting some of what personal edge it might have had. The result is that, even if the book is a much more than usually intelligent treatment of its principal theme, its particular dramas lent themselves less well to the constraints and demands of the techno-thriller form than those of a few years earlier--and indeed, I can imagine that a more satisfying novel might have come of it had there not been an attempt to accommodate it to those constraints.
