Saturday, April 20, 2024

The Irony of Jaws 19

The first act of Back to the Future, Part II was set in the then-futuristic-seeming year of 2015. Some aspects of that future (mostly the attractive ones) have come to seem fairly dated--like the idea of flying cars running on garbage, and hoverboards, and weather reports accurate down to the minute. Others have proven more perceptive--like how '80s nostalgia would be big in the '10s, down to a fondness on the part of some for retro gaming.

Others were a mix of both. In its depiction of "Jaws 19" playing at theaters the film was entirely right about how shameless Hollywood would become about indulging its addiction to franchises. But it was, ironically, wrong about the Jaws franchise being the epitome of that. After Jaws 4 flopped back in the summer of 1987 (not the only fourquel to do so that box office season) Hollywood pretty much left Jaws alone, putting an end to one of the franchises that had launched it along its current, high concept-dominated path, even as it stuck like glue to just about every other franchise of those same years, so that the Star Wars and Superman and Star Trek and Alien movies just kept on coming.

"Realism" in a Misanthropic Era

There is some disagreement about exactly when literary realism as we know it began, and what it includes, but it is safe to say that it at least refers to a modern literary and broader artistic movement premised on a scientific view of life that bases its writing on a rigorous and copious usage of observable fact, rather than writing from supposedly transcendent truths, romanticism, myth, or artificial conventions.

These days, however, the term has taken on a particular baggage, with, as is often the case, film critic David Walsh worth citing. Writing of one film that had been described as "realistic" he observed that it had become common to simply use the term to mean "presenting humanity in the dimmest possible light." Thus "[f]ilm writers and directors seek to outdo each other . . . in their depictions of people's depravity and sadism"--while displaying an "almost grateful acceptance of the filthiness." One can add to what Walsh says that they delight in the display, in rubbing what they present in the viewer's face with a "Welcome to the real world!" swagger as they endeavor to destroy any hope the reader, the viewer, may have of there ever being anything better, not least by implicating them in the evil they see. (Presenting, for example, the horrors of war, and what some think must be done to prepare for them, they say "You want me on that wall." Presenting horrific forms of economic exploitation they say "You are a beneficiary of this, too, and so forfeit your right to say anything about it.")

There are certain political assumptions there--very dark assumptions about human beings and society and the universe in which they live (which, past a certain point, may actually be quite contrary to the scientific, rationalist character of realism), and one might suggest, at least grounds for accusation of political manipulation (with such manipulation, again, at odds with any genuine realism).

In an era in which a modicum of comprehension of political ideology (and political rhetoric) could be taken for granted among mainstream cultural commentators, and a meaningful range of views existed among them, those assumptions would be recognized and acknowledged and challenged, certainly by critics not necessarily sharing those assumptions. (Just read Roger Ebert's old reviews and you will see what we used to get in even a publication like the Chicago Sun-Times.) However, as we are reminded again and again, this is not such an era, but the extreme opposite of one--as Walsh makes clear in the rest of a review with an importance extending far beyond the appraisal of one movie.

The Distastefulness of Slang

It seems a thing to be expected that, as life goes on, humans face new realities, and situations that make them reckon with old realities they may not have noticed before, or look at those old realities in new ways--and that they will come up with terms to go with those new or new-old realities, situations, states of being, etc. Some of these terms may be coinages by intellectuals engaged in scientific, academic, or philosophical inquiry, but many are likely to emerge from the bottom up, in the colloquial usages of persons who have no formal standing of that kind--in a word, slang.

Still, there is something very off-putting to many about a great deal of slang, with this arguably extending beyond the distaste of the old and stuffy for the new, or the annoyance of those who are not "with it" at constantly being bewildered by such neologisms.

Why is that?

One explanation may be that of the colloquialisms that people come up with, and even those colloquialisms that come into wide use, a very high proportion are unsatisfactory for whatever reason, perhaps not actually conveying very much, or becoming irritating in a hurry. Certainly this would seem to account for how much slang proves ephemeral, the usage seemingly everywhere for a while and then disappearing such that it afterward only appears as a retro or ironic phenomenon. (I certainly don't miss the '90s tendency to make a false statement and then add "NOT!" at the end.)

Another I can think of would have to do with what many find to be the repellent quality of slang in our particular day. One possibility is that slang reflects an era, and that an illiterate, intellectually stultified, mean, crass, crude era of backsliding and backwardness and barbarism, of sadism and stupidity and self-satisfied poshlost regarding the whole thing--such as naturally goes with the worship of personal power and its brutal exercise so emblematic of our time--will produce slang to match. This is reflected in, among much else, the extreme disrespect which has become our default mode of dealing with each other, with every use of "Whatever!" as a dismissive one-word sentence reminding us of it, and making those who have any intelligence or sensitivity left in them shrink in disgust. Meanwhile those of us who can understand the meaning of such things may remember what Captain Kirk had to say about the use of language in 1986--since which year things have gotten far, far worse for those struggling for cultured speech.

'90s Nostalgia: A Personal Note

I remember living through the '90s and finding them appalling--an era of backwardness and stupidity as the worst of the "Greed is good" '80s just went on exploding, and hope of anything better seemed at a historic low point, all as far too many opted for the stupid self-satisfaction of ironic poses in the face of it all. (I have said in the past that the irony at least indicates an acknowledgment of something wrong, as against the sheer obliviousness we have seen since, but it was not the healthiest reaction--and did its part to make things worse.)

But hey, there were The Simpsons, and Seinfeld, and any number of other such pleasures--and like many others I have occasion to remember them and laugh all these years later, a good deal more than I am able to do at what pop culture offers me these days.

Nostalgia: A Few Words

There is a well-known tendency on the part of critical commentators to criticize feelings of nostalgia. They point out that the past was not a kind place to most, and attraction to that past can seem to them to elide or even validate that unkindness--with misanthropes of the postmodernist variety perhaps especially likely to point out such things.

Still, consider the common objects of nostalgia--typically bits of pop culture. Films, shows, music and the like. One can see in them not a forgetfulness of the badness of the past, but fondness for those things that helped get us through the bad times, remembering which can be a reminder that we will get through the bad times in which we are living now.

There seems to me something life-affirming in that.

Things Composition Instructors Do That Annoy Students: Shadow Grades

In teaching composition, certainly where I was trained and first did the job, it was common procedure to employ "shadow grades" on certain writing assignments. What this meant was that students would write papers, and the instructors would evaluate them and provide their feedback, with the "shadow grade" consisting of the grade they had earned--with the understanding that revision was assigned, and the grade improvable. That is to say, here was their grade; and in the written notes, here was what they had to do to turn whatever they got into the grade they actually wanted.

All of this seems reasonable enough, easy enough to understand, and even comfortable for the student, who is given a chance to fix what may have needed fixing. But it did not always seem that way to the student.

To be quite honest, some did not seem to fully grasp the system. But even where they did, what was real to them was the grade they had now, not the hypothetical better grade that was by no means guaranteed--all the more in that many found, or said they found, anything but an "A" a rude shock. ("Every English teacher I've ever had has always given me an 'A'!" I heard many a time--while I can say that I did see weeping right there in class, and even tantrums, over Cs, in at least one case on the part of a student who was clearly way, way, way past eighteen and any claim to forgiveness for callowness on the score of age.) And of course, if the gap between the grade they hoped for and the grade they got was wide, and they had enough experience to know just how grinding revision can be, they may have felt despair--as indeed, I found that students commonly declined to do anything more than make simple corrections to grammar and spelling. (For instance, if the instructor tells the student that what they have is not a unified essay but a collection of disconnected paragraphs, and that they will have to establish a clear thesis and either connect or discard the material that does not fit in with it, they are unlikely to go about that to any great degree.)

Do I pretend to have a tidy fix for the problem I am raising that instructors in such classes can hope to implement with the blessing of the higher-ups? Of course not. Having to revise--and even rewrite--goes with the territory. And students do want to know what grade they are likely to get in the end. It is, after all, what most of them care about--because even if their elders have the luxury of smarmily telling them "It's not what grade you get, it's what you learn!" no one will much care "what they learned" if their grade point average slips below the minimum threshold required by their scholarship, or if they fail the class and are required to take it over again, perhaps paying a stiff financial penalty on top of the setback involved in taking the class the second time around. Indeed, a poor grade may mean enough hardship that they have to drop out of school--in which case, there goes their chance to learn something.

Admitting that, I do see, if not a solution, then at least a way in which we could be doing better: specifically, that it would not all be so stressful were students not sent in as underprepared for college academically and in other ways as they have tended to be, a problem exacerbated by the reality of the credentialing crisis that people in authority seem happy to go on feeding (and giving economists Nobel Prizes for justifying); were a college education not so expensive and thus so precarious for so many; and perhaps, were our grading systems less punitive.

Alas, such things lie far, far outside the purview of anyone likely to be teaching an actual composition course.

School as Hoop-Jumping, and the Failure of Education

I have from time to time discussed here my experience of college teaching--and where college teaching often goes wrong, certainly in those areas where I am most familiar with it (the teaching of college English). Where all this is concerned one can point to failures in the design of curricula and of methods (as with composition teachers who mistake themselves for Zen Masters and run their classes in line with their delusions).

However, that is far from the only source of the problem, with what the student brings with them often an issue as well. Most of the criticism we hear about this has to do with their K-12 preparation, which is a matter not just of what schools fail to instill in their students, but of what they do instill in them instead--a cynicism toward education.

It has long been understood that, besides any narrowly academic content, our schools (designed to prepare workers for employment in nineteenth century mills; education for two centuries ago, today!) go to great lengths to inculcate habits of punctuality, self-restraint, tolerance of boredom, and above all, deference to authority figures. To many it may seem that they do not succeed in all this so well as they ought. However, they succeed in this to a rather greater degree than they do in inculcating a respect for knowledge and an understanding that students may get something useful to them here, such that the so-called "hidden curriculum" of habit inculcation can seem the real curriculum, the only curriculum, as it is the rest of what they are supposed to learn that gets obscured--everything they do really seeming to be about placating the authority figures messing with their lives.

At higher levels of education, where more is demanded of the student intellectually--where unless they can take a genuine interest and make a genuine effort nothing much can be expected--this gets to be a real problem, perhaps especially in classes in the humanities that get so little respect. It seems to me that it should be fairly obvious that, for instance, a composition class intended to aid a student in developing the skills of reading closely, thinking critically, and communicating clearly, can have value to everyone--but no matter what anyone tells those ruined for such training by prior experience, they believe that it is just more hoop-jumping, to be treated accordingly.

What Ever Happened to the Debate About Euthanasia?

The recent "assisted suicide" of former Dutch Prime Minister Dries van Agt and his wife Eugenie put euthanasia back into the news--for a little while, at least.

Back in the 1990s, and their overhang into the early 2000s, the topic of euthanasia and all associated with it--like the philosophical, moral and legal question of whether human beings have a "right to die"--seems to have been much more commonly discussed than in the years since. Indeed, it seemed a "hot button" culture war topic comparable to abortion, sufficiently so that South Park creators Trey Parker and Matt Stone devoted an episode of their short-lived sitcom That's My Bush! to it.

Little of it has been heard since--but I suspect it is not because the matter was somehow "settled" in some consensus. Quite the contrary. The evidence indicates that a majority, perhaps a very significant majority, of Americans support the right to die, but taking the fifty states as a whole very little has changed legally here.

Why a gap so vast, and at the same time so unremarked?

One possibility is that the issue was crowded out by a crisis-packed century, and the explosion of conflicts that were previously suppressed rather than settled. (Remember how a certain sort of commentator was prone to claim that the country had become "post-racial" not so many years ago? They don't say that now.)

In that it may have helped that the existing situation aligned with the preferences of the right, especially given its success in driving the agenda in American politics in recent decades. There was no equivalent to Roe v. Wade here for them to protest against--and so, as far as the media was concerned, no argument, making it that much easier for the issue to fall by the wayside.

The "'80s Jerk," Again

The last time that I raised the issue of the "'80s jerk" my explanation of the phenomenon was that in that earlier era of film-making we still had lots of grounded, smaller-scale stories, where the stakes were personal rather than galactic and petty meanness or ambition could be significant plot points in movies viewed by large numbers of people.

Yet I can also see a possibility of there being more to the matter than that--reflecting the rightward turn of politics and popular culture in that time. One is a strengthening of the tendency (admittedly, probably always stronger in America than in other places in the Western world) to think less in terms of society and the way it is structured and more in terms of individuals, and to see the obstacle to the hero's realization of their goals as a matter of entirely individual villains. ("It's not the apple barrel that's rotten, just a few bad apples," the conventional always say. Like Sigourney Weaver being all that's really wrong with Wall Street, so that Carly Simon sings "Let the River Run" over the end credits to Working Girl.)

Another may be that it was a vehicle for the selective bashing of authority--the kind of authority that those skewing rightward did not much like, who tend to be the equal and opposite of the ultraconformist "maverick" the conventional so love. (Thus in libertarian Ivan Reitman's Ghostbusters we see the Environmental Protection Agency presented in a very bad light indeed--in contrast with the protagonists, who departed academia for private entrepreneurship and save the day.)

So far as I can tell the essential outlook endures, even as the kinds of stories that big movies tend to tell have changed.

The Truth About the Experts and the Public

One of the eternal laments of the centrist ideologue and their media element is that the public does not defer completely to established expertise, with those among the public who fail to do so painted as a pack of anti-intellectual Know-Nothings. The idea that there could be expertise outside the Establishment; that the Establishment's expertise may be significantly flawed; that those non-Establishment experts, and even others, may have a sound, strong critique of the Establishment; is something they do not, cannot, concede. In particular they can never admit that the professionalization of knowledge has its dangers, especially where the professionals are especially close to and especially watched by the powerful, such that rather than devoting themselves to fact, truth and public service they just tell the powerful what they want said--because the centrist is famously evasive on the question of power, and anyway conservative enough that they are as respectful toward those on top as they are disrespectful toward those on the bottom, such that even if they did acknowledge such a thing they would not see it as a problem.

Still less does the centrist consider the significance of their own lapses in respect for the experts--as in how they disregarded the consensus among climate scientists that anthropogenic climate change is not only real but a problem warranting urgent redress, so as to accommodate the denialists. No, they are sure, the disrespect that bothers them must be a matter of the grubby Know-Nothings being too arrogant and cynical (never mind where that comes from, the centrist is not big on reasoning things out to their causes) to show their upstanding betters the proper respect.

Friday, April 19, 2024

Of Intergenerational Conflict and Military Recruitment

The resurgence of great power conflict over this past decade has brought with it a certain amount of rethinking of the ways in which the Western world has organized its armed forces. The 1990s saw them shift away from emphasizing large-scale conventional conflict toward the execution of missions that were smaller in scale but placed a higher premium on rapid reaction--a reorientation which seemed all the safer because of a very high confidence in the capacity of high technology to replace "boots on the ground," in what was essentially a translation of "information age"-type thinking into a "Revolution in Military Affairs." Since then some have been quicker about it than others--but the mainstream of the dialogue seems to have reoriented itself back toward the prospect of older-style, larger-scale conflict.

Amid all that there has naturally been a good deal of talk about expanded military forces, and how they will come up with the required extra personnel, with proponents of such expansion taking yet another chance to sneer at much-maligned "Generation Z" and its supposed lack of the virtues of earlier generations (not least, "the Greatest"). To his, and the journal's, credit, Brian McAllister Linn published a historically grounded piece in the Autumn 2023 Parameters debunking the myth-making, reminding the professional youth-bashers that "since the beginning of the twentieth century the peacetime volunteer Army" (before which the peacetime volunteer Army was a relatively small entity with limited needs) "has been in a crisis more often than not" with respect to its ability to attract recruits, and that "[t]empting as it is to blame 'wokeness,' slacker mentality, Generation Z, or some other nebulous reason," their predecessors, as much as they, were prone to "join the service for individual reasons, most based on expectations of personal benefit" rather than greater possession of some "warrior" or "citizen-soldier" quality. Alas, lucidly written as Linn's article is, relatively few of the commentariat these days would even seem to possess the required reading level to understand the text at the most elementary level--let alone use it to enlighten a dialogue as depressingly mired as ever in cultural and intergenerational warfare.

Making Sense of the Twenty-First Century: A Few Thoughts

While my academic writing on world affairs initially concentrated on security studies, this early on led me away from the field's traditional subject matter toward the economics of the matter--and increasingly to economic questions as such.

It was simply where the action seemed to be (more obviously so in a time in which great power conflict was constrained and declining in comparison with the prior century).

And so I went on writing about the decline of growth rates since the post-war boom and especially the 2007 financial crisis, deindustrialization, financialization, the methods for measuring such developments, the associated economic models and ideologies, and even the social consequences of these troubles, from the erosion of the American middle class to the social withdrawal of the young.

Of course, the events of recent years have made it harder to overlook those more traditional emphases of international relations scholarship--with the result that I ended up writing about such topics as the shifting naval balance between the U.S. on one side and Russia and China on the other, the comparative military-industrial capacities of the U.S. and Russia, the contraction in size of the major West European military establishments, and the changing defense postures of Germany and Japan--and of course, the waning of what has commonly been called unipolarity.

In spite of that, one should not mistake this for "security" replacing "economics." The unhappy trend of our economic life is, of course, strongly connected with the increased conflict we have seen on the international scene--though as yet few seem to have troubled themselves much about the linkages, not least because the "experts" so hastened to relegate the crisis of 2007 to the past.

The historian Adam Tooze, indeed, admitted to having shared the conventional wisdom when taking up the subject--only to, when publishing his book on that crisis and its aftermath a decade later, confess that he had since learned better. As Tooze came to realize through his researches the Great Recession was not past in 2018, and is not past now--while the shock of the pandemic, which remains ongoing, is not past either, one crisis merely piled atop another.

Alas, others not in a position to pursue scholarship as Tooze does, or even properly equipped to cope with his very useful but also long, dense book (demanding more than the "eighth grade and under" reading level of even the college-educated), were aware of the fact all along--through the day-to-day experience of their own lives, which for many became decidedly more bitter than before.

The Impossible Writing Tasks the Suits Demand of Writers, and the James Bond Series

Recently polishing and putting up a number of book reviews of John Gardner's James Bond novels, and considering the hugely uneven body of work that quite talented author unmistakably contributed to that franchise, I was reminded of how, by the time he was enlisted to work on the series, continuing the James Bond novels was pretty much one of those impossible tasks that writers are only enjoined to do because the people who own the franchise see more money in it. By the 1980s Ian Fleming's creation was so much out of his time that "more Ian Fleming" was unsalable, while at the same time a more contemporary version of James Bond would have been so unrecognizable that it was pointless to call him James Bond. Reflecting all this Gardner, after (to the extent that his irony toward the character allowed it) going from one extreme to the other in his first two books (a thoroughly updated Bond in Licence Renewed, a thoroughly backward-looking, nostalgic adventure in the next book, For Special Services), mostly offered compromises that did not please many. Unsurprisingly, while Gardner's Bond novels were bestsellers through the 1980s (every one of them from Licence Renewed to Win, Lose or Die making that gold standard of such lists in America, the New York Times hardcover fiction list), they left very little lasting impression--certainly to go by that handy index of enduring readership, Goodreads. Customer ratings and reviews are very few for these bestsellers, suggesting that they have not been looked at much in a long time--which in light of what I have seen of the books, and of the franchise recently, bespeaks these books being not forgotten gems, but rather the Bond franchise soldiering on through decades of declining public interest simply because there was just enough money in it for that.
Alas, as the audience's response to No Time to Die showed (and as the slowness of Everything-Or-Nothing to get work underway on Bond 26 has also showed), the days of that are likely coming to an end.

Selling Authors, Not Books

As Balzac's vulgarian Dauriat made clear, the commercial publisher is apt to be more interested in selling the famous name of an author--in selling an author--than in selling an author's writing. One may find this epitomized in bestseller lists ever crowded with celebrity memoirs that, despite being ghostwritten with little regard for the truth on behalf of the illiterates whose names are the object of the traffic, still have nothing to offer an intelligent reader.

Indeed, people have come to expect such selling so much that even those who should know better seem to be taken in by all the nonsense, with the blow-up over a certain gratuitously and superciliously insulting profile of Brandon Sanderson that the editors of Wired magazine bizarrely decided to publish a case in point. Brandon Sanderson was, in the view of the interviewer in question, interesting only as a writer of books that a significant fan base enjoyed reading.

There is nothing wrong with that. In fact, that is all anyone can--or should--ask of a major commercial author. But the interviewer, who did not seem to understand this, started another of those Internet tempests-in-a-teacup that are all people can talk about for hours, but which they have completely forgotten a month, even a week, later.

It is a sad, sad testament not only to what book-selling had already become in Balzac's day, and how it has only gone on getting more disgusting since, but to the way in which the "Fourth Estate" so utterly assimilates and promotes the logic of those whom it should be challenging.

Josiah Bounderby and the Aspirational Society

I have in the past remarked how, while Charles Dickens' Ebenezer Scrooge became so familiar a figure culturally that "Scrooge" is a byword for selfish callousness, Josiah Bounderby from Hard Times never got anywhere near so much attention--and indeed references to him are so scarce that the only recent ones I found were in an Indian Express blog post discussing one specimen of the type, and in the comparison a South Korean judge drew between the Samsung executives in their recent high-profile trial and Dickens' creation.

It seems to me that this is because the cult of the "self-made man," if probably universal today (the Indian item specifically attacked it), is particularly strong in America--and, however often discredited by those sociologists who actually attend to the facts, challenging it remains a cultural taboo, so much so that few dare do anything that could be construed as doing so.

Thinking about that in the past I have tended to focus on the ways in which this taboo is used to defend the validity of extreme differences in wealth--mainly in terms of how it helps defend claims to great wealth on the grounds that such wealth is the product of great contributions to society. Factoring into this is what this is supposed to mean to the rest of the public: not only that the extreme difference in outcome is valid, but that it is one way in which the rich and their house intellectuals deflect calls for egalitarianism, telling "the poors" that "You can do it too!"

Admitting to the reality of Josiah Bounderby would be to admit "No, you very likely can't, and certainly shouldn't pin your hopes on it"--and make it a good deal harder to avoid engaging with all to which such an outlook can lead.
