Monday, April 4, 2022

Farewell, Zombie Apocalypse?

It seems to me that in twenty-first century popular culture certain themes have been remarkable for their staying power. Novels featuring Dan Brownish religious-historical mysteries and the adventures of youths rebelling against the rulers of dystopian future societies, forensics-themed police procedurals in network prime time, Jason Bourne-ish spy thrillers and Bucket Brigade/Frat Pack comedies at the movies--these burned very brightly indeed for a time, and then flamed out, not leaving all that much of a legacy.

Others have proven much more enduring, some in just one medium (as with reality TV, regrettably), and others across the media spectrum--with the outstanding examples of the latter being superheroes and zombies, the zombies often bound up with a broader societal collapse scenario.

Recently I have found myself writing about that popularity--and particularly attentive to the way in which references to zombies have become standard in economic commentary. Leftist writers remark on a "zombie capitalism" or "zombie culture" in the neoliberal age, while Establishment analysts have increasingly come to use the term "zombie firm" (you can actually find the term used right here on the web site of the Federal Reserve itself!) in regard to businesses that have a Blanche DuBois-like dependence on the kindness of creditors--a dependence they can much more easily indulge in this age of ultra-loose monetary policy and ever-more hyperabundant credit (and the old adage that if you owe the bank a million dollars, it's the bank's problem).

Considering the events of the twenty-first century--and how science fiction has responded to them--I find myself thinking of a factor that would seem to merit attention, namely the way in which a "zombie apocalypse" so easily gave audiences a post-apocalyptic scenario without referencing the real-world problems threatening civilization. Rather than having to bring, for example, nuclear war (which never went away, though people seemed to forget about it until recently), or climate change, into the matter, they had this other basis for imagining a survivalist scenario in which they got to slaughter lots and lots and lots of dehumanized Others. Not everyone played things that way, of course. (The Night of the Living Dead films, at least, started off with a left-leaning politics, and I think one can say that of the Resident Evil movies, too. Certainly Max Brooks had something to say about the actual world in World War Z, even as he opted not for a tale of survivalism but one of "Greatest Generation"-type collective effort.) But it certainly seems to me that this was part of the appeal--a purer "post-apocalyptic porn" as Birdman famously put it, unmarred by any sort of social content, for which there seemed more room in an age in which so many brushed off big dangers like war and ecological crisis.

But that may not be so easy in the coming years. As Michael Mann has observed, climate denial now sells so much less well that the "anti-climate" crowd is changing tactics (emphasizing dealing with climate change as an individual issue rather than a systemic one, and indeed encouraging a defeatist attitude); in the wake of the conflict in Eastern Europe even the stupidest observer of the international scene finds it much harder to pretend that the danger of nuclear war is not a serious threat to humanity; and then there is that COVID-19 thing that somehow just keeps going and going and going. I see no reason to expect us to become less cognizant of those things--let alone find good reason to be less concerned about them--but quite the contrary. And amid all that I suspect zombie apocalypses will lose some of their appeal to audiences.

By contrast superheroes will probably keep on selling tickets at the box office through it all . . . or if they don't, not on account of those same factors.

Saturday, April 2, 2022

The Return of Nuclear War to the Public's Consciousness?

Back in the 1950s and 1960s we saw Hollywood take older science fiction works and write the theme of nuclear war into them where it had not been present before--so pressing a problem of the day did the prospect of such war seem. Thus where Pierre Boulle's original Planet of the Apes contained nothing about such a calamity, the end of the 1968 movie version gave us its most famous scene, and indeed one of the most iconic scenes in film history, in Charlton Heston's character coming to the pained realization, in the shadow of a Statue of Liberty reduced to a colossal wreck out of Shelley, that nuclear war had turned the Earth into the "planet of the apes"--while the sequel developed the theme even more shockingly with its nuclear bomb-worshipping telepaths, and its exceedingly bleak and ironic ending.

So did it also go with The Day the Earth Stood Still, an adaptation of Harry Bates' "Farewell to the Master"--the original story likewise containing nothing about nuclear war. But the theme was at the center of the film version, where the emissary from the stars came to warn the Earth that it was on a course for self-destruction, and that were it to persist in it, it would not be allowed to endanger other worlds--with the others of the galaxy prepared to stop it by force if need be.

Of course, since the end of the Cold War the exact opposite tendency has prevailed. All too characteristically Hollywood has, rather than going back to the source material and trying to do something new with it, generally remade the adaptations, while eliding the nuclear war theme from the new versions. Thus the two twenty-first century remakes of Planet of the Apes (2001, 2011) have been Luddite, Frankenstein complex-type stories about biotechnology-gone-wrong. Meanwhile the new The Day the Earth Stood Still (2008) switched its emissary's concern from the destructiveness of human nationalism and militarism in the nuclear age to humanity's conduct toward its environment. The theme of nuclear war, one supposes, simply seemed passé to them, or at least, less pressing than other dangers.

The judgment may not have seemed unreasonable at the time. (The biotech plots of the Planet of the Apes movies did not impress me, but it certainly has seemed at times that the ecological crisis--if less dramatic, then in its inexorable progress, and in the extreme hostility of business and government to its redress--was more likely to get us than the Bomb.) But anyone who thought that the danger of nuclear war was somehow past was plainly ignorant. After all, Cold War or no, the nuclear arsenals of the superpowers remained--reduced, but still vast, with the creakier nature of the arsenal possessed by a fragmented and impoverished former Soviet Union elevating the by no means novel risk of a major nuclear war starting as a result of a purely technical accident. This was all the more the case given that the international system and its potential for conflict remained, with the demise of Communism as an international political force far from automatically translating to the "world peace" of a Miss Congeniality.
Indeed, if major states' coming to blows became less likely, the period nonetheless saw one crisis after another--the 1995 Norwegian rocket incident; the 1996 Third Taiwan Strait Crisis; the open testing of nuclear weapons by India and Pakistan in 1998 and the fourth Indo-Pakistani War in 1999; and perhaps most incredibly, the stand-off between NATO and Russia over the former's intervention in the Balkans, which saw General Wesley Clark order future pop star James Blunt to destroy the Russian force and (General) Michael Jackson overrule Clark with the words "I'm not going to start World War Three for you" (just in case anyone didn't get what was at stake). Blunt has since publicly assured the world that even had Jackson not told him to stand down he would himself have refused to obey the order--such that, yes, one can now find innumerable headlines, online comments and the like screaming things along the lines of JAMES BLUNT SAVED THE WORLD FROM WORLD WAR III!--and even MICHAEL JACKSON AND JAMES BLUNT SAVED THE WORLD!

Still, the fault for that ignorance lies first and foremost with the press, which tended to treat such events lightly, distantly, belatedly, as remote curiosities less important than the latest piece of tabloid idiocy, with the doings of an Amy Fisher, a Tonya Harding, an O.J. Simpson infinitely worthier of the public's attention (Blunt wasn't famous yet, and that part of the story wouldn't come out until 2010 anyway)--while even those who did not go in for this stupidity at least had some room in which to think that those crises were the last aftershocks of the twentieth century's conflicts. (For my part I thought that the danger of major war, while not so small as some seemed to think it was, and certainly not incapable of resurging, was at least in the short term on the wane, and that even if the economic and ecological stresses could reverse that, that may have been some way off, and, I then dared hope, avoidable.) Amid all that it seemed that the public, and even its so-called opinion-leaders, had forgotten that nuclear war was something to be afraid of--indeed, perhaps even come to fear it less than a "zombie apocalypse."

Recent years have put the issue right back on the agenda. Thus far the public has been slow to remember what it once knew and feared, not at all helped by the way in which governments and the press continue to act as if they have forgotten the harsh realities of the nuclear age (as perhaps, indeed, they have, the people in charge never the geniuses their sycophants would have us believe they are). But it seems unlikely that this can remain the case for long, with the Russo-Ukrainian War dragging into its second month--and unlikely that it will not mean some kind of political or cultural response.

I hesitate to make guesses about what shape it will take. We could see a revival of the anti-nuclear movement, the anti-war movement, calls for world government, and perhaps even a broader renewal of radical politics. Yet it could also be that the reaction will deepen the pessimistic, survivalist impulses already ever more in the air, while translating to more nationalistic, more militarized societies (with the second tendency, the more consistent with the trend of a deeply disappointing twenty-first century, so far prevailing in the entirely predictable press coverage, and in the actions of governments, as austerity in every other area goes along with profligacy in this one). Much will depend on how long this war goes on, and the course it takes, and, one supposes, the way in which all this interacts with the constellation of other problems the world faces (from inflation in the West to food security in Africa and the Middle East). But it still seems bound to mean something--enough so that I suspect intelligent filmmakers remaking a science fiction classic of yesteryear will no longer look at the theme of nuclear war and dismiss it as passé--and that when they shrink from it they will do so because they find it too relevant rather than insufficiently so.

Thursday, March 31, 2022

Notes on Fyodor Dostoyevsky's The Karamazov Brothers and Theodore Dreiser's An American Tragedy

The names of Fyodor Dostoyevsky and Theodore Dreiser do not seem to be spoken together very often, and it is easy to understand why, given the immense differences of period, country, political and social perspective, and prose style--Dostoyevsky a nineteenth century Russian who, for most of his career, was Christian (specifically, Orthodox Christian) and Slavophile and inclined to a particular blend of romantic realism, Dreiser a twentieth century American of leftist, eventually Communist, politics who as a storyteller favored Zolaesque naturalism. Still, I often find myself comparing what seem to be generally accounted their greatest works, The Karamazov Brothers (1880) and An American Tragedy (1925).

In those two books those two authors each take a murder case in a small provincial town and ultimately produce from it a thousand-page epic about everything. I can think of nothing like them since, with that pop cultural supernova of the '90s, John Berendt's "nonfiction novel" Midnight in the Garden of Good and Evil, negatively demonstrating the magnitude of those other authors' achievement. Characteristic of a decade that reveled in "quirkiness" for its own sake (we see it in shows like Northern Exposure and Picket Fences, in the indie films of the time), Berendt's book is in the end small-time, shallow, unilluminating (in a word, postmodern) stuff entirely unworthy of its grandiose title.

By contrast "Midnight in the Garden of Good and Evil" would fit as the title of Dostoyevsky's book perfectly, his Karamazov Brothers seeing in that one provincial murder the struggle over the future of Russia, humanity and even the universe. (Don't forget--Dostoyevsky was personally acquainted with, and apparently influenced by, Nikolai Fedorov, and yes, it mattered in this book.)

But, as one might guess, it would not fit Dreiser's, for whom gardens of good and evil would not really be a relevant or helpful concept--a reminder of the compelling differences as well as the similarities. The social science-minded, naturalistic perspective of Dreiser had him bringing the entirety of America's social reality (class and work and economics and prejudice and sexual attitudes and the urban-rural divide, the realities and the illusions, everything) into the murder at the heart of An American Tragedy--his book about everything reflecting a different understanding of what everything happened to consist of.

However, the differences hardly end in that distinction between the mystical religious author and the materialist socialist, with one that I often find myself thinking about being the cast of characters, and in particular the position of the central figure in relation to the rest. In The Karamazov Brothers the tragedy is distinctly a family tragedy, where four brothers (of whom Alyosha is presented as the protagonist) whose lives are closely interconnected with those of each other and their father are all implicated in the murder of the father in some way, if only through inaction, or only in thought. By contrast Dreiser's Clyde Griffiths is alone, adrift, throughout the story--alienated from father and mother and siblings in the early part of the book, forced to leave his family behind entirely after involvement in a deadly hit-and-run, and then, after a long-shot encounter with his rich uncle in Chicago and going to Lycurgus, New York to try his chances with him, kept at more than arm's length by that uncle and his family even as they bar him from association with anyone else--with that alienation and isolation and loneliness at every turn playing its part in the disaster that ultimately befalls all concerned. The infamous atomization of American life ("emotional destitutes, as poor in their family connections as Afghans or Sudanese are in money," none other than Edward Luttwak was to write) makes it entirely fitting that such atomization plays so great a part in Clyde's story.

Wednesday, March 30, 2022

Are Moral Panics Less Common Than They Used to Be?

Recently revisiting the moral panic over the release of Joker it struck me that such panics seem less common than they used to be--for example, in comparison with the 1990s.

Where explanation of the matter is concerned, my guess would be that this is because of the fragmentation and polarization of the public--or at least, that fragmentation and polarization becoming much more difficult to overlook. It seems to me fairly obvious that any really large group of people--and certainly a country of hundreds of millions--can be imagined as collectively attentive to the same thing in the same moment, and reacting to it in the same, conventional way, only in the hyperbole of court poets singing for their suppers.

Did we all attend to, for example, the grotesque stupidities of the Tonya Harding-Nancy Kerrigan melodrama with bated breath in the 1990s? I can most assuredly tell you that many of us were more interested in just about anything else you may care to name (and appalled when the venal idiots who run the mainstream news happily let it crowd other, more important events out of the headlines in their gleeful attentiveness to it).

But now in an age of digital media, in which the most extreme or idiosyncratic views have some chance to enter the public's consciousness, the pretense looks that much more foolish.

The result is that even the court poets (not an impressive bunch--court poets generally aren't) find it far more difficult to sing songs of a nation united in alarums and excursions, especially over the stupid things that usually inspire such. Rather the panics, which I suspect are as frequent as ever in this age of ever-escalating cyber-mainlined fear, shock and outrage, and perhaps even more so, simply happen to be the panics of particular portions of society, not the whole.

Monday, March 28, 2022

Of Isaac Asimov's Foundation Saga and H.G. Wells' The Shape of Things to Come: The Figure of Hari Seldon

In a prior post I discussed some of the parallels and the differences between Isaac Asimov's Foundation novels and H.G. Wells' The Shape of Things to Come--not least both tales envisioning the restoration of civilization by a science- and engineering-minded elite, with a significant difference being Asimov's reimagining of the Medieval Church as the Empire-restoring Foundation, whereas Wells' more radical book discarded religious pretenses and emphasized a new World State which would move past the weaknesses of the pre-Fall order.

Another significant difference would seem to be the sheer elitism of the conception, with the Foundation the mover of galactic events, and above and in back of it the psychohistorian Hari Seldon, who saw and planned it all out mathematically in a manner that can seem more divine than human, the more so for his odd technological deification--with the opening of the Vault containing his recorded messages in moments of crisis having the quality of religious revelation intended to speed the fulfillment of a "Divine Plan." (Indeed, in the second book, Foundation and Empire, Asimov has a character remark on "a whole culture brought up to a blind, blubbering belief that a folk hero of the past has everything all planned out and is taking care of every little piece of their . . . lives," a "thought-pattern [that] evoked religious characteristics . . . strong faith reactions.")

I can think of nothing comparable to Seldon in Wells, and certainly not The Shape of Things to Come. Certainly Wells is not without respect for individual achievement, and the movement establishing the World State in his saga certainly has its leaders, among whom one can find personal greatness. (Indeed, he is not without his elitist tendencies--it was, in fact, a significant element in his complex and increasingly unfriendly attitude toward Marxism and its reliance on the working class, of which he did not think much, a fact which is a significant theme of his Experiment in Autobiography.) Yet the protagonists of Shape, taken as individuals, are in the end humans--more intelligent than others, better educated, more responsible, but still, just humans and not gods--who faced facts that had been increasingly apparent to a great many people for a very long time, and acted accordingly. Unsurprisingly the trend of history, and within it the collective efforts of the Control, tower above any one figure. (Indeed, writing this post I easily enough recalled the Air and Sea Control, but I had to look up the names of specific characters in the book in order to recall them.)

Altogether I suspect that the conception of Seldon owes far less to Wells than to American science fiction's taste for scientific supermen who can singlehandedly bend the course of history, exceedingly evident in the Edisonades of the nineteenth century (in the American-style "invasion stories," for example), with which idea John Campbell seemed obsessed--not merely as a teller and editor of stories, but in actual life, as he went from one movement to another in pursuit of something that would render the vision real (Scientology, psi, etc.).

Today, looking at the worship of "tech billionaires" as saviors, and the way tales of superheroes fill our screens, it would seem that this particular legacy of older American science fiction remains very much with us now.

Of Isaac Asimov's Foundation Saga and H.G. Wells' The Shape of Things to Come

Looking back at the Golden Age of science fiction it has seemed to me that H.G. Wells was a hugely important influence. Certainly John Campbell's derivation of inspiration from Wells' The Time Machine in the early story "Twilight" is much discussed--while reading Campbell's discussion of his own ideas about the genre, and how others have interpreted them, has persuaded me that he largely adapted Wells' ideas to his own inclinations (a case I make in Chapter Two of Cyberpunk, Steampunk and Wizardry).

Perhaps unsurprisingly I found myself often thinking of Wells, and in particular what I have tended to view as the summary work capping off his career, The Shape of Things to Come, when recently reading Isaac Asimov's Foundation trilogy, with which Wells, and in particular that then-recent book--scarcely a decade old when Asimov undertook the writing of the Foundation saga--has been less associated. (Instead it is Edward Gibbon's The History of the Decline and Fall of the Roman Empire of which Asimov was to speak.)

In both of those tales civilization fell in a manner all too predictable to intelligent observers. And in both a small elite left over from "Before the Fall," with a greater understanding of social reality and a command of critical technical skills, built an organization that sustained the old knowledge and connections, and through its control of essential technology, exerted a civilizing influence (the Air and Sea Control in Wells), not least by letting it consistently outmaneuver the backward, violent elements that attempted to oppose or exploit it for their selfish ends (which maneuvering made the reconstruction of order a far less bloody matter than it would otherwise have been). There is also what both those premises bespeak, a hope for rational control of the course of human life and human history, with such control exercised by scientists and engineers who can apply to society the same kind of foresight and control they do to physical forces, not least by leveraging their various forms of expertise (technological as well as social) into political power.

There even seems some significant resemblance in the narrative structure, if not with Wells' book, then the cinematic adaptation of it, Things to Come. The 1936 movie took Wells' work (which had been written as a precognitive glimpse of an advanced history textbook from the twenty-second century, and read like one) and dramatized it by way of a series of illustrative and allegorical vignettes from the various stages of the history it covered--with the first presenting the experiences of a flier in the world war that wrecks civilization, the second a member of the Control (here known as "Wings Over the World") confronting and defeating a grubby warlord, and then, following a montage depicting the rebuilding of the world on clean, modern lines, the controversy that ensues in the new World State over a planned moon mission. Thus does Asimov's Foundation likewise proceed, not as a history textbook, but through such vignettes showing the work of Foundation founder Hari Seldon, the Foundation's first confrontation with the space age barbarians of Anacreon after the Empire's effective collapse a half century after the first vignette, etc., etc. . . . (entirely befitting, I think, Asimov's view of science fiction as a genre distinguished by its concern for humanity's collective destiny over the destiny of any one individual, a sentiment with which, I think, Wells would have been sympathetic).

Yet along with these impressive similarities I was struck by at least one significant difference, reflecting the intellectual inspirations and the politics to which the two writers were permitted to give expression. Asimov reimagined the fall of Rome and the Dark and Middle Ages on a galactic scale, with his Foundation early on appearing a variation on the Catholic Church, using its monopoly of old knowledge and its religious authority to bend the barbarians to its will, with an eye to the restoration of Empire--all as Asimov stressed the specifically intellectual failings of late Galactic Empire culture ("the damming of curiosity," the decline of the scientific ethos, with which Mayor Hardin charged the Foundation's scientists themselves). Certainly the Foundation does evolve, taking on new forms as the galactic situation changes in ways bespeaking a materialist historical sense (with its Church-like aspect giving way to a commercial empire when the "money power" becomes more important), and by the second book there is even an expectation that the new Empire will somehow be better than the version which preceded it--but of that later, better state we learn no details. By contrast Wells, who was more openly radical than any writer had the opportunity to be in the pages of John Campbell's Astounding--or, for that matter, any really major science fiction writer in Red Scare-era Cold War America--described not a process of pseudo-Roman decline and Medieval restoration of civilization and its benefits, but explicitly offered a radical critique of the social system that had gone before, directed at capitalism, the nation-state, and the traditional culture of religiosity, nationalism and other elements accompanying them, as out of step with the practical governance of human affairs, as demonstrated when it ineluctably led to economic slump, war and civilizational collapse.
The point was therefore not to put what had fallen apart back together, even in tweaked form, but to fill the vacuum with the kind of government the world had really needed all along, and suffered for lacking (explicitly depicted as a scientific, socialist World State clearly, profoundly different from what had stood before).

Still, even allowing for the difference that can itself seem to sum up the tight boundaries within which American genre science fiction developed ("stories based 10,000 years in the future where all the sciences have progressed fabulously except for one," as Mack Reynolds protested in Who Killed Science Fiction?, specifically drawing attention to the feudalism of the Foundation stories), there was still something of the Wellsian regard for science, reason and their potential to order human affairs in a better way that has become much, much less evident in science fiction since, as seen in its proneness to treat the apocalypse not as a challenge to the guardians of civilization, but an occasion to wallow in reptile-brained misanthropy, whether amid the survivalism of the immediate aftermath of civilization's downfall, or the dystopias that, as in The Hunger Games, rose from the ashes to bring the survivors new nightmares.

Saturday, March 26, 2022

The Rules of Post-Apocalyptic Fiction?

I recall some years ago encountering a page on some web site that purported to offer advice to writers of post-apocalyptic fiction. The purveyor of said advice flatly told the reader that the characters in fiction of this type should have no time to think of anything but immediate physical survival among the ruins.

This piece of "advice" annoyed me greatly. Admittedly I am generally suspicious of one-size-fits-all advice broadcast to the world at large as, even when actually, technically, correct (as it is not much of the time), overwhelmingly likely to either be so obvious that no one really needs to be told that advice (like "Don't take a nap in the middle of a highway at rush hour"), or be completely unusable by the vast majority of people in their actual life (like "If you care to abide by the proper protocol, do not bow from the waist when you meet the Queen of England in person for the first time").

I am particularly suspicious of the sort of writing "instruction" which, in an area that (in contrast with, for example, the forms of writing a business letter, or the conventions of academic papers) is supposed to be all about creativity and self-expression, barrages the students with "shoulds," and still worse, "musts." Certainly fiction tends to adhere to certain principles, with particular techniques achieving particular effects, and we can discuss them in rational ways in at least some degree, and it can certainly be useful to be familiar with them on that level. There is undeniably such a thing as craft knowledge that one will (probably) have to get one way or the other in order to produce something worthwhile. But even there one ought to be careful of "should" and "must," especially at such a basic narrative level as the aforementioned author discussed. This is not least because the fact that particular techniques achieve particular effects does not necessarily settle for us which effects we should go for, especially if we are not playing the usually self-defeating game of chasing after "what's hot now" (or the still worse game Oscar Wilde described of "degrad[ing] the classics into authorities," and using them "as a means for checking the progress of art"--of which cultural crime I only hesitate to accuse the dubious instructor precisely because they failed to show any knowledge of the classics, post-apocalyptic or otherwise, whatsoever).

Making matters worse, the author here seemed to me demonstrably wrong, and in ways all too telling of a present-minded shallowness. After all, there is much more to post-apocalyptic fiction than reptile-minded survivalism. There is also the struggle to move past survivalism--to save what remains of civilization, or rebuild it, perhaps along new and better lines. And this has seemed to me the far more compelling kind of story, as I have recently been reminded when picking up once more Isaac Asimov's classic, Foundation, most assuredly a piece of post-apocalyptic fiction in its recounting the struggle to abbreviate the Dark Age following civilization's collapse and relapse into barbarism. It is, of course, hardly a fashionable work these days, but that is as much, or more, because of its considerable virtues (its scope, its interest in society, its rationalism) as any failings it may supposedly have--virtues which have me rethinking yet again my old appraisal of the book, such that I expect I will have more to say about it and the original trilogy of which it was a part in the coming days.

The Small Screen Generation Gap: Why Old Folks Go for Comfort Food

It is an old pop cultural cliché that CBS is an older person's channel, apparently rooted in how that channel was king in the '70s and early '80s (home of Dallas and MASH and Norman Lear's sitcoms and much, much else), and then saw other channels pretty much eat its lunch in the following years. This gave the impression that people still watching CBS were overwhelmingly loyalists from its boom years, with this reaffirmed by the way such hits as it managed to have in the late '80s and '90s looked like shows for the old folks--Murder, She Wrote, Diagnosis Murder, Touched by an Angel (the kind of stuff that has since ended up as staples of the Hallmark Channels)--while the cool kids were watching FOX.

Now network TV's audience generally looks the way CBS' once did--one reason why, it would seem, the network that may well have catered to older viewers best is the current ratings champ.

Much of this generation gap may be a matter of simple habits--like, at the end of the day, when looking to relax, picking up the remote and turning on the TV and looking at a particular channel, rather than going online and sifting through streaming offerings looking for something to watch (arguably a more choice-intensive process)--habits reflecting the difference in what the old and young, respectively, look for from TV. Older people grew up on TV-as-comfort-food, which on some level remains their expectation, and good enough for them most of the time. This is all the more the case as they are more likely to find richer content elsewhere than younger folks--more likely to CRACK OPEN A BOOK, or just watch a movie that may not be a Big Dumb Blockbuster (such as they have so many of on, for instance, Turner Classic Movies)--and, again, prone to treat TV watching as distraction rather than vocation, as younger folks seem to do.

It may also be that these generational differences are not just a matter of growing up in different media eras, but of being at a different point in their lives. I suspect that young people, especially if they have had relatively comfortable, sheltered, constrained existences, are particularly attracted to material which is the exact opposite--the bigger, wider world, the sensational, the intense. After all, even where those lives are comfortable they don't have much freedom or power, and freedom and power are what they want to experience vicariously, while the combination of a certain age with a certain mentality means trying to show off, if only to oneself, just how tough and worldly and mature one is (when one really isn't). The edgelord and grimdark have a certain appeal in that place.

It isn't quite the same for older folks, who may not be wholly averse to the bigger, wider world, the sensational, the intense, but probably mostly just want a break—the more for their having learned to find entertainment in the familiar, the everyday. (There was a time when I couldn't imagine myself finding much pleasure in those fat "realist" nineteenth century novels, but a couple of decades on I was happily reading Balzac and Trollope.) Older persons have also moved past, and maybe become ironic toward, the fantasies of youth—not least, of freedom and power, seen through them to the illusions they are. And certainly they have nothing to prove to themselves or anyone else about how tough and worldly and mature they are--the wiser of them having realized they aren't that much of any of those things, that very few of us if any ever get to be so and that we'd be happier not going through what it takes to get there. ("Until a man is twenty-five, he still thinks, every so often, that under the right circumstances he could be the baddest motherfucker in the world . . ." Neal Stephenson wrote in Snow Crash's most memorable line. Well, they're not twenty-five anymore and they're damned glad the "right circumstances" never came up even if they really could have been so bad.) Edginess? Darkness? At this stage of things they have probably had more edgy experiences than they would like--had their fill of darkness, especially if they have learned just what a dark place that everyday world is, noticed and processed the things the young don't notice and haven't processed. And so they may be up for an adventure or a drama every now and then--but perhaps more often would prefer to have a laugh instead.

Reflections on Network TV as Comfort Food

Not long ago a TV critic who, like most of his ilk, worships at the altar of "prestige TV," reported eschewing it for a mere week and watching network TV instead as an experiment (the Nobel committee must be informed at once!) and then discussed his findings. One (for him, anyway) significant theme of those findings is that network TV tends toward "comfort food." Of course, I am not sure how much that can be said of network TV today as against before. He talked quite a bit about reality TV, which I have regarded as nasty stuff from the start, while sitcoms certainly seem to have got much meaner over the years--with the asshole behavior on Friends giving way to the still more thoroughly asshole behavior on How I Met Your Mother or The Office (which contributed much to the current, wildly excessive use of the word "cringe"). And when it isn't reality TV or sitcoms it is more often than not procedurals dealing with serious, life-and-death stuff--crime, medical emergencies, and the rest, in increasingly graphic fashion (CSI a long way from the old Quincy episodes).

Still, precisely because of the seriousness of the subject matter the procedurals seem especially worth mentioning. Even as the images got increasingly graphic they still tended to deal with ugly things in a way that left order upheld by episode's end--making comfort food (or at least, what some were ready to accept as such) out of ingredients that looked unpromising at first glance. And it seems worth thinking about why this was so characteristic of TV. The most basic thing seems to me to be that, given the limited resources early TV had to work with, and the tightness of the censorship under which the makers of television had to work in the form's early days (you couldn't even show a pregnant woman on TV), comfort food was the medium's most plausible offering--a fact reinforced by the structure of the market. The competition was limited among a few outlets each going for mammoth shares of the market, while "prime time" was king--which is to say people sitting on a comfortable piece of furniture at the end of the day and staring at their screen like cavemen at the fire in their hearth.

Between the limited options, and sheer weariness--TV was what you watched when you didn't have the energy for anything else, like reading a book, or going out for a movie--people easily became habituated not to seeking out the scintillating, but leaving on what was least distasteful, with the name of the game keeping them from changing the channel at the end, one show leading to another to another so that you had them through the night, week after week, year after year. Where the broader audience is concerned it's easier to get by, especially if your material isn't all that great, by not taking yourself too seriously. (Indeed, looking at old reruns of Magnum P.I., The A-Team, MacGyver, I am constantly reminded of just how much those action-adventure shows got by on their sense of humor.) Meanwhile the practical difficulty of accessing the content (if you didn't catch it when it was aired, that was pretty much it), and the importance of summer reruns and a lucrative syndication market, meant there was a heavy stress on "rewatch" value, on things people would be pleased, or at least willing, to see again and again. This, too, encouraged the making of material people could take easily in bite-sized pieces, again and again. (Think of how we watched certain episodes of, for instance, the original Star Trek, or The Simpsons, until even their minutiae became part of the common frame of reference--like a beard that wasn't there before denoting the evil parallel universe version of an individual, or the way just about every line of dialogue in the first eight seasons or so of The Simpsons seems to have its own Reddit page, its own memes.) All of that encouraged a lowest common denominator logic that, again, pushed producers toward offering up comfort food, and audiences toward taking it.

All of that has since changed. The resources available to producers expanded, the censorship to which they were subject by no means disappeared but became more relaxed in respects that were not insignificant, the competition intensified, and people started treating TV-watching as a serious cultural activity, the more so as it became so much more accessible--no longer something looked at only at the end of the day but something they watched on the smartphone on the way to work, or even at work when they should have been working. Meanwhile, their consumption of other media changed. Certainly no matter how much some in the press deny it, people read less, and what they read is, well, not what it once was. (I still can't get over the fact that John le Carré was one of the biggest bestsellers around back in the '80s. Today even Robert Ludlum is likely too tough, as indicated by how a "young adult" version of Dan Brown is thought to be called for.) At the movies what we get are "Big Dumb Blockbusters" that can make us, as Joe Bauer did, look back longingly to a day and age that had "movies that had stories so you cared whose ass it was and why it was farting." Thus TV has become that much more important as their cultural and intellectual "nourishment"--all as people have been positioned to see what they want when they want by conveniences ranging from programmable VCRs to streaming, and the concern for the rewatch value of the show in reruns has fallen by the wayside, again, because there is so much out there from which to choose.

The result is that there is much more opportunity--and incentive--to produce TV which is not comfort food. But that hardly means that there is no audience for the old comfort food stuff, even if there would seem a generation gap here, with the Big Three/Big Four networks counting on those older audiences (one reason, I suppose, why they have been cranking out "new seasons" of The X-Files, Roseanne and Law & Order), while young people eat up the rather less comfortable stuff the other cable channels, streaming and the rest serve up as a matter of course.

Friday, March 11, 2022

Sam & Cat & the Techno-Hype of the ‘10s

Those who were attentive to children’s television in the late ‘00s and early ‘10s are likely to remember Dan Schneider’s Nickelodeon sitcoms iCarly and VICTORiOUS--the former one of the bigger hits of those glory days for original basic cable programming, drawing in as many as 11 million viewers at its peak (numbers a Big Four network would have been happy to see in its own prime time programming). Of course, even those hits ran their course, and a predictable recourse on Schneider’s part was pairing two particularly well-received characters, Sam Puckett from iCarly and Cat Valentine from VICTORiOUS, in a spin-off show, Sam & Cat (2013-2014).

I suspect that for most the show is memorable merely as a minor element in the Schneiderverse that added a little more iCarly and VICTORiOUS content, and a footnote in the careers of the personages involved. However, recalling Sam & Cat a few years on, I find it interesting as a time capsule from the prior round of techno-hype. After all, wacky as the adventures of the two leads could be, the show was never presented as science fiction in the manner of, for example, Schneider’s superhero-themed sitcom Henry Danger. It was set in the "mundane" present. But all the same visits to a restaurant with robot waiters were a regular feature of the episodes (the eatery "Bots" appeared in 21 of the 35 episodes), while one episode revolved around the chaos that ensued when a drone delivery went very badly (the drone carrying off a baby the two leads, who make their living with a babysitter service, were watching for a client). That was a reflection of how near at hand the technologies were thought to be--an impression the show itself seems to have furthered, inspiring a restaurateur in the Chinese metropolis of Ningbo to use robot waiters himself.

Of course, eight years after the show’s end what was supposed to be just around the corner remains remote--our restaurants still reliant on human servers to the point that robotized service remains sufficiently a novelty to bring in tourists and, when one shows up in your town, warrant mention in the local news, while you are very unlikely to see a drone bringing you your next retail order (and that in spite of two years and counting of pandemic-related disruption and a flood of inflationary cheap credit that gave business ample incentive and opportunity to aim for more automated operations). Just as the technology was "not yet there" in 2014 it does not seem to quite be here in 2022 either, with the result that Sam & Cat unintentionally looks a bit like The Jetsons has to later viewers, yesteryear’s vision of tomorrow.

A Note on the Directorial Career of Elaine May

Where the history of the "New Hollywood" is concerned some figures, of course, get far more attention--and others less--in ways that do not seem wholly explicable simply on the basis of their presence or the influence of their contributions (for better and worse), with two factors seeming to me to be worth noting.

The first would seem to be the particular importance accorded directors in that era, auteur theory-minded chroniclers often making it a story of auteur directors--a Robert Altman, a Peter Bogdanovich, a William Friedkin, a Francis Ford Coppola, a Hal Ashby, a Martin Scorsese, a Michael Cimino. As a result those New Hollywood figures who were a huge part of the scene, and even directed notable films, but on the whole made a bigger mark as screenwriters than as directors--a John Milius or a Robert Towne or a Paul Schrader--get that much less attention.

The second is that the New Hollywood tends to be identified with particular themes and a particular aesthetic—work which was socially critical, genuinely gritty, and taboo-breaking and envelope-pushing with regard to what could be put on the screen, with this extending not only to sex, violence and politics, but to cinematic technique (with Peter Biskind remarking the "Brechtian" tendency in '70s Hollywood). The result was that an auteur New Hollywood movie that did not fit the pattern has been less likely to be recognized as part of that scene--with an obvious example George Lucas' Star Wars (very much a New Hollywood film in a lot of ways, but very far removed from this stereotype, and indeed much more likely to be talked about as the beginning of the post-New Hollywood of crowd-pleasing merchandised, franchised action movie blockbusters).

Considering Elaine May it seems to me that attentiveness to her career--which had her very much part of that scene, from her time as Mike Nichols' comedy partner forward, and her collaboration with Warren Beatty, as well as a maker of three major feature films in her own right in the 1970s (complete with typically New Hollywood battles over "final cut") and the very New Hollywood denouement to her time as a director with her fourth, Ishtar (bad buzz full of accusations of a crazed director going wildly overbudget on an unworthy project, movie critics dutifully bashing it in an atmosphere of frenzy, etc.)--has suffered in the same ways. As with Towne and Schrader her status as a screenwriter (with credits on, among other films, Heaven Can Wait and Reds and Tootsie, and even in the '90s Wolf and The Birdcage and Primary Colors) overshadowed the reception for the movies she helmed. And as with Star Wars her films, like A New Leaf and The Heartbreak Kid, while deservedly regarded as classics, are a far cry from the New Hollywood stereotype, while, if perhaps coming closer as a crime drama, 1976's Mikey and Nicky was badly chopped up by the studio and practically buried (so much so that her career as a director could be considered to have already ended once before)--all as 1987's Ishtar got an undeservedly bad reception, precisely because of the ways in which it was actually New Hollywood (not least, its politics). Still, recent years have seen Ishtar undergoing a rehabilitation (very belated, and perhaps limited in comparison with other New Hollywood classics, but a rehabilitation all the same), which may, besides leading to a reevaluation of her work, also contribute to a fuller understanding of New Hollywood's history extending beyond the clichés--and with it, just how it was that Hollywood has ended up in its current, artistically none-too-fortunate state.

Ishtar, Vindicated?

In the waning days of the New Hollywood, when the Suits were getting their own back and breaking the artists to their Neuordnung of executive-controlled high concept guck, the press--then as now dominated by sycophants, if perhaps not quite so totally--followed a particular pattern in regard to the release of movie after movie. Simply put, a filmmaker who had rubbed the execs the wrong way would, in the midst of a major production, be accused of going "out of control" in the pursuit of something not worth pursuing--going wildly overbudget in their obsession with some half-baked vision. Afterward the critics would dutifully say the movie was terrible, the film would be deemed a box office flop, and the filmmaker would never get another chance like that again.

Thus did it go with Francis Ford Coppola on Apocalypse Now, and Michael Cimino on Heaven's Gate. Of course, the reputation of those films has improved greatly since they got that treatment, to such a degree that it is they and not their duck-talking detractors who have been vindicated by history. It may have been too little, too late, to save their careers (whereas the careers of said hecklers, of course, went unscathed), but for whatever it may be worth that late vindication demonstrated the primacy of industry politics, and often politics politics, as "liberal Hollywood" cracked down on those artists who were even a little too far to the left of the liking of those in charge. (Heaven's Gate star Kris Kristofferson, certainly, has been very frank about what he believes happened with that movie, declaring that in a political atmosphere in which the Attorney General of the United States would go so far as to tell "studio heads that 'there should be no more pictures made with a negative view of American history'" the "right-wing revolutionaries" of the Reagan period "assassinated my last real movie.")

As it went with Apocalypse Now and Heaven's Gate so did it go with Ishtar, an Elaine May-helmed film costarring Warren Beatty (the folks who brought you Reds), with both a big budget and an outlook that can, from the standpoint of the Rambo-Top Gun '80s, seem audaciously leftish, with the CIA a backer of brutal despots who live in luxury while keeping their people in grinding poverty ("The dome of the emir's palace is gold, and the people of Ishtar have never seen a refrigerator"), and the Agency a picture of both murderousness and incompetence, and the good guys the rebels fighting their client, who win in the end.

I have to admit that there seems less prospect of this movie ever enjoying the "unfairly maligned masterpiece" status Apocalypse Now or Heaven's Gate has acquired over the years. May's film included some clever ideas, like the casting of Warren Beatty as the confidence-lacking, forlorn "loser" and Dustin Hoffman as the self-confident "ladies' man," but this was probably too "'60s inside joke" to go over well with the general audience in 1987 (or after), while the movie never comes close to the comic flair of May's prior A New Leaf or The Heartbreak Kid. Those movies' best scenes had a tightness about the writing and direction--literally tight, in that they tended to take place among people clustered around a small table (think, for example, of Charles Grodin's Lenny Cantrow asking Eddie Albert for Cybill Shepherd's hand in marriage, or trying to break up with his wife Lila on their honeymoon, in The Heartbreak Kid), with this reflecting the source of the comedic tension, namely the collision between the aggressively one track-minded stupidity of the protagonist and the hapless individual unfortunately in his path with no way to turn. Alas, May is working on a broader canvas here--as wide as the Sahara in which the story is set--while the characters as written lack that forcefulness. (Looking at Dustin Hoffman's Chuck Clarke and Warren Beatty's Lyle Rogers I found myself remembering what Roger Ebert said of A Night at the Roxbury, calling it "the first comedy I've attended where you feel that to laugh would be cruel to the characters." Thus did it seem in regard to Clarke and Rogers--the latter of whom in an early scene attempts suicide after his girlfriend breaks up with him.) The approach is different, the characters different (even the ladies-man-in-his-own-mind Clarke is no Cantrow), and I came away with the impression that May, while admirable in attempting to do something different, had in this case undertaken an experiment that did not entirely work.
Still, that leaves Ishtar a good deal better than its longtime "one of the worst movies ever" status, and, even if, again, this is happening after the damage to her career (and Beatty's) was done, it seems only fair that the movie's reputation has been improving in recent years--and a reminder of the real politics of Hollywood, and the entertainment press, in the '80s and since.

The Hypocrisy of Respectful Disagreement--and the Scarcity of Genuine "Respect" in the World

I have previously remarked the reality that, especially if one takes the word respect to mean what it has long meant, deference, then "respectful disagreement" is a contradiction in terms.

Yet people insist that there can be such a thing as that.

Apart from the not inconsiderable reason that they don't know the meanings of simple words, why do so many insist on that?

Simply put, they don't want to give up their disagreement, but neither do they want to admit that they are being disrespectful--while, perhaps, determinedly overlooking the disrespect with which they are being treated for the sake of avoiding confrontation.

The pretension of respectful disagreement also enables them to overlook the reality that the "respect" of which we hear so much overwhelmingly derives from sources other than genuine esteem for those upon whom respect is bestowed. "The essence of any hierarchy is retaliation" Samuel Shem wrote in The House of God, and so does it go with the "respect for authority" incessantly propounded by so many as the highest of virtues--respect far, far more likely than anything else to be a matter of tribute extorted by the more-powerful from the less-powerful in a manner that, like so much else we take for granted, gives the lie to all our talk of freedom, equality, individual dignity and the rest.

It also reminds us that what respect is supposed to mean in many minds is a rare thing indeed. After all, respectfulness as something other than an outward show of deference to another out of fear of a blow requires humility, patience, self-restraint on our part. It means admitting that perhaps, just perhaps, another person knows better than we do (as in "I respect your knowledge" or "I respect your judgment")--or that even if we are sure we really do know better we have no right to dispute their decision (as in "I respect your privacy"). It means that because of this we listen rather than speak, and resign our right to a say.

That said, now may be a good time to ask yourself honestly just how many people in your life have actually displayed such qualities, as against how many are loudmouthed idiots who think they know best about all things and insist that everyone hear what they have to say, all the time, on everything, regardless of whether they want to or not, no matter what, commonly inflicting on those around them not only their insufferable disrespect but what may be their even more insufferable stupidity. If there is even one person in your life who genuinely has these qualities you can count yourself lucky--and I dare say, owe them at least a measure of genuine respect for having those qualities.

'90s Ludlum: Notes on The Apocalypse Watch

In thinking about Robert Ludlum’s career I have tended to compare the Ludlum of the '70s, and especially the early and mid-'70s, with the later Ludlum--who produced a lower output of larger and more formulaic books that significantly reused older ideas, and relied less on their political edge, or plain and simple suspense, and more on shoot 'em up action for their interest (books like The Bourne Identity and The Parsifal Mosaic rather than Trevayne or The Chancellor Manuscript). Still, reconsidering what I think of as Ludlum's three "Fourth Reich" novels--the '70s-era The Holcroft Covenant, the '80s-era The Aquitaine Progression, and the '90s-era The Apocalypse Watch--it seemed to me that there was something worth saying specifically about the Ludlum of the '90s, whose work did, indeed, seem distinctly of the decade, certainly to go by that particular book.

A logical starting point is that particular ‘90s trait, the thinness of the political premise of The Apocalypse Watch. What are these "Nazis" all about? What exactly are they looking to do and what is driving them, and enabling them? In contrast with the prior books' evocations of history, the sense they offered of the villains' demons, the high places in which those villains were situated, the sense of society's vulnerability to the madmen and their manipulations, this time around the villains' objects and sponsors are discussed in only the vaguest manner, which along with their less formidable situation leaves them looking a hazier, weaker foe than in Ludlum's preceding uses of the theme, the more so for this book's particular plot structure. In those earlier books the villains seemed like giants, and the hero very small next to them, because of how untouchable the villains were, and how these untouchable villains put a target on his back, forcing him on the run. In line with this pattern, when writing professional intelligence operatives Ludlum made a point of alienating them from their organization--to the point that their own people are pursuing them with a readiness to kill (The Matarese Circle, The Bourne Identity, The Parsifal Mosaic). By contrast The Apocalypse Watch's Drew Latham comes off as merely a cop walking a beat, and in line with that duty tackling a pack of particularly nasty but essentially commonplace gangsters. (It factors into the sense of cop drama-ness about it that in contrast with the usual globetrotting Ludlum narratives Latham's activities are mostly confined to a single metro area--while it is all too apt that in the midst of his running about his superior calls him a "loose cannon.")

As one might guess the scenario, even beyond its familiarity, has less edge, and the author endeavors to make up for it with action, the reliance on which is exceedingly heavy even by late Ludlum standards. The book really piles shootout atop shootout atop shootout--indeed, piles commando raid atop commando raid atop commando raid--through its six hundred pages. At their best serviceable rather than, to use that descriptor Ludlum liked so much, "brilliant," the sequences' details become repetitious, two separate scenes of the type entailing the heroes' stumbling upon prostitutes who just so happen to have been hired by their targets, and whose disgust with their "johns" makes them amenable to aiding the commandos' infiltration of the target’s residence. Connected with this was another break with Ludlum's usual pattern, namely the techno-thriller/sci-fi touches--from mind-controlling computer chips implanted in people's heads, to giant World War II-era Messerschmitt gliders, to a "They saved Hitler’s brain!"-type twist at the end--that, unfortunately, are of a piece with the comparative sloppiness of the plotting (with much in the overstuffed concoction amounting to little, and much not properly followed up, like, you know, the Nazis setting the Bundestag on fire).

Meanwhile, in those portions of the book in which there isn’t actually shooting (there are some) the author offers up the kind of "snappy" dialogue that drags out a scene without enriching it, and which while far from unprecedented in Ludlum’s work, still felt as if it belonged to a different "voice." Where Ludlum himself remarked particular characters of his older works as sounding like they came from the "thirties . . . [of] the late night television films" (The Matlock Paper) or "thirties Harvard with . . . pretentious emphasis" (Trevayne) in apparent obliviousness to how much his books in general sound like that, here what the dialogue recalled was stereotyped hard-boiled crime, with the evocation displaying a self-consciousness and show-offiness and glee about its anachronism. This seemed all the more telling given the metafictional indulgences (the pseudo-hardboiled dialogue apart, a minor plot point centering on a character's attempt to buy an Aston Martin DB4, which car's appearance in Goldfinger is mentioned more than once, while Latham's father’s work as an archaeologist inevitably meant an Indiana Jones reference).

All in all picking up this Ludlum novel felt like picking up a big Jack Ryan hardback and finding a really long Op-Center novel inside. In fact, I would not be surprised to find that Ludlum had started the book (there is a genuinely Ludlumesque feel to some of the earlier material--something of his prose and his sensibility evident in the initial scene in the Nazi facility in Austria, and later the Jean-Pierre Villiers subplot), that the author had for whatever reason been unable to meet his schedule (burning out, perhaps, after so many years and books and amid the collapsing market for novels like these in the ‘90s?), and then handed the work over to some jobbing writer to finish it, who either decided, or was told, to go for a book that was more "contemporary" or "hipper," with this seeming the more plausible given how Ludlum was so soon to be "co-authoring" books in great numbers, by and large blander, virtually mass-manufactured books of the kind that have characterized so much more of popular fiction since--and which, when I find myself looking for a piece of light reading, send me running back to the more satisfying thrillers of an earlier day.

Monday, March 7, 2022

The Sopranos and the Decline of Television Criticism

As I have remarked in the past the country was clearly going off the rails mentally back in the ‘90s--but it knew it was going off the rails, and at least gave some sign of caring that it was going off the rails, in part because it occasionally criticized itself for doing so. And the main thing that distinguishes the cultural commentary of the time from where we are now is the demise of that self-awareness and that capacity for self-criticism (even if by that point they had been reduced to a mere flicker of what they once were).

One small example of that would seem to be how television criticism has changed. I remember, for instance, how unhinged the TV critics got in their "rapturous praises" of The Sopranos--and also remember a hilarious Saturday Night Live sketch that presented a commercial for the show’s upcoming season, saturated with quotes plucked from (mock) reviews claiming that The Sopranos "will one day replace oxygen as the thing we breathe in order to stay alive" (!).

Since then it seems that this kind of over-the-top praise has become a common mode for TV criticism, particularly for the more prestigious cable/streaming productions (They're mad for Mad Men, euphoric about Euphoria! Breaking Bad! House of Cards! Ad nauseam.)--and no one thinks twice about it. Instead "everyone" insists that this is the Golden Age of TV--because the critics tell them it is, and they have lost their own critical perspective on those critics.

What happened to make it so? I suspect one factor was that so much TV became such a "middlebrow," "Midcult" affair--using the "modern idiom in the service of the banal," as Dwight Macdonald put it, doing arthousey things that made the typical "educated" person think it was art even when it wasn’t. An aspect of this is how television became so much more complicated even when it wasn’t being particularly intelligent--with Arrow a good example of this, I think. What I saw of the show was pretty standard stuff as far as comic book action-adventure goes. (Thus did season four, for the millionth time, have a villain whose big plan for "regenerating" the world was instigating a massive nuclear exchange.) But the nonlinearity of the episodes, the fact that not just the "present day" material of the episodes was connected with season-length arcs but so were the constant flashbacks, with the former significantly evoking the latter, made the show demanding viewing, even when one could unkindly characterize it as stale and stupid.

If you will forgive what may seem a pretentious allusion--putting in the brain-work to follow the tale told by an idiot, to other idiots, for the delight of idiots, can make people who may not necessarily be idiots miss the fact that the tale signifies nothing. Thus do English students go on battering their heads against the middle finger to the principle of coherent, rational meaning that is T.S. Eliot’s "The Waste Land" a century after the publication of that unfortunate poem, in spite of the complete frankness of Eliot about just what it was that he did in writing it, while convalescing after a "nervous breakdown" ("just sayin’," as they say).

The students, of course, would not be doing this were they not unfortunate in having instructors who, monomaniacally fixated on claiming to find meanings in works other academics told them were "important" (they usually don’t make that judgment themselves), and completely accepting of the sheer, nihilistic depth of Modernism’s irrationality and anti-rationality in all its absurdity and contradiction without really understanding it, prove they don't understand it by presuming to dig such meanings out of such works in quasi-rational fashion, with points given for abstruseness and implausibility in what can seem more parlor game than scholarly enterprise. (John Crowe Ransom, whose The New Criticism, frustrating as it is, was for me indispensable to getting this stuff, wrote that the Modernists, for whom it seemed "the bottom [had been knocked] out of history and language and [they] become early Greeks again," thought a work's "ontological density . . . proves itself by logical obscurity"--its incoherence from the logical, rational standpoint proof that it is the truth about the world in which we live. So yeah, if this stuff seems crazy to you it’s not you, it’s them, definitely them.)

The critics at the more upmarket and mainstream outlets, being middlebrows, are those who accepted the instructors’ beliefs unquestioningly, or have come to imitate those who did. Naturally they fall for the trick every time, while, a scale ranging from "good to excellent" not being satisfactory anymore, they grade it all from "excellent to outstanding," taking the fulsomeness of their praises totally for granted.
