Sunday, December 23, 2018

"Are the Arts Just For the Rich?"

Watching Patrick Stewart talk about his childhood in William Shatner's documentary The Captains' Close-Up, I was surprised by his working-class Yorkshire upbringing. And afterward I couldn't help thinking that, where the current generation of British actors is concerned, none seem to have a background remotely like that. Look them up, and you are far more likely to find a graduate of Eton or an alumnus of the Dragon School.

Naturally it was with interest that I read a series of pieces in The Guardian on the phenomenon--some dealing with actors specifically, others with the arts more generally. The pieces were far less incisive than they could have been--not least in their use of the term "middle class." A famously fuzzy term, it can be used to deliberately cloud the issue--frankly, to downplay the extent to which a person from an extremely privileged background actually comes from an extremely privileged background--and it often is so used, not least by The Guardian itself when discussing artists. Consider this extract from its typical puff-piece profile of Killing Eve creator Phoebe Waller-Bridge:
Waller-Bridge grew up in an upper-middle-class family with baronets on both sides. Her father co-founded Tradepoint, the first fully electronic stock market . . . and her mother works for the Ironmongers' Company in the City.
On what planet is someone with "baronets on both sides" of the family, someone whose father co-founded a company with its own Wikipedia page (Tradepoint, later SWX Europe), merely "upper middle-class"?

Even in a culture so mendaciously pretending that "we're all middle class now," this is too, too much.

All the same, if the term is used in a way so loose as to be obfuscating (at best), the pieces offer a robust round-up of a sizable body of anecdotal evidence (Helen Mirren, Judi Dench and other luminaries observing that their chances would have been very different today), and a growing body of solid statistical study, that confirms what David Graeber (writing of America rather than Britain, though there is no reason to think any of this less applicable to the latter) remarked when looking at the social and class realities of intellectual and creative life: that "if your aim is to pursue any sort of value [but money]--whether that be truth . . . beauty . . . justice . . . charity, and so forth . . . [and] be paid a living wage for it"--those "jobs where one can live well, and still feel one is serving some higher purpose"--only those with "a certain degree of family wealth, social networks, and cultural capital" have much of a shot, while for the rest "there's simply no way in" (253).

In spite of which they are typically told by the "successful," and sometimes simply "everyone," that their failures are entirely their own fault, that they simply were not "good enough."

Which bullshit--the philosopher Harry Frankfurt's term certainly seems applicable here--is the conventional wisdom in an era that has never had much patience with artists' claims to support, or even to sympathy.

The attitude shows up the illusion that the contemporary world is a meritocracy, makes a mockery of its slogans about opportunity, and not only diminishes the lives of those cut off from such careers solely for having made the mistake of not being born rich and well-connected, but the arts themselves, and through them, society as a whole.

Bullshit Jobs and the New Hollywood

What one might call the "myth" of the "New Hollywood" of the 1960s and 1970s is that the artists' hubris led to the alienation of key partners, allies and sponsors, to personal antagonisms, and to artistic and commercial flops that destroyed their careers; that the limited appetite for artistically daring and politically radical content had been exhausted, as risque content became mainstreamed and as the politics of the country moved right, eliminating both inspiration and audience; and that there was a rush to imitate the success of Steven Spielberg's Jaws and George Lucas' Star Wars.

However, reading the relevant history I found myself struck by the changing business conditions above all. The New Hollywood happened in a period of sharp changes in media and entertainment (above all, the rise of TV), but also an interregnum between business models--the studio system was dying, but the hyper-financialized multinational multimedia corporation with its mission of barraging global audiences with "high concept" content had not yet established itself. The real, significant but still very limited margin of freedom that the New Hollywood artists enjoyed in between was inconceivable except in that interregnum, and came to an end with its close.

Interestingly, it is one of the many smaller topics David Graeber takes up in Bullshit Jobs, where his emphasis is on the aftermath of that closure, described as "a corporatization far more stifling than anything that had come before" (186). In the book's discussion the key element is the complication of the process of "development," which has seen vast numbers of executives successively signing off on a project while getting in their two cents--a case of too many cooks spoiling the broth, the more so because none of them knows a thing about cooking, and they are often just trying to justify their existences (because, by and large, they know nothing of film, and by any reasonable measure, epitomize "bullshit" in the sense Graeber writes of it).

As it happens, contemporary Hollywood comes up a second time in the book, this time in regard to its closed nature. As Graeber remarks,
Look at a list of the lead actors of a major motion picture nowadays and you are likely to find barely a single one that can't boast at least two generations of Hollywood actors, writers, producers, and directors in their family tree. The film industry has come to be dominated by an in-marrying caste (252).
The combination of the industry's high-concept, global-market imperative with this production process that looks like a recipe for incoherence or worse, and with the bizarre situation in which the film industry is dominated by a closed "aristocracy" (his term), does not seem at all irrelevant to the combination of artlessness and extraordinary divorce from lived reality of what passes for "cinema" in our time.

Sunday, December 16, 2018

Are They Trolling Me? A Test

I have already written a blog post about spotting trolls, but thought I'd present some of those thoughts again in a somewhat more systematized, handier fashion--in the form of a test to which people can reasonably put an interlocutor's statements. Admittedly it is based on limited personal experience; I don't claim that it is scientific or pretend that it is foolproof. But I find the essentials handy, think others might find them handy too, and hope that feedback from others might help develop this into something better. So here goes--a set of questions with differing weights.

1. Did they butt into a conversation already ongoing between other people? (10 points)

2. If they butted in, was it into the conversation of people who clearly had a very different attitude toward the subject matter? (For example, a #MAGA butting into a conversation between two #Resist in a thread started by another #Resist account's Retweeting an item from a progressive media outlet?) (10 points)

3. Are they just using this as an occasion to inflict talking points on people they seem well aware have no interest in hearing them (rather than engaging others in a manner indicative of genuine interest in their thoughts and opinions)? (20 points)

4. Are they demanding that other people provide lavish defenses for their opinions? (10 points)

5. Are they trying to press the other person into agreeing with them? For example, do they keep repeating their talking points in a manner suggestive of pressuring them, or asking the same questions over and over again after the object of their questioning has given them all the answer they have to offer? (20 points)

6. Do they use insults or make personal attacks? (30 points)

Bonus Question: Does their profile flaunt a troll's attitude? Trolls are often proud of their activity, such that I have seen some declare that they are trolls in their Twitter profile, while others, only slightly less subtle, boast in transgressive ways about their mean-spiritedness and sadism to shock, offend or intimidate, in their imagery as well as in their words. (This can include reference to or glorification of violent acts in their handles and profile pictures; it may also include the celebration of much maligned figures or evocations of their ideology as studied provocations.) (40 points--and just 40 points because even a troll might not be a troll all the time.)

Ask each of these questions in regard to the suspected troll and add up the points warranted by each "Yes." Someone who scores a 30 is suspect as a troll, someone who scores a 40 is likely to be one, and someone who scores a 50 or more very probably is.

The first six questions allow for a score of 100, while adding in the bonus question permits a maximum score of 140.
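For anyone who would rather see the arithmetic laid out than do it in their head, here is a minimal sketch of the test in Python--the weights and thresholds are the ones given above, while the function and variable names are simply my own:

# A minimal sketch of the scoring test described above.
# Question texts, weights and thresholds come from the post;
# the names of the functions and variables are my own.

QUESTIONS = [
    ("Butted into an ongoing conversation between other people", 10),
    ("Butted into a conversation among people of a clearly different attitude", 10),
    ("Inflicting talking points rather than engaging with others' views", 20),
    ("Demanding lavish defenses of other people's opinions", 10),
    ("Pressuring others to agree (repeating points, re-asking answered questions)", 20),
    ("Using insults or personal attacks", 30),
]
BONUS_WEIGHT = 40  # profile flaunts a troll's attitude


def troll_score(answers, bonus=False):
    """Sum the weights of every question answered 'yes'.

    `answers` is a list of six booleans, one per numbered question.
    """
    score = sum(weight for (_, weight), yes in zip(QUESTIONS, answers) if yes)
    if bonus:
        score += BONUS_WEIGHT
    return score


def verdict(score):
    """Map a score to the rough categories given in the post."""
    if score >= 50:
        return "probably a troll"
    if score >= 40:
        return "likely a troll"
    if score >= 30:
        return "suspect"
    return "probably not a troll"


if __name__ == "__main__":
    # Example: butted in (Q1), into a hostile-audience thread (Q2), plus insults (Q6).
    answers = [True, True, False, False, False, True]
    s = troll_score(answers)   # 10 + 10 + 30 = 50
    print(s, verdict(s))       # -> 50 probably a troll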

As that suggests, trolls, by nature unsubtle, tend to give themselves away very early on in the game. I think it best to take the hint, and best that you do the same. Do not fall for the lie that you are being a thin-skinned "snowflake" or intolerant of "free speech." Accusing people of being thin-skinned for simply not accepting abuse is what a bully does. ("What, can't take a joke? Don't you have any sense of humor? You're no fun.") And respect for freedom of speech does not mean that you are personally required to spend every waking moment of the rest of your life being an audience for people who disagree with you--and still less, people who are promoting ideas that are genuinely loathsome. (I'm making a value judgment here, and not apologizing for it.) Feel free to cut a troll off as soon as you have become convinced of their purpose. Mute, block and REPORT as seems appropriate to you.

Sunday, December 9, 2018

What Makes a Troll a Troll?

We all know that the Internet, and especially social media, are inundated with trolls--pathetic, repugnant, dark triad-afflicted losers (narcissistic, Machiavellian, psychopathic) whose idea of a good time is ruining them for everyone else.

Many will acknowledge that it is a bad idea to respond to a troll. They are uninterested in anything you have to say, only in distracting you from what you were doing, knocking the conversation off track, making you waste your time and effort dealing with them, and causing whatever harm they can. This may be so much the case that they will say things they don't mean simply to satisfy their desire to disrupt and to hurt--but much of the time, probably most of the time, they believe what they say, if not always wholeheartedly.

But how does one tell the difference between trolling and an opinion one simply does not like?

I find that trolls tend to butt into ongoing conversations among other people--usually, people of quite a different sensibility than themselves. (To choose an admittedly non-neutral example, one might find a few people who might all be considered left-of-center responding to an item to which such persons might be expected to be more attentive than their counterparts on the right when a right-winger suddenly turns up.)

I also find that when they do butt in they do one of three things:

1. Fling insult and abuse. (All they can do is call the participants in the conversation stupid--which often betrays that this is exactly what they are.)

2. Rub their opinions in the faces of people they expect to be repulsed by those opinions, rather than try to actually discuss anything. It is a little harder to be sure of this than, for example, insult. Still, there are giveaways. Such opinions are typically canned, often by someone else (they don't actually do much thinking for themselves, or they'd have better things to do than this), and commonly irrelevant to the conversation into which they have entered. (I recall a thread where people discussed the President's reply to the Thanksgiving Day question "What are you thankful for?"--and only that--and someone felt the need to inject the claim that "Socialism killed 100 million people.") When others react, they commonly display satisfaction, perhaps repeating the action.

In some cases I have had the impression that they are like exhibitionists, deriving satisfaction from others' disgusted reactions. In others, they seem to delight in lobbing a grenade into a crowd of bystanders. In still others, it seems they are more purposeful--intent on diverting a discussion, for example. (I recall the comments thread of a news story about the Netherlands' police training eagles to catch drones, and seeing it from the start diverted into an attack on solar energy, with the hundreds of comments that followed caught up in the ensuing flame war. Alas, renewable energy and sane climate policies are very common troll targets.)

3. In the rare case that they are able to actually interact with others regarding the subject, they behave in very unreasonable, bullying fashion. They do not ask for an explanation of another person's opinion; they demand that they defend it, and raise the bar for such defense very high--asking for the equivalent of a doctoral dissertation, complete with footnotes, as the price of their having opened their mouth. Even after the person has given such explanation as they have to offer, the questioner refuses to accept that the other person has said their piece and keeps coming at them, as if intent on making them recant, on converting them to their cause. (I have, unfortunately, found myself having such talks with proponents of atomic energy who want me to "admit" that renewable energy can never be the foundation of our energy base.)

Can normally decent people find themselves acting in ways similar to this? It's not impossible. Annoyance, a foul mood, and they slip into some bad behavior. But I think one can go too far with the "Everyone can be a troll" line, and certainly anyone who has doubts about a particular interlocutor can, on Twitter at least, just look at their account, see the way the suspected troll has chosen to present themselves to the online world, see the things they choose to post and share.

Some I have seen proudly and not at all ironically write the word "Troll" in their bio. And I find it best to take them at their word.

When I run into someone fitting the profile described here, I don't mute the conversation, I block them. Permanently.

Yesterday, one of my comments proved to be pure troll-bait, alas, inciting dozens of attacks from people who responded in exactly these ways. I have blocked each and every one, and at the time of this writing, find myself continuing to block them--setting a one-day record, which is, I suppose, why I buckled down and wrote up this post.

I strongly urge everyone else to similarly block those who conduct themselves in such a manner. Trolls, especially as defined here, may have the right to speak, but you have no obligation to listen to them, let alone answer them. They have no right to take your time and attention, let alone inflict harm upon your mental well-being.

Do not let them do it.

Wednesday, October 31, 2018

Yes, Tax Breaks ARE Subsidies

Many have encountered the claim, typically made by right-wing and especially libertarian commentators, that tax breaks are somehow not subsidies.

As is usually the case when libertarians and other champions of the most hardline economic orthodoxy make such "educational" pronouncements, the assertion is, all at once, utterly contrary to any conventional understanding of economic reality; Olympian in its arrogance and contempt toward anyone who would presume differently; and opaque in its reasoning.

After all, Investopedia, no bastion of left-wing, communistic thought, defines a subsidy as:
a benefit given to an individual, business or institution, usually by the government. It is usually in the form of a cash payment or a tax reduction. The subsidy is typically given to remove some type of burden, and it is often considered to be in the overall interest of the public, given to promote a social good or an economic policy.
Note that "tax reduction" is explicitly included in the definition they provide.

Indeed, one does not find the essentials contested even in the explanation of what a subsidy is provided in the Mises Institute article that usually seems to be at or near the top of the Google search results on the topic, "No, Tax Breaks are Not Subsidies." While the article does claim an "economic and ethical" difference between a cash payment and a tax break (the difference between keeping what you have and being given something taken from someone else), this does not in and of itself make a tax break not a subsidy. In fact, the article acknowledges that "entrepreneurs who take advantage of tax breaks will incur fewer costs than entrepreneurs who don't," and that such breaks are indeed "beneficial to those who claim them"--which is totally in line with the idea that, in conferring advantage on some business at public cost (in this case, forgone tax income, the burden of which is shifted to other parties, who accordingly bear a greater part of the tax burden or the burden of forgoing such expenditures), a tax break has the same practical costs and benefits as a subsidy.
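To make the equivalence concrete, here is a minimal sketch in Python with made-up numbers--the firm, the 20 percent tax rate and the amounts are hypothetical, purely for illustration: a cash payment and a tax reduction of equal size leave both the firm and the treasury in exactly the same position.

# Hypothetical numbers, purely for illustration: a firm with 100 in pre-tax
# profit, a 20% tax rate, and a benefit worth 10 delivered two different ways.

PROFIT = 100
TAX_RATE = 0.20
BENEFIT = 10


def net_with_cash_subsidy():
    """Firm pays full tax, then receives a 10 cash payment."""
    tax_paid = PROFIT * TAX_RATE                 # 20 to the treasury
    firm_net = PROFIT - tax_paid + BENEFIT       # 100 - 20 + 10 = 90
    treasury_net = tax_paid - BENEFIT            # 20 - 10 = 10
    return firm_net, treasury_net


def net_with_tax_break():
    """Firm's tax bill is reduced by 10; no cash changes hands."""
    tax_paid = PROFIT * TAX_RATE - BENEFIT       # 20 - 10 = 10
    firm_net = PROFIT - tax_paid                 # 100 - 10 = 90
    treasury_net = tax_paid                      # 10
    return firm_net, treasury_net


# Either way the firm nets 90 and the treasury nets 10.
assert net_with_cash_subsidy() == net_with_tax_break() == (90, 10)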

So where, one might wonder, is the argument?

The real objection of the piece's author to the term's use in this case appears later, when the article continues with the explanation that government is "not a wealth creator," but only a taker of others' incomes. Because of this, such seeming special advantages are really just reprieves from taxation on business, which the article compares in its various metaphors to a "life jacket in a sea of wealth redistribution," with capitalism only able "to breathe" through such loopholes. Indeed, the website followed up this piece with another article titled "The Answer to 'Unfair' Tax Breaks is More Tax Breaks."

In short, the "tax breaks are not subsidies" argument assumes that governments have no business taxing business, so that any such taxation is illegitimate (rather than a We-the-people argument for "No taxation without representation," their stance is a pre-1789 French aristocrat's arrogant "No taxes on us, ever!"), and accordingly any respite from such taxation is only just, natural, appropriate, and not at all a show of special favor.

None of this is self-evident to most. And when explained to them, most will flatly (and justly) reject the premise of the argument.

Moreover, it will appear not only that a subsidy IS a subsidy IS a subsidy, but that, as is so often the case, the arrogance and opacity are mere cover for unbelievably shoddy reasoning in the name of the self-serving ideas to which persons and institutions of this type are so prone. Like the claim that one can "never run out of" a given resource, or that involuntary unemployment "cannot exist," the economists' insistence that "tax breaks are not subsidies" is simply a matter of market fundamentalist theologians posing as social scientists, in the way that orthodox economists have been doing for at least a century now; of the staff of right-wing think tanks and the like pretending to be public intellectuals interested in and explaining the actual world rather than PR hacks for the interests that pay their salaries, utilizing theories formed and held without interest in or regard for the real world because they conveniently legitimize their political positions; of the inmates in the asylum insisting that they are really sane while it is everyone else who is crazy.

Saturday, October 27, 2018

George Orwell's Nineteen Eighty-Four: A Note

Some time ago a poll in Britain identified George Orwell's Nineteen Eighty-Four as the book that people most often lie about having read.

Naturally the ways in which we hear the book discussed often seem to miss the point. In line with the prejudices fostered by Cold War orthodoxy, they also identify it with socialism--an identification without which the book might not be nearly so widely cited and promoted as it is today.

Alas, the latter conveniently overlooks the fact that Orwell, while an anti-Stalinist, was still a socialist; and what he described in his book was not meant to be thought of as socialism but rather a pseudo-socialism intended to prevent the genuine kind from ever arriving. Put another way, it was intended to preserve the existence of inequality, a power elite, the mechanisms of oppression as ends in themselves. The key tool was the militarization of society--keeping it in a state of perpetual war.

I direct the reader--you know, the one actually reading the book rather than saying they did--to Chapter III of the book-within-a-book that is Emmanuel Goldstein's The Theory and Practice of Oligarchical Collectivism, "War is Peace." There, as Goldstein observes, the rising productivity of machine-equipped industrialized society was making it possible to eliminate "hunger, overwork, dirt, illiteracy, and disease" before very long, and indeed a "vision of a future society unbelievably rich, leisured, orderly, and efficient"--what we call the capital "F" Future of which the flying car is the symbol--"was part of the consciousness of nearly every literate person." Those in power saw that this would spell the end of any need or justification for "human inequality," and indeed for "hierarchical society"--of distinction conferred by wealth, of a world in which "POWER . . . remained in the hands of a small privileged caste"--with this "privileged minority," now functionless, swept away by the well-fed, literate, thinking lower orders who no longer had any use for them, and knew it.

Indeed, Goldstein declares that "[i]n the long run, a hierarchical society was only possible on a basis of poverty and ignorance." Deliberate restriction of economic output to prevent this proved politically unworkable (as the post-World War I stagnation and Great Depression demonstrated), but perpetual war afforded an option for "us[ing] up the products of the machine without raising the general standard of living," which indeed was the principal function of the eternal war amongst Oceania, Eurasia and Eastasia. After all, there is nothing like the cry "We're at war!" to make people stomach austerity, deprivation, repression; to make them think the stifling of dissent justified--with even liberal societies seeing war time and again prove the end of progressive hopes.1

This commitment to inequality and oppression for their own sake, and the extremes to which an elite fearful of its position might go to resist a movement toward a more egalitarian order of things; the recognition of how eternal emergency can be an excuse for such a state--these were arguably Orwell's greatest insights, and warnings we have ignored at our very great peril.2

1. Chris Hedges wrote about this in The Death of the Liberal Class. John K. Galbraith wrote about his own experience of this in his A Journey Through Economic Time: A Firsthand View.
2. In the closing essay of his collection On History, Eric Hobsbawm reflects on the sheer bloodiness and brutality of the twentieth century and suggests that it was as horrible as it was because it was, at bottom, a "century of counterrevolution," and no one can be more vicious than the privileged when they feel that their selfishness is threatened.

Friday, October 26, 2018

Review: On History, by Eric Hobsbawm

New York: New Press, 1997, pp. 305.

As the title of his book implies, historian Eric Hobsbawm is here writing less of history than of historiography. Academic as this subject may sound, however, this is not a collection of minutely academic articles for ultra-specialists. Over half of the pieces (eleven) are lectures given not in the classroom but at special events at various institutions around the world; two more are conference papers; and two of the pieces originally written for publication were book reviews in non-academic forums--the Times Literary Supplement and the New York Review of Books. It is the case, too, that Hobsbawm is an unfashionably staunch, forceful and persuasive defender of reason and the Enlightenment, and of the value of history and historians as well.

Consequently, despite the methodological emphasis of the book, and the fact that most of his pieces raise more questions than answers, he does not retreat into quasi-metaphysical abstraction, but keeps close to actual practice, with much to say about the investigation of specific historical problems, from social history, history-from-below and urban studies to the historiography of the Russian Revolution. His two-part "Historians and Economists," despite his protestations of being out of his depth in the subject at the start, presents an extremely well-informed and incisive critical intellectual history of the economics profession, and especially its longtime failure at, and even disinterest in, grappling with a historical reality that time and again demonstrates the utter worthlessness of their models and their apologia (fitting them, as he remarks, for the "dog-collar of the (lay) theologian" (106)).

Of course, more than two decades have passed since the publication of this collection, and even at the time many of the pieces were already a generation old. Still, if in places they show their age by treating old phenomena as if they were new (as when he writes of the arrival of Marxism in the Academy, Walt Whitman Rostow's modernization theory, cliometrics, and the break-up of Yugoslavia), they retain their interest because what is past is not past, and in some cases more present than ever--and often, more pervasive, more dominant, more damaging than before. Certainly this is the case with his critique of orthodox economists' pieties, but this goes, too, for what he has to say of other unhealthy tendencies of the late twentieth century, like postmodernism and identity politics. As he memorably writes regarding the latter,
few relativists have the full courage of their convictions, at least when it comes to deciding such questions as whether Hitler's Holocaust took place or not. However, in any case, relativism will not do in history any more than in law courts. Whether the accused in a murder trial is or is not guilty depends on the assessment of old-fashioned positivist evidence . . . Any innocent readers who find themselves in the dock will do well to appeal to it. It is the lawyers for the guilty ones who fall back on postmodern lines of defence (viii).
Indeed, reading Hobsbawm I find myself remembering another book by another great of the past century who "writes of the Enlightenment without a sneer," C. Wright Mills' The Sociological Imagination. Mills was a sociologist who called on social scientists (or "social students?") to be more historically minded, while here Hobsbawm as a historian made a not dissimilar case. Both were entirely right, and that they have been so little heeded has impoverished the lines of inquiry with which they were concerned, and our collective understanding of our own world.

Saturday, September 8, 2018

Shelley, Verne and Wells--especially Wells

When we recount the history of science fiction, it is common to point to some early figure as its founder--for instance, Mary Shelley, Jules Verne or H.G. Wells.

The emphasis on identifying a single, founding work by a single author strikes me as unsatisfying for numerous reasons, only one of which I want to go into here right now--that it makes much more sense to credit them with each defining a crucial strand of science fiction, already in place before it coalesced into a genre (in the '20s, on the watch of Hugo Gernsback).

Note that I write defining here, not founding, because it would be excessive to claim that they did something that had never been done before to any degree. Rather they did what others might have done before in a certain way, got noticed for it, and became a model for those who followed in the process.

The Gothic writer Shelley defined the science-based horror story--where the scientific endeavor goes very, very bad. Verne defined the science-based adventure--where a discovery or invention sets us up for a thrill ride to the center of the Earth or the moon or for twenty thousand leagues under the sea which might teach us something along the way, most likely in the way of scientific facts. And Wells defined the science-based story not about some hero or anti-hero, but where, to borrow Isaac Asimov's phrase, humanity as a whole is the protagonist; the story which uses science to think about society, and considers how science, by way of ideas and technology, might change it; the story about a future that is clearly not the same as today, broadly, deeply and densely imagined; the science fiction story which can criticize things as they are, and bear the hope of progress.

Of these the last perhaps accounts for a smaller part of the genre's overall output than the other, older strands. Certainly it is less present in the more popular work than the influences of Shelley or Verne. (Horror stories, adventure stories, are the easier sell.) Still, it is Wells' strand that seems to me to not only have been far and away the most intellectually interesting, but to have given science fiction a cultural role beyond mere entertainment, or in some general way getting people excited about science; to have enabled science fiction to really matter. And it seems to me that the vigor of that particular tradition is, more than any other factor, the determinant of the health of the genre as a whole.

Are Books Too Long These Days?

Are books too long these days?

I will say up front that many of the novels that have most impressed, most affected, most influenced me were thousand-pagers. Fyodor Dostoyevsky's The Karamazov Brothers, for example. (I can't imagine Tales From the Singularity without that one.) Or Anthony Trollope's The Way We Live Now. (Which is still in a lot of ways The Way We Live Now in the twenty-first century.) Or Theodore Dreiser's An American Tragedy. (Has anything equally ambitious, sweeping, worthwhile been written about American life since?)

And reading my way through the classics, I encountered a good many that don't have a membership in that pantheon, but where I could appreciate what they were going for, and that trying to do it took half a million words (as Victor Hugo did in his national epic of France, Les Miserables, and Leo Tolstoy did in War and Peace).

Still, not every book needs to be so long as that. Not every story requires so much sheer mass. Most are better off without it. And in general I think the books that most of even that small minority which actually reads tends to read--the romances and thrillers and romantic thrillers--are ill-served by the demand for doorstops. What might be a brisk entertainment instead ends up bloated and slow, and often pretentious, and I find myself nostalgic for the quick and dirty writing of a half century ago, and the still older pulps. Reading Dirk Pitt at the series' best was a lot of fun, but there is a lot to be said for those who came before him, not least that other Clark-from-New-York-with-a-Fortress-of-Solitude, Doc Savage.

The Summer 2018 Box Office: Solo and The Meg

I haven't done an overview of the summer box office in quite a while, in part because it has become so damned repetitive, in its commercial successes--and perhaps even more consistently, its artistic failures. Every year Hollywood laments its earnings, every year the more critical critics decry the shallowness and sameness and staleness of it all, every year we hear promises that Hollywood will change, and every year, it demonstrates that not only has it not done so, but that its lack of memory is utter and total, the promise unremembered.

So I'll restrict my comments on the whole tedious thing to the two films I felt I actually had something to say about: Solo and The Meg.

After over three months of release in which it has had a long play in every major market (even Japan got it before the end of June), Solo remains short of the $400 million global mark, as I guessed it would after what, only in the context of the money poured into it, was regarded as a dismal opening weekend. What it will mean for the franchise remains very much a matter of rumor and speculation. But the shock seems undeniable.

By contrast The Meg proved that rarity, a movie that performs above expectations rather than below them, and that still greater rarity, the seemingly written-off dump-month release that proves a blockbuster. (The predictions were for $10-20 million in its opening weekend in North America, but it actually pulled in more than twice the high end of that range, $45 million.) A somewhat more modest production with much more modest expectations, The Meg has already outgrossed Solo globally (the China market has helped a lot), and if the figures discussed earlier are to be believed (a $400 million break-even point, thanks to the advantages it enjoys as a Chinese coproduction in that market), it is already well into the black, and still raking it in (in its fourth weekend of U.S. release, it was still at #2).

A follow-up is not a sure thing (given the nine-figure budget and equally hefty promotional bills, the hefty competition, and the terms of The Meg's success as a bit of goofy fun, the margins are not exactly vast), but it still looks quite likely. We might even see other Steve Alten works finding their way to the big screen as a result, in what looks like at least a partial redemption of the hopes dashed way back when we first heard of the project.

From the standpoint of the business The Meg's success falls far short of balancing out Solo's underperformance, and all it represents (the ultimate in the Hollywood franchise mentality, by way of the franchise that did more than any other to establish the blockbuster as we know it). Still, looking at the two trajectories together, I suppose there's a certain symmetry in them.

Friday, September 7, 2018

Ian Watt, Irony and Criticism in Our Own Time

For me one of the most memorable aspects of Ian Watt's The Rise of the Novel (reviewed here) is his discussion of the proneness of critics to read irony where there is often actually none--which specifically cited Samuel Taylor Coleridge's and Virginia Woolf's readings of Daniel Defoe.

Defoe, in Watt's view, had no such attitude toward the apparent contradictions in his characters' behavior--their combination of relentless money-grubbing and relentless, verbose declarations of their piety, for example. This was not only because such things did not look as ridiculously hypocritical to Defoe and his contemporaries as they do to people of our own time, but because a display of such irony required a level of technical mastery in this kind of "realist" writing that eighteenth century novelists had yet to achieve. Indeed, Defoe's sloppiness as a writer is something Watt discusses at quite some length, replying to Woolf's declaration that Defoe subdued "every element to his design" with the opinion that there is no
design whatsoever in the usual sense of the term . . . such an interpretation [is] really a kind of indirect critical repayment for the feeling of superiority which Defoe enables us to derive from his humble and unconsidered prose, a feeling of superiority which enables us to convert the extreme cases of his narrative gaucherie into irony . . .
There is far, far too much such "conversion of narrative gaucherie into irony" today--more than in his time, more perhaps than in any other time in history--with at least some of it coming from people who ought to know better. (I hesitate to name names, but one recent critic whom I hold in a good deal of regard reviewed Suzanne Collins' The Hunger Games in exactly this way, and then admitted that it was his way of spicing up the boring task of reviewing a book with such artless world-building--essentially, an admission that he wasn't doing his job of reviewing the book at all.) After all, in our postmodern day, the most inane subjective reaction can be held up as profound insight, and "irony for irony's sake" might be the critical slogan--irony for irony's sake because they cannot resist that "worst form of snobbery," because there is no better barrier to really thinking about anything than blowing it off in this pseudo-literate person's equivalent of the eternal "Whatever!"

Ian Watt and Shakespeare

Ian Watt's Rise of the Novel was, as discussed here, a study of eighteenth century literature. Still, that outstanding piece of literary analysis, history and sociology was comprehensive enough to have much to say about other subjects--not least, the works of William Shakespeare.

His remarks about the Bard were, of course, offhand. Still, in noting that Shakespeare was very much a Medieval rather than a modern; that such writers dealt in universal types rather than specific individuals; that they had their eye on abstractions rather than concrete facts; that they were prone to be loose in handling the flow of time or cause-and-effect relationships; and that in describing it all they were inclined to prettily decorate rather than rigorously denote and describe; he strikes me as having summed up a very large part of the challenge that reading Shakespeare presents to a twenty-first century reader, a challenge they tend to fail.

Thus we read Julius Caesar and find, instead of a historical drama about ancient Rome as we would understand the term, just the dilemma of Brutus. Thus we read Hamlet--and feel that he's endlessly dithering, which becomes ammunition for pompous lectures on the character's lack of decisiveness. Or we don't find those things, because we don't really have enough of a handle on what's going on to have those reactions. We just read them because we're supposed to, without worrying about whether we "get it" or not, and then, if the statistics are accurate, after completing the obligatory school requirement (under the eye of a teacher who might not get all this themselves; they're probably taking all this on authority just as much as the student), most of us probably don't read much of anything ever again.

Is there anyone else who thinks this isn't how it's supposed to go?

Of Character and the Larger Scene: A Note on Ian Watt's The Rise of the Novel

Recently reading Ian Watt's classic of literary criticism, The Rise of the Novel, one of the book's more compelling aspects seemed to me to be his attentiveness to the differing emphases novels can have--what I tend to think of as the Henry James-like emphasis on character and the "play of individualities" as the cornerstone of good writing, and an H.G. Wells-like stress on the larger social scene. Where the "highbrows" are concerned, James carried the day.

As Watt makes clear, however, both approaches were strongly represented among those foundational writers of the eighteenth century English novel. Watt identified Samuel Richardson with the stress on character (and the domestic themes to which such writing inclines), Henry Fielding with society. Consequently, while they share comparable status as founders of the English novel (indeed, in that long-ago eighteenth century lit course I had, we read both Richardson's Pamela and Fielding's Tom Jones), it would seem that James' victory over Wells was also Richardson's over Fielding.

The Way We Live Now by Anthony Trollope: A Second Note

As Anthony Trollope's great satire of late Victorian society opens, we are looking at Lady Carbury, who is in the midst of preparing for the release of her book Criminal Queens, soliciting what she hopes will be favorable reviews from the major London newspapers.

A writer anxiously soliciting reviews in the hope that they will make her book a success!

Alas, all the advances in technology since that time when railroads were the stuff of tech bubbles have not spared writers the burdens and annoyances and headaches and embarrassments and nerves of publicity-seeking, as every self-published author knows only too well.

The Way We Live Now by Anthony Trollope: A Note

Anthony Trollope's The Way We Live Now is his classic satire of late Victorian society, one which bears more resemblance to our own than most of us appreciate. It is something of a truism that railroads were the dot-coms of Trollope's time, but reading the novel, centered on a massive financial scandal involving such a steampunk dot-com, shows in dramatic fashion just how much this was so. ("The object of Fisker, Montague, and Montague was not to make a railway to Vera Cruz, but to float a company.")

Trollope's take on it all has real bite, one reason why critics in his day were unappreciative of the book, and why they might be similarly unappreciative in ours (as a cursory look at a number of lists of nineteenth century classics has suggested to me). Still, it seems to have enjoyed a bit of an upsurge in popularity in recent years, because of its relevance--and, I suppose, also because of more recent writers' failings. Among their many disservices, Modernism, postmodernism and the rest have rendered today's "serious" literature too toothless to properly write such an epic in our own time, and so for satire we can hardly do better than look to a tale of comparable doings in a time long past.
