Sunday, December 23, 2018

Review: Bullshit Jobs: A Theory, by David Graeber

New York: Simon & Schuster, 2018, pp. 368.

Back in 2013 David Graeber penned an article for Strike! Magazine regarding what seemed to him an explosion of pointless, economically irrational employment--jobs he colorfully termed "Bullshit Jobs."

In an extraordinarily rare case of an item with actual intellectual content going viral, it became something of an international sensation, leading to a broader research project and a book-length treatment of the issue five years later. The book, like his earlier Debt, presents a large, intricate and conventional wisdom-smashing argument, but one that keeps the densely written and documented macrohistory to a minimum, instead making its case in more sprightly fashion through anecdotal stories of present-day individuals. In particular he relies on his collection of employees' self-reports about their work being "bullshit," defining the bullshit job as
a form of paid employment that is so completely pointless, unnecessary, or pernicious that even the employee cannot justify its existence even though, as part of the conditions of employment, the employee feels obliged to pretend that this is not the case (9-10).
In line with this emphasis on the subjective, Graeber is also more concerned with the "moral and spiritual damage" done by the situation; the reasons why this is happening; and what might be done about it.

As one might guess, bullshit jobs consist substantially of administrative and bureaucratic functions of questionable value, especially at the middle management level. What may be more surprising to adherents of the "conventional wisdom" is that these are every bit as present in the private sector as in the public (in case it needed to be done, Graeber once more debunks the rightist/libertarian stereotype of bureaucracy as unique to government). Also surprising to those of the conventional turn of mind is the extent to which this has been bound up with the growth of those sectors that cheerleaders for neoliberalism endlessly exalt, and which have tended to shape other sectors in their turn--namely finance and information technology, whose ways have penetrated such non-profit institutions as the university. Thus everywhere one looks one sees an abundance of "flunkies" (whose job is to make the boss "look important," as a receptionist often does); "goons" (whose essential task is to enable their employers' exploitation of the public, and who often exist only because their competitors have them, as is often the case with telemarketers, lobbyists, corporate lawyers and public relations specialists); "duct tapers" (whose job consists of patching over quite fixable problems the bosses are too lazy, incompetent or short term-minded to resolve properly, as is often the case in computer programming); "box tickers" (whose task is to "allow an organization to claim it is doing something it is in fact not doing," like going through the motions on due diligence); and "taskmasters" (who invent additional unnecessary work for all of the unfortunate underlings named above).

Graeber also notes the ways in which "real" jobs, as opposed to the "bullshit" kind, may be infiltrated by, or even be an outgrowth of, bullshit. As an example Graeber presents members of the teaching profession, who perform an essential function but find their time increasingly taken up by administrative tasks; or researchers and their institutions, which devote increasing amounts of time to chasing grants (ironically, squandering billions that could be used to fund a vast amount of research).1 Janitors likewise remain essential, but janitorial work may be classifiable as a bullshit task when the workplace they happen to clean serves only a bullshit purpose--such as a PR firm. Given polling data indicating that as many as 40 percent of employees feel their jobs fall into the category he describes, and given the other jobs which increasingly consist of bullshit activities or service such activities in some respect or other, Graeber estimates that 50 percent of all work done today qualifies as bullshit "in the broadest sense of the term."

Graeber contends that this situation, in which half the country and the developed world are giving their waking hours to bullshit jobs, entails enormous "spiritual" violence. The reason is that even if human beings are not naturals for the routine of intense and continuous toil from nine-to-five (or longer) demanded by their employers, they do have a significant need for purposeful (and even altruistic) activity. The make-work, the parody of work, that is a bullshit job denies them that, indeed harms them by inflicting on them the opposite, the anguish of pointless activity, while subjecting them to many of the worst features of employment--like the loss of personal freedom to the will of a boss who is often arbitrary and abusive, and more generally the alienations of which Karl Marx and C. Wright Mills wrote. Where the job pays relatively well, there is guilt and shame at the remuneration being unearned; and where the job makes the world actually a worse place, guilt and shame at having contributed to making it so. (Indeed, many of those whose testimonials Graeber recounts quit well-paying, prestigious "bullshit" jobs to take less remunerative but more meaningful work.)

The absurdity, and tragedy, of the situation appear all the greater for the contrast with, and ramifications for, those doing productive, non-bullshit jobs, whose conditions deteriorate and whose compensation declines (turning them into "shit" jobs, ones making a genuine contribution but unpleasant to do). Graeber explains this partly in terms of know-nothing, corporatized administrators subjecting those workers to speed-ups, pay cuts and the like (there being actual productivity to squeeze further here), and partly in terms of right-wing populists stoking public hostility to such groups as teachers, nurses, and even auto workers. Indeed, Graeber goes so far as to contend that an envy of those who are genuinely productive is part of the political game being played. ("You get to do something meaningful with your life, and you expect a middle class standard of living too? How dare you!")

How could this perverse situation have been permitted to come about in a system that may not be respectful of workers, but at least prides itself on efficiency? For Graeber an important part of the story is that, rather than mass technological unemployment being a fear for the future, it is something that has already happened to us, leaving us with a "real" unemployment rate of 50 to 60 percent. However, what happened was that the gap was "stopp[ed] . . . by adding dummy jobs that are effectively made up." Indeed, Graeber argues (as he did in his essay "Flying Cars and the Declining Rate of Profit") that post-1960s neoliberalism is not a project for maximizing economic efficiency and growth as it claims to be (objects it has signally failed to deliver), but for preserving the existing order--capitalist property and labor relations (an object at which it has been rather more successful).

Graeber admits that this Orwellian notion sounds like "conspiracy theory," but regards it rather as an emergent phenomenon, resulting from the failings of the system and from our misconceptions about what work is and what it ought to be. As he notes, the tendency toward large numbers of flunkies and duct tapers and box tickers, and taskmasters ordering them all about, has been a result of the way financialization (and IT) led to a more ruthless, more conflictual, more alienating workplace overseen by managers increasingly divorced from and ignorant of the actual productive task, and in greater need of surveilling their work force. (Indeed, it was this unfamiliarity of managers with actual work that led to the blooming of the "consulting" industry.)

However, Graeber devotes more of his attention to the deeply distorted perceptions of work we carry around in our heads. The most basic would seem to be the identification of work with the "production" of material goods--when, he argues, most work has not been about production, but rather about "caring" for people (child-rearing, housework, education, medical services, etc.) and things (maintenance and repair)--a reality that has been overlooked. (As the above list demonstrates, caring work has largely been women's work, and so less valued--and indeed, Graeber credits leftist feminists with developing the concept.)

Hardly less important is the tendency, already emergent in Medieval North European culture, and reinforced by the "Puritan" ethos (and capitalism, of course), to equate work with "alienated," mind-and-body destroying labor done for something other than the work's intrinsic value to the doer, under the supervision of others, all day, every day--and indeed, with the misery normally attendant on the situation.2 All other activity is "idleness." Work is deemed virtuous, idleness sinful, so that being miserable in one's work marks one as "deserving," while someone who is not working, or not made miserable by their work, is "undeserving" even of the means to live.

Consequently, as production became more automated, caring work was not acknowledged as an alternative locus for labor; and there was a timidity about or hostility to changes in the social system that would reduce workloads (a shorter work week) or detach income from work, productive, caring or otherwise (through income floors, for example). Public well-being was instead identified with the maintenance of high levels of employment, particularly "full-time" employment, to the extent that this could be invoked as an excuse for wasteful or destructive social policies, Graeber citing the way in which the preservation of millions of jobs directly connected with the private health care bureaucracy has been given as a justification for not shifting to a more efficient alternative--not least by President Barack Obama himself.3 (Indeed, social safety nets themselves drove the growth of bullshit employment, through a massive bureaucracy devoted substantially to making the lives of qualified applicants seeking benefits more difficult through means testing and refusals.)

As a result, despite his aversion to the demand that social critics serve up policy solutions as the price of opening their mouths (which he observes is a way of distorting or suppressing dissent), Graeber argues for a Universal Basic Income as a solution to the problem, on several grounds. Such an income, he contends, would affirm the right of human beings to live, and, by delinking income from labor, help us move past our perverse understanding of work. It would also eliminate the problem of means-testing and the associated welfare state bureaucracy. (The problem of how those made redundant by the change are to get on is, of course, resolved by their getting such an income too.) It would eliminate, too, people's incentive to do bullshit jobs (a quick way of forcing employers to do without positions that are often unnecessary and wasteful even for their own firms), and more generally enable employees to walk away from a raw deal, likely improving the pay and conditions of work for all. This would be all the more significant, as he expects that the great majority of people will still be working in one way or another, at tasks that may well prove more beneficial not just to themselves but to society and the world as a whole (precisely because so much of the work being done was bullshit, and much potential creativity and ingenuity wasted in the process). Graeber also contends that this could contribute to resolving other major problems, noting that experiments with such income policies in the past (in India, for example) also reduced the social distinctions, prejudices and ills to which the "rat race" and all the rest contribute so much; while pointing out that perhaps the single best thing we could do for the environment is to scale back our wasteful working.

In discussing what Graeber has to say it seems only fair to admit that I have long been convinced of the truth of what many will regard as his most radical claims--the immense wastefulness of contemporary capitalism, and that technological unemployment has already happened (I picked up Jeremy Rifkin's The End of Work back when it was still current). Perhaps because of that I had reservations about just how much he stuck to his emphasis on subjective perceptions, to the exclusion of other examinations of the issue proffered by heterodox economists under the heading of "waste" for over a century (generally, in more objective, materialist ways)--matters like built-in obsolescence, or marketing efforts like advertising and "product differentiation." In fact it seemed to me that engaging that literature could have strengthened his argument considerably, and perhaps helped him avoid what seemed to me certain exaggerations that came from his reliance on a narrower basis for approaching the issue than he might have used--like going too far in downplaying productive labor in his (justified) correction of the tendency to overlook caring labor, or trying too hard to explain the widespread hostility to particular occupational groups in terms of his theory when others have at least as much to offer. (While he seems to me entirely right to say that the public begrudges artists a living because they seem to be avoiding their share of the common misery, it is also the case that artists are not generally respected as "useful" people; while Galbraith's concept of "convenient social virtue" seems to me to account for much of the disrespect constantly shown members of the teaching and nursing professions.4)

Still, these are comparative quibbles with a case that is as compelling as it is unconventional. Materialist, hard data-and-statistics grubber that I am, I consistently found his argumentation regarding not just the existence of the problem, but its discoverability through his particular approach, persuasive, with the same going for his treatment of its causes and implications. At the same time my interest in those other approaches was, on the whole, not a matter of skepticism but of my finding his argument so compelling that I could not but think of the possibilities for a synthesis of Graeber's argument with that longstanding tradition of writing on economic waste. (What, for instance, would it take to produce a pie chart showing the share of bullshit in Gross World Product? How about time series of GDP and GWP presenting figures adjusted and unadjusted for the bullshit component side by side?) It strikes me, too, that if Graeber is less concerned with the macroeconomics of the solutions he recommends than he might have been, his case for a guaranteed minimum income is one of the more robust I have seen in recent years. Indeed, in a market full of books which are really just extensions of banal magazine articles (especially when we look at those produced by major commercial publishers for a general audience), this book, like Graeber's earlier Debt, is a diamond in the rough.

1. See Richard Harris' Rigor Mortis (reviewed here) for an in-depth discussion of this in the medical sector.
2. Graeber cites Alfred Marshall's definition of labor in Principles of Economics as work done "with a view to some good other than the pleasure derived from the work."
3. I was dismayed to see James K. Galbraith make a similar case in his otherwise excellent The Predator State.
4. As he put it in Economics and the Public Purpose, this "ascribes merit to any pattern of behaviour, however uncomfortable or unnatural for the individual involved, that serves the comfort or well-being of, or is otherwise advantageous for, the more powerful members of the community. The moral commendation of the community for convenient and therefore virtuous behaviour then serves as a substitute for pecuniary compensation. Inconvenient behaviour becomes deviant behaviour and is subject to the righteous disapproval or sanction of the community." It seems noteworthy that Graeber has over the years cited the works of the elder Galbraith a number of times, but also that while four of Galbraith's mid-century classics are listed in his bibliography, including American Capitalism, The Affluent Society and The New Industrial State, Economics and the Public Purpose (the work that capped that line of research) is not.

"Are the Arts Just For the Rich?"

Watching Patrick Stewart talk about his childhood in William Shatner's documentary The Captains' Close-Up, I was surprised by his working-class Yorkshire upbringing. And afterward I couldn't help thinking that where the current generation of British actors is concerned, none seem to have a background remotely like that. Look them up, and you are far more likely to find a graduate of Eton or an alumnus of the Dragon School.

Naturally it was with interest that I read a series of pieces in The Guardian on the phenomenon--some dealing with actors specifically, others with the arts more generally. The pieces were far from as incisive as they could have been--not least in their use of the term "middle class." A famously fuzzy term, it can be used to deliberately cloud the issue--frankly, to downplay the extent to which a person from an extremely privileged background actually comes from an extremely privileged background--and it often is so used, not least by The Guardian when discussing artists. Consider this extract from its typical puff-piece profile of Killing Eve creator Phoebe Waller-Bridge:
Waller-Bridge grew up in an upper-middle-class family with baronets on both sides. Her father co-founded Tradepoint, the first fully electronic stock market . . . and her mother works for the Ironmongers' Company in the City.
On what planet is someone with "baronets on both sides" of the family, someone whose father cofounded a company with its own Wikipedia page (Tradepoint, later SWX Europe), merely "upper-middle-class"?

Even in a culture so mendaciously pretending that "we're all middle class now," this is too, too much.

All the same, if the term is used in a way so loose as to be obfuscating (at best), the pieces offer a robust round-up of a sizable body of anecdotal evidence (Helen Mirren, Judi Dench and other luminaries observing that their chances would have been very different today), and of a growing body of solid statistical study, confirming what David Graeber (writing of America rather than Britain, though there is no reason to think it any less applicable there) remarked when looking at the social and class realities of intellectual and creative life: that "if your aim is to pursue any sort of value [but money]--whether that be truth . . . beauty . . . justice . . . charity, and so forth . . . [and] be paid a living wage for it"--those "jobs where one can live well, and still feel one is serving some higher purpose"--only those with "a certain degree of family wealth, social networks, and cultural capital" have much of a shot, while for the rest "there's simply no way in" (253).

In spite of which they are typically told by the "successful," and sometimes simply by "everyone," that their failures are entirely their own fault, that they simply were not "good enough."

Which bullshit--the philosopher Harry Frankfurt's term certainly seems applicable here--is the conventional wisdom in an era that has never had much patience with artists' claims to support, or even sympathy.

The attitude shows up the illusion that the contemporary world is a meritocracy, makes a mockery of its slogans about opportunity, and diminishes not only the lives of those cut off from such careers solely for having made the mistake of not being born rich and well-connected, but the arts themselves, and through them, society as a whole.

Bullshit Jobs and the New Hollywood

What one might call the "myth" of the "New Hollywood" of the 1960s and 1970s is that the artists' hubris led to the alienation of key partners, allies and sponsors, to personal antagonisms, and to artistic and commercial flops that destroyed their careers; that the limited appetite for artistically daring and politically radical content had been exhausted, as risque content became mainstreamed and as the politics of the country moved right, eliminating both inspiration and audience; and that the rush to imitate the success of Steven Spielberg's Jaws and George Lucas' Star Wars finished the job.

However, reading the relevant history I found myself struck by the changing business conditions above all. The New Hollywood happened in a period of sharp changes in media and entertainment (above all, the rise of TV), but also an interregnum between business models--the studio system was dying, but the hyper-financialized multinational multimedia corporation with its mission of barraging global audiences with "high concept" content had not yet established itself. The real, significant but still very limited margin of freedom that the New Hollywood artists enjoyed in between was inconceivable except in that interregnum, and came to an end with its close.

Interestingly, it is one of the many smaller subjects David Graeber takes up in Bullshit Jobs, where his emphasis is on the aftermath of that closure, described as "a corporatization far more stifling than anything that had come before" (186). In the book's discussion the key element is the complication of the process of "development," which has seen vast numbers of executives having to successively sign off on a project while getting in their two cents--a case of too many cooks spoiling the broth, the more so because none of them know a thing about cooking, and are often just trying to justify their existences (because, by and large, they know nothing of film, and by any reasonable measure epitomize "bullshit" in the sense Graeber writes of it).

As it happens, contemporary Hollywood comes up a second time in the book, this time in regard to its closed nature. As Graeber remarks,
Look at a list of the lead actors of a major motion picture nowadays and you are likely to find barely a single one that can't boast at least two generations of Hollywood actors, writers, producers, and directors in their family tree. The film industry has come to be dominated by an in-marrying caste (252).
The combination of the industry's high-concept, global-market imperative with a production process that looks like a recipe for incoherence or worse, and with the bizarre situation in which the film industry is dominated by a closed "aristocracy" (his term), does not seem at all irrelevant to the combination of artlessness and extraordinary divorce from lived reality of what passes for "cinema" in our time.

Sunday, December 16, 2018

Are They Trolling Me? A Test

I have already written a blog post about spotting trolls, but thought I'd present some of those thoughts again in a somewhat more systematized, handier fashion--in the form of a test to which people can reasonably put an interlocutor's statements. Admittedly it is based on limited personal experience, and I don't claim that it is scientific or pretend that it is foolproof, but I find the essentials handy, think others might find them handy too, and hope that feedback from others might help develop this into something better. So here it goes--a set of questions with differing weights.

1. Did they butt into a conversation already ongoing between other people? (10 points)

2. If they butted in, was it into the conversation of people who clearly had a very different attitude toward the subject matter? (For example, a #MAGA butting into a conversation between two #Resist in a thread started by another #Resist account's Retweeting an item from a progressive media outlet?) (10 points)

3. Are they just using this as an occasion to inflict talking points on people they seem well aware have no interest in hearing them (rather than engaging others in a manner indicative of genuine interest in their thoughts and opinions)? (20 points)

4. Are they demanding that other people provide lavish defenses for their opinions? (10 points)

5. Are they trying to press the other person into agreeing with them? For example, do they keep repeating their talking points in a manner suggestive of pressuring them, or asking the same questions over and over again after the object of their questioning has given them all the answer they have to offer? (20 points)

6. Do they use insults or make personal attacks? (30 points)

Bonus Question: Does their profile flaunt a troll's attitude? Trolls are often proud of their activity, such that I have seen some declare that they are trolls in their Twitter profile, while others, only slightly less subtle, boast in transgressive ways about their mean-spiritedness and sadism to shock, offend or intimidate, in their imagery as well as in their words. (This can include reference to or glorification of violent acts in their handles and profile pictures; it may also include the celebration of much maligned figures or evocations of their ideology as studied provocations.) (40 points--and just 40 points because even a troll might not be a troll all the time.)

Ask each of these questions in regard to the suspected troll, and add up the points warranted by each "Yes." Someone who scores a 30 is suspect, someone who scores a 40 is likely a troll, and someone who scores a 50 or more probably is one.

The first six questions allow for a score of 100, while adding in the bonus question permits a maximum score of 140.
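For anyone who wants to see the tallying made explicit, here is a minimal sketch in Python of the scoring just described, assuming only the weights and thresholds given in this post; the question keys, function names and example answers are hypothetical, purely for illustration.

```python
# A minimal sketch of the scoring test above -- weights and thresholds
# are taken from this post; the question keys, function names, and
# example answers are hypothetical, purely for illustration.

WEIGHTS = {
    "butted_in": 10,               # Q1: butted into an ongoing conversation
    "hostile_audience": 10,        # Q2: butted in among people of a clearly different attitude
    "talking_points": 20,          # Q3: inflicting talking points rather than engaging
    "demands_lavish_defense": 10,  # Q4: demanding lavish defenses of others' opinions
    "pressuring": 20,              # Q5: pressing others to agree, repeating the same questions
    "insults": 30,                 # Q6: insults or personal attacks
    "troll_profile": 40,           # Bonus: profile flaunts a troll's attitude
}

def troll_score(answers):
    """Add up the weight of every question answered 'yes'."""
    return sum(WEIGHTS[q] for q, yes in answers.items() if yes)

def verdict(score):
    """Map a score to the interpretation given above."""
    if score >= 50:
        return "probably a troll"
    if score >= 40:
        return "likely a troll"
    if score >= 30:
        return "suspect"
    return "no verdict"

# Example: butted in, pushed talking points, and hurled an insult.
answers = {q: False for q in WEIGHTS}
answers.update(butted_in=True, talking_points=True, insults=True)
score = troll_score(answers)        # 10 + 20 + 30 = 60
print(score, "->", verdict(score))  # 60 -> probably a troll
```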

As that suggests, trolls, by nature unsubtle, tend to give themselves away very early on in the game. I think it best to take the hint, and suggest you do the same. Do not fall for the lie that you are being a thin-skinned "snowflake" or intolerant of "free speech." Accusing people of being thin-skinned simply for not accepting abuse is what a bully does. ("What, can't take a joke? Don't you have any sense of humor? You're no fun.") And respect for freedom of speech does not mean that you are personally required to spend every waking moment of the rest of your life being an audience for people who disagree with you--and still less for people who are promoting ideas that are genuinely loathsome. (I'm making a value judgment here, and not apologizing for it.) Feel free to cut a troll off as soon as you have become convinced of their purpose. Mute, block and REPORT as seems appropriate to you.

Sunday, December 9, 2018

What Makes a Troll a Troll?

We all know that the Internet, and especially social media, are inundated with trolls--pathetic, repugnant, dark triad-afflicted losers (narcissistic, Machiavellian, psychopathic) whose idea of a good time is ruining both for everyone else.

Many will acknowledge that it is a bad idea to respond to a troll. They are uninterested in anything you have to say, only in distracting you from what you were doing, knocking the conversation off track, making you waste your time and effort dealing with them, and causing whatever harm they can. This may be so much the case that they will say things they don't mean simply to satisfy their desire to disrupt and to hurt--but much, maybe most, of the time they believe them, if not always wholeheartedly.

But how does one tell the difference between trolling and an opinion one simply does not like?

I find that trolls tend to butt into ongoing conversations among other people--usually, people of quite a different sensibility than themselves. (To choose an admittedly non-neutral example, one might find, for example, a few people who might all be considered left-of-center responding to an item to which such persons might be expected to be more attentive than their counterparts on the right when a right-winger suddenly turns up.)

I also find that when they do butt in they do one of three things:

1. Fling insult and abuse. (All they can do is call the participants in the conversation stupid--which often betrays that this is exactly what they are.)

2. Rub their opinions in the faces of people they expect to be repulsed by those opinions, rather than try to actually discuss anything. It is a little harder to be sure of this than, for example, insult. Still, there are giveaways. Such opinions are typically canned, often by someone else (they don't actually do much thinking for themselves, or they'd have better things to do than this), and commonly irrelevant to the conversation into which they have entered. (I recall a thread where people discussed the President's reply to the Thanksgiving Day question "What are you thankful for?"--and only that--and someone felt the need to inject the claim that "Socialism killed 100 million people.") When others react, they commonly display satisfaction, perhaps repeating the action.

In some cases I have had the impression that they are like exhibitionists, deriving satisfaction from others' disgusted reactions. In others, they seem to delight in lobbing a grenade into a crowd of bystanders. In still others, they seem more purposeful--intent on diverting a discussion, for example. (I recall the comments thread of a news story about the Netherlands' police training eagles to catch drones, and seeing it from the start diverted into an attack on solar energy, with the hundreds of comments that followed caught up in the ensuing flame war. Alas, renewable energy and sane climate policies are very common troll targets.)

3. In the rare case that they are able to actually interact with others regarding the subject, they behave in very unreasonable, bullying fashion. They do not ask for an explanation of another person's opinion; they demand that they defend it, and raise the bar for such defense very high--asking for the equivalent of a doctoral dissertation, complete with footnotes, as the price of having opened their mouth. Even after the person has given such explanation as they have to offer, the questioner refuses to accept that the other person has said their piece and keeps coming at them, as if intent on making them recant, on converting them to their cause. (I have, unfortunately, found myself having such talks with proponents of atomic energy who want me to "admit" that renewable energy can never be the foundation of our energy base.)

Can normally decent people find themselves acting in ways similar to this? It's not impossible. Annoyance, a foul mood, and they slip into some bad behavior. But I think one can go too far with the "Everyone can be a troll" line, and certainly anyone who has doubts about a particular interlocutor can, on Twitter at least, just look at their account, see the way the suspected troll has chosen to present themselves to the online world, see the things they choose to post and share.

Some I have seen proudly and not at all ironically write the word "Troll" in their bio. And I find it best to take them at their word.

When I run into someone fitting the profile described here, I don't mute the conversation, I block them. Permanently.

Yesterday, one of my comments proved to be pure troll-bait, alas, inciting dozens of attacks from people who responded in exactly these ways. I have blocked each and every one, and at the time of this writing, find myself continuing to block them--setting a one-day record, which is, I suppose, why I buckled down and wrote up this post.

I strongly urge everyone else to similarly block those who conduct themselves in such a manner. Trolls, especially as defined here, may have the right to speak, but you have no obligation to listen to them, let alone answer them. They have no right to take your time and attention, let alone inflict harm upon your mental well-being.

Do not let them do it.

Wednesday, October 31, 2018

Yes, Tax Breaks ARE Subsidies

Many have encountered the claim, typically made by right-wing and especially libertarian commentators, that tax breaks are somehow not subsidies.

As is usually the case when libertarians and other champions of the most hardline economic orthodoxy make such "educational" pronouncements, the assertion is, all at once, utterly contrary to any conventional understanding of economic reality; Olympian in its arrogance and contempt toward anyone who would presume differently; and opaque in its reasoning.

After all, Investopedia, no bastion of left-wing, communistic thought, defines a subsidy as:
a benefit given to an individual, business or institution, usually by the government. It is usually in the form of a cash payment or a tax reduction. The subsidy is typically given to remove some type of burden, and it is often considered to be in the overall interest of the public, given to promote a social good or an economic policy.
Note that "tax reduction" is explicitly included in the definition they provide.

Indeed, one does not find the essentials contested even in the explanation of what a subsidy is provided in the Mises Institute article that usually seems to be at or near the top of the list of search hits on Google when anyone uses it to research the topic, "No, Tax Breaks are Not Subsidies." While the article does assert an "economic and ethical" difference between a cash payment and a tax break (the difference between keeping what you have and being given something taken from someone else), this does not in and of itself make a tax break not a subsidy. In fact, the article acknowledges that "entrepreneurs who take advantage of tax breaks will incur fewer costs than entrepreneurs who don't," and that such breaks are indeed "beneficial to those who claim them"--which is totally in line with the idea that, in conferring advantage on some business at public cost (in this case, forgone tax income, which is shifted to other parties who accordingly bear a greater part of the tax burden, or the burden of forgoing such expenditures), a tax break has the same practical costs and benefits as a subsidy.

So where, one might wonder, is the argument?

The real objection of the piece's author to the term's use in this case appears later, when the article continues with the explanation that government is "not a wealth creator," but only a taker of others' incomes. Because of this, such seeming special advantages are really just reprieves from taxation on business, which the article compares in its various metaphors to a "life jacket in a sea of wealth redistribution," with capitalism only able "to breathe" through such loopholes. Indeed, the web site followed up this piece with another article titled "The Answer to 'Unfair' Tax Breaks is More Tax Breaks."

In short, the "tax breaks are not subsidies" argument assumes that governments have no business taxing business, so that any such taxation is illegitimate (rather than a We-the-people argument for "No taxation without representation," their stance is a pre-1789 French aristocrat's arrogant "No taxes on us, ever!"), and accordingly any respite from such taxation is only just, natural, appropriate, and not at all a show of special favor.

None of this is self-evident to most. And when explained to them, most will flatly (and justly) reject the premise of the argument.

Moreover, it will appear the case not only that a subsidy IS a subsidy IS a subsidy, but that, as is so often the case, the arrogance and opacity are mere cover for unbelievably shoddy reasoning in the name of the self-serving ideas to which persons and institutions of this type are so prone. Like the claim that one can "never run out of" a given resource, or that involuntary unemployment "cannot exist," an economist's insistence that "tax breaks are not subsidies" is simply a matter of market fundamentalist theologians posing as social scientists, in the way that orthodox economists have been doing for at least a century now; of the staff of right-wing think tanks and the like pretending to be public intellectuals interested in and explaining the actual world rather than PR hacks for the interests that pay their salaries, utilizing theories formed and held without interest in or regard for the real world because they conveniently legitimize their political positions; of the inmates in the asylum insisting that they are really sane while it is everyone else who is crazy.

Saturday, October 27, 2018

George Orwell's Nineteen Eighty-Four: A Note

Some time ago a poll in Britain identified George Orwell's Nineteen Eighty-Four as the book that people most often lie about having read.

Naturally the ways in which we hear the book discussed often seem to miss the point. In line with the prejudices fostered by Cold War orthodoxy, they also identify it with socialism--without which the book might not be nearly so widely cited and promoted as it is today.

Alas, the latter conveniently overlooks the fact that Orwell, while an anti-Stalinist, was still a socialist; and what he described in his book was not meant to be thought of as socialism but rather a pseudo-socialism intended to prevent the genuine kind from ever arriving. Put another way, it was intended to preserve the existence of inequality, a power elite, the mechanisms of oppression as ends in themselves. The key tool was the militarization of society--keeping it in a state of perpetual war.

I direct the reader--you know, the one actually reading the book rather than saying they did--to Chapter III of the book-within-a-book that is Emmanuel Goldstein's The Theory and Practice of Oligarchical Collectivism, "War is Peace." There, as Goldstein observes, the rising productivity of machine-equipped industrialized society was making it possible to eliminate "hunger, overwork, dirt, illiteracy, and disease" before very long, and indeed a "vision of a future society unbelievably rich, leisured, orderly, and efficient"--what we call the capital "F" Future of which the flying car is the symbol--"was part of the consciousness of nearly every literate person." Those in power saw that this would spell the end of any need or justification for "human inequality," and indeed "hierarchical society"--of distinction conferred by wealth, and of "POWER . . . remain[ing] in the hands of a small privileged caste"--and that this "privileged minority," now functionless, would be swept away by the well-fed, literate, thinking lower orders who no longer had use for them, and knew it.

Indeed, Goldstein declares that "[i]n the long run, a hierarchical society was only possible on a basis of poverty and ignorance." Deliberate restriction of economic output to prevent this proved politically unworkable (as the post-World War I stagnation and Great Depression demonstrated), but perpetual war afforded an option for "us[ing] up the products of the machine without raising the general standard of living," which indeed was the principal function of the eternal war amongst Oceania, Eurasia and Eastasia. After all, there is nothing like the cry "We're at war!" to make people stomach austerity, deprivation, repression; to make them think the stifling of dissent justified--with even liberal societies seeing war time and again prove the end of progressive hopes.1

This commitment to inequality and oppression for their own sake, and the extremes to which an elite fearful of its position might go to resist a movement toward a more egalitarian order of things; the recognition of how eternal emergency can be an excuse for such a state--these were arguably Orwell's greatest insights, and warnings we have ignored at our very great peril.2

1. Chris Hedges wrote about this in The Death of the Liberal Class. John K. Galbraith wrote about his own experience of this in his A Journey Through Economic Time: A Firsthand View.
2. In the closing essay of his collection On History, Eric Hobsbawm reflects on the sheer bloodiness and brutality of the twentieth century and suggests that it was as horrible as it was because it was, at bottom, a "century of counterrevolution," and no one can be more vicious than the privileged when they feel that their selfishness is threatened.

Friday, October 26, 2018

Review: On History, by Eric Hobsbawm

New York: New Press, 1997, pp. 305.

As the title of his book implies, historian Eric Hobsbawm is here writing less of history than of historiography. Academic as this subject may sound, however, this is not a collection of minutely academic articles for ultra-specialists. Over half of the pieces (eleven) are lectures given not in the classroom but at special events at various institutions around the world; two more are conference papers; and two of the pieces originally written for publication were book reviews in non-academic forums--the Times Literary Supplement and the New York Review of Books. It is the case, too, that Hobsbawm is an unfashionably staunch, forceful and persuasive defender of reason and the Enlightenment, and of the value of history and historians as well.

Consequently, despite the methodological emphasis of the book, and the fact that most of his pieces raise more questions than answers, he does not retreat into quasi-metaphysical abstraction, but keeps close to actual practice, with much to say about the investigation of specific historical problems, from social history, history-from-below and urban studies to the historiography of the Russian Revolution. His two-part "Historians and Economists," despite his protestations of being out of his depth in the subject at the start, presents an extremely well-informed and incisive critical intellectual history of the economics profession, and especially its longtime failure at, and even disinterest in, grappling with a historical reality that time and again demonstrates the utter worthlessness of their models and their apologia (fitting them, as he remarks, for the "dog-collar of the (lay) theologian").

Of course, more than two decades have passed since the publication of this collection, and even at the time many of the pieces were already a generation old. Still, if in places they show their age by treating old phenomena as if they were new (as when he writes of the arrival of Marxism in the Academy, Walt Whitman Rostow's modernization theory, cliometrics, and the break-up of Yugoslavia), they retain their interest because what is past is not past, and in some cases more present than ever--and often, more pervasive, more dominant, more damaging than before. Certainly this is the case with his critique of orthodox economists' pieties, but this goes, too, for what he has to say of other unhealthy tendencies of the late twentieth century, like postmodernism and identity politics. As he memorably writes regarding the latter, "few relativists have the full courage of their convictions, at least when it comes to deciding such questions as whether Hitler's Holocaust took place or not," while at any rate, "relativism will not do in history any more than in law courts." There, "[w]hether the accused in a murder trial is or is not guilty depends on the assessment of old-fashioned positivist evidence," and any of his readers who find themselves wrongly placed "in the dock will do well to appeal to it. It is the lawyers for the guilty ones who fall back on postmodern lines of defence."

Indeed, reading Hobsbawm I find myself remembering another book by another great of the past century who "writes of the Enlightenment without a sneer," C. Wright Mills' The Sociological Imagination. Mills was a sociologist who called on social scientists (or "social students?") to be more historically minded, while Hobsbawm, as a historian, here makes a not dissimilar case. Both were entirely right, and that they have been so little heeded has impoverished the lines of inquiry with which they were concerned, and our collective understanding of our own world.

Saturday, September 8, 2018

Shelley, Verne and Wells--especially Wells

When we recount the history of science fiction, it is common to point to some early figure as its founder--for instance, Mary Shelley, Jules Verne or H.G. Wells.

The emphasis on identifying a single, founding work by a single author strikes me as unsatisfying for numerous reasons, only one of which I want to go into here right now--that it makes much more sense to credit them with each defining a crucial strand of science fiction, already in place before it coalesced into a genre (in the '20s, on the watch of Hugo Gernsback).

Note that I write defining here, not founding, because it would be excessive to claim that they did something that had never been done before to any degree. Rather they did what others might have done before in a certain way, got noticed for it, and became a model for those who followed in the process.

The Gothic writer Shelley defined the science-based horror story--where the scientific endeavor goes very, very bad. Verne defined the science-based adventure--where a discovery or invention sets us up for a thrill ride to the center of the Earth or the moon or for twenty thousand leagues under the sea which might teach us something along the way, most likely in the way of scientific facts. And Wells defined the science-based story not about some hero or anti-hero, but where, to borrow Isaac Asimov's phrase, humanity as a whole is the protagonist; the story which uses science to think about society, and considers how science, by way of ideas and technology, might change it; the story about a future that is clearly not the same as today, broadly, deeply and densely imagined; the science fiction story which can criticize things as they are, and bear the hope of progress.

Of these the last perhaps accounts for a smaller part of the genre's overall output than the other, older strands. Certainly it is less present in the more popular work than the influences of Shelley or Verne. (Horror stories, adventure stories, are the easier sell.) Still, it is Wells' strand that seems to me to not only have been far and away the most intellectually interesting, but to have given science fiction a cultural role beyond mere entertainment, or in some general way getting people excited about science; to have enabled science fiction to really matter. And it seems to me that the vigor of that particular tradition is, more than any other factor, the determinant of the health of the genre as a whole.

Are Books Too Long These Days?

Are books too long these days?

I will say up front that many of the novels that have most impressed, most affected, most influenced me were thousand-pagers. Fyodor Dostoyevsky's The Karamazov Brothers, for example. (I can't imagine Tales From the Singularity without that one.) Or Anthony Trollope's The Way We Live Now. (Which is still in a lot of ways The Way We Live Now in the twenty-first century.) Or Theodore Dreiser's An American Tragedy. (Has anything equally ambitious, sweeping, worthwhile been written about American life since?)

And reading my way through the classics, I encountered a good many that don't have a membership in that pantheon, but where I could appreciate what they were going for, and that trying to do it took half a million words (as Victor Hugo did in his national epic of France, Les Miserables, and Leo Tolstoy did in War and Peace).

Still, not every book needs to be so long as that. Not every story requires so much sheer mass. Most are better off without it. And in general I think those books that even the small minority that actually reads tends to actually read--the romances and thrillers and romantic thrillers--are ill-served by the demand for doorstops. What might be a brisk entertainment instead ends up bloated and slow, and often pretentious, and I find myself nostalgic for the quick and dirty writing of a half century ago, and the still older pulps. Reading Dirk Pitt at the series' best was a lot of fun, but there is a lot to be said for those who came before him, not least that other Clark-from-New-York-with-a-Fortress-of-Solitude, Doc Savage.

The Summer 2018 Box Office: Solo and The Meg

I haven't done an overview of the summer box office in quite a while, in part because it has become so damned repetitive, in its commercial successes--and perhaps even more consistently, its artistic failures. Every year Hollywood laments its earnings, every year the more critical critics decry the shallowness and sameness and staleness of it all, every year we hear promises that Hollywood will change, and every year, it demonstrates that not only has it not done so, but that its lack of memory is utter and total, the promise unremembered.

So I'll restrict my comments on the whole tedious thing to the two films I felt I actually had something to say about: Solo and The Meg.

After over three months of release in which it has had long play in every major market (even Japan got it before the end of June), Solo remains short of the $400 million global mark, as I guessed it would be after what, only in the context of the money poured into it, was regarded as a dismal weekend. What it will mean for the franchise remains very much a matter of rumor and speculation. But the shock seems undeniable.

By contrast The Meg proved that rarity, a movie that performs above expectations rather than below them, and that still greater rarity, the seemingly written-off dump month release that proves a blockbuster. (The predictions were $10-20 million for its opening weekend in North America, but it actually pulled in more than twice the high end of the range, $45 million.) A somewhat more modest production with much more modest expectations, The Meg has already outgrossed Solo globally (the China market has helped a lot), and if the figures discussed earlier are to be believed (a $400 million break-even point, due to the advantages it enjoys as a Chinese coproduction in that market), it is already well into the black, and still raking it in (in its fourth weekend of U.S. release, still at #2).

A follow-up is not a sure thing (given the nine-figure budget and equally hefty promotional bills, the hefty competition and the terms of The Meg's success as a bit of goofy fun, the margins are not exactly vast), but it still looks quite likely. We might even see other Steve Alten works finding their way to the big screen as a result, in what looks like at least partial redemption of the hopes dashed way back when we first heard of the project.

From the standpoint of the business The Meg's success falls far short of balancing out Solo's underperformance, and all it represents (the ultimate in the Hollywood franchise mentality, by way of the franchise that did more than any other to establish the blockbuster as we know it). Still, looking at the two trajectories together, I suppose there's a certain symmetry in them.

Friday, September 7, 2018

Ian Watt, Irony and Criticism in Our Own Time

For me one of the most memorable aspects of Ian Watt's The Rise of the Novel (reviewed here) is his discussion of the proneness of critics to read irony where there is often actually none--which specifically cited Samuel Taylor Coleridge's and Virginia Woolf's readings of Daniel Defoe.

Defoe, in Watt's view, had no such ironic attitude toward the apparent contradictions in his characters' behavior--their combination of relentless money-grubbing and relentless, verbose declarations of their piety, for example. This was not only because such things did not look as ridiculously hypocritical to him and his contemporaries as they do to people of our own time, but because a display of such irony required a level of technical mastery in this kind of "realist" writing that eighteenth century novelists had yet to achieve. Indeed, Defoe's sloppiness as a writer is something Watt discusses at quite some length, replying to Woolf's declaration that Defoe subdued "every element to his design" with the opinion that there is no
design whatsoever in the usual sense of the term . . . such an interpretation really a kind of indirect critical repayment for the feeling of superiority which Defoe enables us to derive from his humble and unconsidered prose, a feeling of superiority which enables us to convert the extreme cases of his narrative gaucherie into irony . . .
There is far, far too much such "conversion of narrative gaucherie into irony" today--more than in Watt's day, more perhaps than in any other time in history--with at least some of it coming from people who ought to know better. (I hesitate to name names, but one recent critic whom I hold in a good deal of regard reviewed Suzanne Collins' The Hunger Games in exactly this way, and then admitted that it was his way of spicing up the boring task of reviewing a book with such artless world-building--essentially, an admission that he wasn't doing his job of reviewing the book at all.) After all, in our postmodern day, the most inane subjective reaction can be held up as profound insight, and "irony for irony's sake" might be the critical slogan--irony for irony's sake because critics cannot resist that "worst form of snobbery," because there is no better barrier to really thinking about anything than blowing it off in this pseudo-literate person's equivalent of the eternal "Whatever!"

Ian Watt and Shakespeare

Ian Watt's Rise of the Novel was, as discussed here, a study of eighteenth century literature. Still, that outstanding piece of literary analysis, history and sociology was comprehensive enough to have much to say about other subjects--not least, the works of William Shakespeare.

His remarks about the Bard were, of course, offhand. Still, in noting that Shakespeare was very much a Medieval rather than a modern, and that such writers dealt in universal types rather than specific individuals; that they had their eye on abstractions rather than concrete facts; that they were prone to be loose in handling the flow of time or cause and effect relationships; and that in describing it all they were inclined to prettily decorate rather than rigorously denote and describe; he strikes me as having summed up a very large part of the challenge that reading Shakespeare presents to a twenty-first century reader--a challenge they tend to fail.

Thus we read Julius Caesar and find, instead of a historical drama about ancient Rome as we would understand the term, just the dilemma of Brutus. Thus we read Hamlet--and feel that he's endlessly dithering, which becomes ammunition for pompous lectures on the character's lack of decisiveness. Or we don't find those things, because we don't really have enough of a handle on what's going on to have those reactions. We just read them because we're supposed to, without worrying about whether we "get it" or not, and then, if the statistics are accurate, after completing the obligatory school requirement (under the eye of a teacher who might not get all this themselves; they're probably taking it on authority just as much as the student), most of us probably don't read much of anything ever again.

Is there anyone else who thinks this isn't how it's supposed to go?

Of Character and the Larger Scene: A Note on Ian Watt's The Rise of the Novel

Recently reading Ian Watt's classic of literary criticism, The Rise of the Novel, I found one of the book's more compelling aspects to be his attentiveness to the differing emphases novels can have--what I tend to think of as the Henry James-like emphasis on character and the "play of individualities" as the cornerstone of good writing, and an H.G. Wells-like stress on the larger social scene. Where the "highbrows" are concerned, James carried the day.

As Watt makes clear, however, both approaches were strongly represented among the foundational writers of the eighteenth century English novel. Watt identified Samuel Richardson with the stress on character (and the domestic themes to which such writing inclines), Henry Fielding with society. Consequently, while they share comparable status as founders of the English novel (indeed, in that long-ago eighteenth century lit course I had, we read both Richardson's Pamela and Fielding's Tom Jones), it would seem that James' victory over Wells was also Richardson's over Fielding.

The Way We Live Now by Anthony Trollope: A Second Note

As Anthony Trollope's great satire of late Victorian society opens, we are looking at Lady Carbury, who is in the midst of preparing for the release of her book Criminal Queens, soliciting what she hopes will be favorable reviews from the major London newspapers.

A writer anxiously soliciting reviews in the hope that they will make her book a success!

Alas, all the advances in technology since that time when railroads were the stuff of tech bubbles have not spared writers the burdens and annoyances and headaches and embarrassments and nerves of publicity-seeking, as every self-published author knows only too well.

The Way We Live Now by Anthony Trollope: A Note

Anthony Trollope's The Way We Live Now is his classic satire of late Victorian society, which bears more resemblance to our own than most of us can appreciate. It is something of a truism that railroads were the dot-coms of Trollope's time, but reading the novel, centered as it is on a massive financial scandal surrounding just such a steampunk dot-com, shows in dramatic fashion just how much this was so. ("The object of Fisker, Montague, and Montague was not to make a railway to Vera Cruz, but to float a company.")

Trollope's take on it all has real bite, one reason why critics in his day were unappreciative of the book, and why they might be similarly unappreciative in ours (as a cursory look at a number of lists of nineteenth century classics has suggested to me). Still, it seems to have enjoyed a bit of an upsurge in popularity in recent years, because of its relevance--and, I suppose, because of more recent writers' failings. Among their many disservices, Modernism, postmodernism and the rest have rendered today's "serious" literature too toothless to produce such an epic in our own time, and so for satire we can hardly do better than look to a tale of comparable doings in a time long past.

Friday, August 31, 2018

Craig Thomas' Sea Leopard and British Naval Power

WARNING: MILD SPOILERS

I have remarked previously that by the '80s techno-thrillers centered on Britain had become a rarity. Still, there were some, like John Gardner's Bond novel Win, Lose or Die (clearly written to capitalize on the height of the genre's boom, and the success of Top Gun), and Craig Thomas' Sea Leopard, which centers on the intrigue surrounding a British submarine equipped with a revolutionary stealth technology. Of course, in imagining British industry achieving such a technical triumph the book may appear to be a bit of pious hope or wish fulfillment that Britain could, at the least, still "punch above its weight class in the military-industrial realm," enough so as to be a major actor in world affairs, thanks to the prowess of its boffins (on which such hopes had so often been based). The same might also go for Britain's saving its own bacon in this rare, Britain-centric techno-thriller, at least where the high-tech is concerned. (An American intelligence officer completes the critical action on the ground, but it is a British Harrier jet that infiltrates him into the country, while a British Nimrod controls the operation.)

Still, any such feeling is mixed with an acute consciousness of Britain's decline as a major power since 1945, and especially as a naval power. One sees it in the fact that, just as in Firefox, a British spymaster ends up relying on an American agent to pull off the mission he dreamed up, but it is given explicit treatment through Commander Richard Lloyd, the captain of the sub at the story's center, after his vessel is captured and brought to the Soviet harbor of Pechenga. He sees
[t]wo "Kara"-class cruisers at anchor . . . Three or four destroyers . . . Frigates, a big helicopter cruiser, two intelligence ships festooned with electronic detection and surveillance equipment. A submarine support ship, minesweepers, ocean tugs, tankers.
It all makes a very strong impression on him, the sheer "numbers" of the vessels "overaw[ing] him, ridiculing Portsmouth, Plymouth, Faslane, every naval port and dockyard in the UK" in his time, the only point of comparison in Britain's naval history he can think of being "some great review of the fleet at Spithead between the world wars, or before the Great War," though even that is undermined by "the threatening, evident modernity" of the ships on display. That Pechenga is a mere "satellite port" of Murmansk, that the Soviet coast is dotted with dozens of facilities equally or more impressive--and that this is not a "great review of the fleet" but Soviet navy business as usual--makes it all the more daunting.

Review: Sea Leopard, by Craig Thomas

WARNING: MILD SPOILERS

A nuclear submarine equipped with a revolutionary stealth technology becomes the prize in a contest between the Soviet and Western navies in the cold waters of the North Atlantic. At the center of the resulting international crisis we have not a sea captain or professional field agent, but an intelligence service desk man who has to figure out the intentions of the other actors, and come up with the right course of action, which soon enough sends him flying to the scene of the action, with the ultimate outcome hinging on the ability of an American to (with help from his British friends) get aboard a submarine in Soviet hands, make contact with the captain, and then fight for his life against a Soviet officer waging a rear-guard action against the plan . . .

It sounds an awful lot like Tom Clancy's The Hunt for Red October, but it is actually a description of Craig Thomas' earlier submarine-themed techno-thriller, Sea Leopard, which is at least a precedent for Clancy's novel and perhaps even an inspiration for it. (I know Clancy was an admirer of that crucial proto-techno-thriller writer Frederick Forsyth, a fact which shows from Red October on, and which led to a full-blown homage in Red Rabbit, but I recall nothing regarding comparable interest by Clancy in Thomas.*)

Thomas is, of course, better known for an earlier book than Sea Leopard--Firefox--and this book resembles that one too, sufficiently so that it can seem like an underwater version of his earlier hit. Again, the balance of power depends on a technological breakthrough contained in a stealthy military vehicle that a set of Cold Warriors tries to steal; and again a principal character of that novel (actually the protagonist of Thomas' principal series), British spy chief Kenneth Aubrey, is the man with the plan, which ultimately hinges on his infiltrating a borrowed American military officer into the Soviet Union to see that the good guys commandeer said vehicle and get it to the West. (It even seems worth remarking that a submarine operating in the far northern waters off the Kola peninsula did play a crucial role in Firefox, refueling and rearming the stolen plane in aid of Mitchell Gant's flying it out of the country. We even get flying scenes over some of the same territory as in Firefox and Firefox Down, both books having their characters note Finland's Lake Inari from the windows of their military aircraft while flying on a secret mission.)

Still, if Thomas clearly reused much here, there is also much that he is doing differently, not least the manner in which he distributes his attention. While his earlier plane-stealing story was attentive to the bigger picture (more so than Martin Caidin's Cyborg, one reason why Firefox seems to me to have the stronger claim to being a founding techno-thriller), it had a clearer center in Gant's performance of his mission. In the style Forsyth had already helped pioneer, and of which he and Clancy were to make careers, this time around Thomas' attention is more widely diffused across the broad situation as he fills in "the big picture."

Alas, the results are not all that might be hoped for, this larger-scale narrative seeming to me to contain a good deal that was unnecessary--for example, the Soviet attempts to board the British submarine they were trying to capture, complete with lengthy description of the hazards of trying to get the team into place. (Given that they are the ostensible villains of the piece--as one might expect of Thomas, this is very much an orthodox Cold War thriller in its politics--how did their suffering setbacks contribute to the suspense?) In particular the thoroughly fleshed out subplot about Secret Intelligence Service agent Patrick Hyde's hunt for a scientist who has gone AWOL at a crucial moment could easily have been excised from the narrative (remaining in it, its main effect seemed to be to fatten and slow down the story). This went even more for the events involving Hyde in its aftermath. Other, more essential scenes went on longer than they should have, not least the sneaking around in the climactic operation. All of this made the story fatter and slower where it ought to have been lean and mean, while the bits involving the scientist dragged in a few cliches that really rankled (and which actually get their own piece, here).

I might add that despite the breadth of the narrative, and its flights from reality, Sea Leopard's high-tech military action also offers nothing so visceral as the flying scenes from Firefox at their best (though this may also be because of the familiar problem of the slow pace of underwater action in large and lumbering subs compared with high-speed aerial combat); or in its cast of characters, anything to compare with the tensions inhering in Firefox's strung-out super-pilot Mitchell Gant. Still, the thriller mechanics are competent throughout, enough so that it never bores, while the book now has a fair measure of novelty--as a pre-boom techno-thriller, and as an example of what by its time had already become a rarity, a British-centered techno-thriller.

* It is interesting, too, that the last name of the American officer who must pull off the book's central covert action is Clark--just like the ex-Navy man Clark who was Jack Ryan's field counterpart in the Ryanverse novels.

Sea Leopard's Quin

For me one of the weaker elements of Craig Thomas' Sea Leopard was the character of the scientist Quin. A very large part of the novel is devoted to his going missing, to the Secret Intelligence Service's subsequent hunt for him, and to what happens to the agent who had to pursue him.

I must admit I didn't care for it, in part because it made for a fatter, slower narrative, but also because of its trading in a number of unfortunate cliches. There is, of course, the way the character figures into the plot in the first place--his insight into the submarine stealthing system, which is presented as his own invention, with scarcely anyone else apparently understanding much about it, even as the British navy has taken the program so far as to put it in a sub and have it prowl around Soviet waters as a test. (It's a matter of outworn Edisonade cliche and silly romanticization of the tech start-up where we should have had some acknowledgement of military-industrial Big Science reality--of which Britain is, after all, the birthplace.)

Making matters worse, the scientist is a fragile neurotic for whom the Practical Men we are expected to sympathize and identify with have only disgust and contempt, even as they rely on his intelligence and skills to save the day, with much made of their need to "kick him in the ass" to do it.

On top of that, the relationship between Hyde, the particular Practical Man most involved with Quin, and that scientist's daughter, Trish, is a tiresome case of a backlash politics-flavored generation gap between middle-aged security state functionaries and "these rotten college kids today!"--the latter imagined as an incomprehensible and perverse pack of promiscuous, irresponsible leftists, in a way that at times crosses the line into (unintentional?) caricature.

It doesn't ruin the novel, but the book would have been a lot better off without it.

Saturday, August 18, 2018

Review: The Third World War: The Untold Story

John Hackett's The Third World War: August 1985 (1978) was followed up four years later by The Third World War: The Untold Story, updating and elaborating aspects of the original. Where the updates are concerned the most conspicuous is a chapter that, in light of Egypt's turn to the West, the Iranian Revolution, and the outbreak of the Iran-Iraq War, thoroughly rewrites events in the Middle East: the region does not fall under Soviet-aligned control before the war; instead a pro-Western Saudi-Egyptian alliance squeezes the Soviets out, while Turkish-based Iranian exiles topple that country's revolutionary government, stabilizing the region. The book also devotes chapters to elaborating previously slighted portions of the fighting in the European theater, particularly events in Ireland and Scandinavia, while detailing events in the Caribbean and Latin America, and in the Far East, where there is a new round of Sino-Vietnamese fighting, North Korea's skirmishing with its neighbors, and, after the American victory, the problem of receiving the surrender of the massive Soviet forces in the area. The Untold Story also describes much of what was already known to have happened from the Soviet side, following a motor rifle regiment officer through the early part of the war; matching the prior book's depiction of the nuclear destruction of Birmingham with an equally detailed depiction of the Anglo-American retaliation against Minsk; and devoting considerable space to portraying the subsequent rioting-turned-revolution against the Soviet leadership in the aftermath.

All in all, it is a mild rewrite in light of a number of events that Hackett's team signally failed to guess at, and an exercise in "filling in the corners" of its already dated scenario. Compared with the novelty of the first book, this does not seem very much, and perhaps it is unsurprising that it made less of a splash. Where the first book lasted 40 weeks on the New York Times bestseller list, I searched in vain for anything like comparable notice of the sequel, which seems a mere footnote by comparison with its predecessor's place in the history of this kind of fiction.

Tuesday, August 7, 2018

The Savage Doctor--Doc Savage

Decades before the action film subordinated filmic structure and pacing to thrills (the "thirty-nine bumps" that became standard for the Bond films, and which in turn set the standard), pulp writers did the same thing for print fiction, as Lester Dent demonstrates in his Doc Savage novels.

And I must say that the approach has its charms here. In contrast with the more measured pace of so much other earlier fiction, the briskness of old pulp writing holds up surprisingly well, even by the standards of today's action films--and still more, by the standard of today's novels. I won't deny the roughness of the approach, reflecting not just the pace of the story, but the pace at which it was written in those paid-by-the-word days. In line with the priority on pace and thrills, logic is casually tossed out the window, while the writing is more tell than show, slight on details and, even where minimalist, far from crisp and lean, given its tendency to repeat the same few details over and over again. There can hardly be much suspense when the author harps on the hero's invincibility every chance he gets, and we learn very early on that there will always be a hokey out.

Yet the tale, helped by that spareness with description, is so brisk and there is such a spirit of fun that I didn't really care how little sense it made, and if none of the cliffhangers left me wondering if the heroes would make it through this one, I still read to find out how they made it, and even though I was sure the answer would be ridiculous (and was usually right), I felt fairly forgiving. In the process Dent crams into fifty thousand words as much in the way of plot twists, action and adventure as Clive Cussler (whose Dirk Pitt owes a very great deal to Savage) does into books three times that size, and few writers use such space nearly so well as Cussler--one reason why, even now that I have a lot less interest in most contemporary popular fiction of the kind than I used to, I still find myself enjoying these sorts of brief, punchy, vigorous tales.

About That Doc Savage Movie . . .

Shane Black, who has spoken of his love for yesteryear's pulp and paperback heroes many a time (rather more seriously than most Hollywood types talking about their inspirations and influences, I think, citing editions of The Executioner by name and number), is every so often reported to be bringing one to the screen. Back in 2014, for example, almost fresh off his success with Iron Man 3, he was reportedly working on a new big-screen edition of the Destroyer.

Little was heard of that project afterward, but in spring 2016 he was supposed to be getting ready to helm a new production based on Doc Savage, with Dwayne Johnson cast in the lead. Two years on one might have expected the film to be out by now. As it is the would-be filmmakers continue to deal with legal hassles and scheduling issues that must be settled before they can even begin production.

Have the Suits damned the project to development hell? Perhaps. But perhaps not. Still, there is room for doubt about how the film will do. The material has dated, but Hollywood rarely cares about the purists anyway, except when they are an excuse to lazily and shamelessly recycle old material. (Remember The Force Awakens?) Accordingly I have little doubt that Savage will be bumped from the commercially risky interwar era (Indiana Jones pretty much has a monopoly on that anyway) to the present, and the Fabulous Five wholly reinvented in line with the expectations of today's critics in every regard, from their demographics to the swapping of yesteryear's preferred brand of hokey humor for today's.

If they go with an updated, and likely generic, product, the sales pitch will be tougher, especially given the issue of brand name. That John Carter was the first great outer space hero did not do much to sell tickets back in 2012. Doc Savage's undeniable status as the forerunner to innumerable action-adventure heroes who have since made their debut should not be expected to do much more for him. For better or worse the name simply does not have much drawing power with today's moviegoers, while the scene has become very crowded indeed, many, many, many other figures doing this sort of thing on screen in recent decades and years, many of them played by Dwayne Johnson himself. Indeed, assuming Shane Black sticks with something like the premise of Savage's first adventure, audiences watching the star run around a jungle might think they're watching a sequel to Jumanji. Or Journey 2. Or G.I. Joe. Or even The Rundown. (Or another edition of Predator, one of which is heading to a theater near you this September, directed by Mssr. Black himself.)

Doc Savage and Dirk Pitt

If I really got started discussing or even listing the characters who have been influenced by Lester Dent's classic protagonist Doc Savage, I would probably never finish. After all, Superman, another New Yorker named Clark of far more than ordinary human ability whose deceased father raised him from infancy for a life of world-saving heroism and periodically retreats to a Fortress of Solitude in the Arctic (yes, Dent uses the term, and near the beginning of the very first Savage adventure too), owes a very great deal to him--and we all know how much the rest of comics owes just to Superman.

Still, given how much I have written about Clive Cussler here it seems worth discussing Savage's influence on Cussler's creation, Dirk Pitt. Pitt lacks Savage's combination of omnicompetence and ultracompetence. I am not sure he can be regarded as a genius at any one thing, certainly not the more esoteric skills Savage possesses, let alone everything. (In fact, the only contemporary character I can think of as really comparable in that respect would be Martin Caidin's Doug Stavers.*)

All the same, Pitt shares the rootless, larger-than-life merry swashbuckler aspect of the character, and something of the dynamics among Savage's group is evident in Pitt's own inner circle as well, down to the ways they annoy each other. (Al Giordino's stealing Admiral Sandecker's cigars recalls for me Monk's relationship with Ham.) It is undoubtedly an important part of his appeal, contributing to the series' continuation for nearly a half century now.

* You may have noticed the initials--Doc Savage, Doug Stavers--are identical, while the first name is similar in ring, "Doug" being the closest real name to "Doc" I can think of. And Caidin's lavish tributes to Doug's superhuman prowess are just as (unintentionally) funny as Dent's to his character. Still, Stavers is the very antithesis of Savage's goodness, making him a "Dark Messiah" indeed.

Thursday, July 19, 2018

Review: FlashForward, by Robert J. Sawyer

I was one of those who saw the TV adaptation of Robert J. Sawyer's Flashforward before I read the book.

But I did get to it not long afterward. Sawyer's tale of an international team of scientists at CERN coping with the implications of a totally unexpected event--two minutes of unconsciousness in which everyone got a glimpse of their future--was, for me, an appealingly old-fashioned science fiction novel. Reasonably compact, comparatively lacking in the bulkiness and clunkiness that has made me read less and less recent fiction of any kind, it was genuinely interested in its "What if" and straightforward in its storytelling, as idea-driven fiction generally ought to be. Enough so that Sawyer didn't hesitate to follow his characters' trains of thought about the issue at hand, or permit them to have "explicit dialogues" in the Wellsian (or Shavian) manner. (I, for one, must admit I am fond of such dialogue, much fonder of it than of the tenth rate Flaubert to which the advocates of "good form" expect us to aspire.) It is the kind of thing that, with so many writers now less inclined to it, has encouraged me in the view that science fiction is waning as a distinct genre.

Reading the book so short a time after seeing the show I found myself inevitably drawing comparisons between one and the other. In the ABC version we got not an international team working at a particle collider outside Geneva, but a thoroughly Americanized cast of characters and setting, with the scientists turned law enforcement types tackling an international conspiracy, the intellectual interest of the tale cast aside in favor of conventional thriller mechanics and soap opera, and mawkishness about Big Collective Moments of the sort hack journalists write endless amounts of drivel about.

It was predictable that his idea would be forced to fit into the conventions of American prime time network television, and cease to be recognizable in the process. And there have certainly been worse shows. But I preferred the book all the same.

Remember Meg?

No, probably not.

But I do. I remember it because back in the '90s the terms of the publication of Steve Alten's Meg were the kind of media sensation the rags-to-riches-story-loving press likes to blow way out of proportion at every opportunity--a first novel by an unknown getting its author a seven figure advance (impressive numbers today, even more impressive back in those comparatively uninflated, pre-J.K. Rowling, pre-E.L. James days). The big money was tied up with plans for a big movie, and why not, when Meg, short for the Megalodon at the center of the story, looked like it could combine the underwater terror of Jaws with the scale and sci-fi exoticism of Jurassic Park.

But the book's sales did not live up to the blockbuster expectations, while the movie plans quickly sank into development hell.

For twenty years.

During which, as Jan de Bont and Guillermo del Toro and lots of other people whose names you know better than Alten's were attached to and detached from the project, the summer blockbuster of the monster/disaster type went from being a seasonal treat to a week-to-week, year-round thing, not just on the big screen but the small, where Syfy supplied us with an endless string of deliberately bad movies on the theme (mostly because that is an easier thing to do than deliberately making good ones).

Hearing "This time they're serious" I'd think "I'll believe it when I see it," but last year I heard a report that the film actually was shooting, with a $150 million budget and Jason Statham in the lead (I don't think he's been that before in a megabudget movie like this one) as a Sino-American coproduction.

It seemed real enough this time.

I also heard that it would be coming out in March, which it didn't; instead it was bumped to August 10, as the commercial I caught on TV last week indicated.

Of course, getting one's release date bumped by five months is not a good sign. Still less is it a good sign when the new date is in August--traditionally a "dump month" when the competition is less intense and not too much is expected from the receipts.

But then one might imagine that this Chinese coproduction (due to come out in China the same weekend as in the U.S.) is being timed to take advantage of China's notorious late summer blackout of its film market, which gives its domestic productions a chance to clean up, and that it will help the numbers in that country--by itself, quite enough to make or break many a big production. (And as even the quickest review of the numbers over at Box Office Mojo shows, monster movies certainly play well in the Chinese market--witness the Monster Hunt franchise and The Great Wall, while Hollywood's own Rampage did much better business in China than at home.1) And here in the States August sometimes produces a winner. (Guardians of the Galaxy proved a surprise hit in that period a few years back.)

What's your guess on how the movie will do?

1. Rampage pulled in $99 million in America (yes, just shy of the $100 million mark), but took in $156 million in China.

Review: Trojan Odyssey, by Clive Cussler

New York: Putnam, 2003, pp. 496.

WARNING: SPOILERS AHEAD

Clive Cussler's Trojan Odyssey marks the end of an era in many ways. Not only is this the last Dirk Pitt novel on which Clive Cussler's name appears without a coauthor; it is also the first to include Dirk Jr. and Summer in the adventure, just as the single, adventurous Dirk Sr. puts that part of his life behind him, marrying his longtime girlfriend Loren Smith and taking over the directorship of NUMA from Admiral Sandecker. These changes of life will pretty much keep him out of the field (in both those ways), and pave the way for the principal "Dirk Pitt" of the novels to be the son from this point on.

Naturally, I would like to be able to say that the sequence of "Dirk Pitt Sr." novels goes out strong. Alas, the first time I tried to read it I didn't make it past page one hundred and fifty. I suppose that part of the problem was that I had become less forgiving of Cussler's literary weaknesses in the decade since I started the series. However, it is also the case that this part of the book--which consists mostly of an ocean resort and the Pitt twins being inconveniently in the way of a hurricane--comes across as both overfamiliar to readers of his previous books, and agonizingly slow.

Hoping that the rest of the book would be better I later returned to it and pressed on. Having done so I can say that the pace and interest do pick up in the second half of the story. Still, while the story gets better, it does not get very much better. Despite being shorter than many of the preceding Pitt epics, ending on just page 485 in the hardback edition, it felt overlong, with many later bits excessively drawn out.

Moreover, that sense of overfamiliarity remained. Cussler's previous novels are not merely referenced (as would be appropriate in a transitional work like this), but recycled. There is a lot of Sahara here in particular--for instance, in the conspicuous, rapidly spreading, ocean-killing pollution from a mysterious source ("brown crud" in the Caribbean), and the dispatch of Pitt, Al Giordino and Rudi Gunn aboard a disguised high-tech vessel to check it out early in the story. As in that other novel, the subsequent adventure involves a river journey, an old colonial fortress, an ecologically destructive high-tech facility incorporating cutting-edge energy technology, action underground, and a party of foreigners enslaved by the baddie.

Where the book does not reuse familiar elements it is often simply hokey--as with the holographic pirates Pitt encounters off the coast of Nicaragua, which as a deception intended to keep people from the area would embarrass the clumsiest Scooby-Doo villain; or the unmasking of a major villain in the middle of a Congressional hearing, which likewise comes across as something out of a Scooby-Doo episode. One does not expect, or usually get, much logic in the schemes of supervillains in novels like these, but the connection of their revolutionary technological breakthrough with a plan to change the climate of much of the Earth seemed especially shaky. At the same time, while the novel's exercise in fringe history is one of Cussler's more intriguing--an alternative explanation of the true history behind the legend of the Trojan War based on the work of Iman Wilkens--we never quite get clarity on the meaning of the revelations (like why a Druidess is buried in an elaborate tomb in the West Indies).

And the vigor of the handling leaves much to be desired, Dirk and Al's Central American adventure paling next to what we saw in earlier installments (not least, the West African adventure Cussler recycled so much of here). When the narration remarks on Pitt and Giordino feeling more tired than they can remember ever being, this comes across as a reflection of their aging bodies, rather than of the author topping his earlier efforts. Perhaps advancing age is also the reason for the painful slowness of the protagonists to grasp what became obvious to the reader--the secret the villain conceals behind the unusual attire, the source of the brown crud.

Meanwhile, NUMA's next generation--Dirk Jr. and Summer--hold little interest in themselves, and what Cussler does with them this time around, at least, is not terribly interesting either. Nor does it contribute meaningfully to the sense of transition the rest of the book seems to strive for. Rather than being given a chance to come into their own, which would give us a sense of the torch being passed, they come off as marginal and hapless in the tale's bigger events.

Cussler partially redeems himself in the final chapter, which manages to be suitably charming, and takes some of the sting out of the disappointment in what came before--but just as Loren Smith may feel that she has waited longer than she should have to make her longtime relationship with Dirk legal, the reader may come away feeling that this should all have happened a few novels earlier.

Review: The Rise of the Novel, by Ian Watt

Berkeley, CA: The University of California Press, 1957, pp. 319.

It is a commonplace that the literary genre of the novel is distinguished by its telling about the lives of everyday individuals in everyday circumstances, in a manner distinct in two ways--its plainness of style, and its attentiveness to the "inner lives" of its characters.

In his classic discussion of the subject in The Rise of the Novel Ian Watt does far, far more than present that view, instead getting to the root of the matter. As he makes clear, the ancient-Medieval tendency had been to think of the "universal" and "timeless" as real, and particularities as ephemeral and meaningless. That earlier logic had authors striving to portray the universal above all, serving up characters who are "characteristic" types down to their names, and time and place often conceived imprecisely, with the authors' ornamentation of the portrayal the principal object and measure of their skill.

By contrast, the eighteenth century saw writers aspiring to tell the stories of particular individuals, in particular places and times--and indeed, the unfolding of the events of their lives through cause and effect sequences, over time. That shift, in turn, was the reason for the plainness of style--the prioritization of a denotative, descriptive usage of language, with a lavishness in detailing taking the place of the old lavishness of ornament. He stresses, too, how this came together in a genre of private accounts, read privately, permitting an "intimacy" unattainable in other, publicly performed and publicly enjoyed styles of work (plays, poems) that both extended the bounds of what seemed permissible in art, and allowed a new intensity of audience identification with protagonists.

All of these features--the strong sense of particularity, time and cause and effect; the use of language as description rather than ornament; the private, "intimate" novel-reading experience--seem implausible outside the rise of a rationalistic, individualistic, science-touched Modernity that afforded individuals a meaningful range of economic choices (a capitalism, increasingly thoroughgoing and increasingly industrialized), and sanctioned them ideologically (Puritan ideas about salvation, secular Enlightenment thought). And indeed, Watt's classic is best remembered for its stress on the historical context in which the novel emerged, which reflects a good deal of what Mills was to call "the sociological imagination."

Appropriately the book begins with a good deal of historical background, specifically two chapters regarding the philosophical developments underlying the rise of what we think of as "realism," and a reconstruction of the eighteenth century reading public from the concrete facts of the era. In these the territory ranges from the epistemology of John Locke to the nitty-gritty of how many and who would have had the time and money to buy this kind of reading material and a place in which to read it in that private way he described (delving into prices and incomes and the rest). Afterward, when turning to the founding authors and works themselves--Daniel Defoe's Robinson Crusoe and Moll Flanders, Samuel Richardson's Pamela and Clarissa, Henry Fielding's Tom Jones--he continued to draw on historians' understanding of aspects of the period ranging from the rise of Grub Street to the rise of suburbia, and how these factored into the authors' creative work by way of their personal backgrounds, and quite profitably. (Wary as I tend to be of the biographical approach, I found it impossible to come away from this study thinking it accidental that Richardson, a retiring, low-born, Low Church suburbanite earnest about "middle class morality" wrote epistolary novels about domestic themes, and Fielding, a robust Tory squire who attended Eton with William Pitt, served up Jones' picaresque adventure.)

Watt is all the more effective and interesting for his attentiveness to the finer points of form so much more difficult to discuss than content, and the ways in which form, too, was remade by practical imperatives of the writers' business. (That exhaustive descriptiveness, and the need of the writer to make themselves understood to readers of a less certain educational level--both connected with their selling to a wide public rather than catering to an elite patron--combined with per-word pay rates to put a premium on prolixity.)

All this is considerably enriched by a good many observations of wider literary significance. Not the least of these concerned the tendency of writers to essentially write themselves when presenting a protagonist, especially in a first-person narrative. ("Defoe's identification with Moll Flanders was so complete that, despite a few feminine traits, he created a personality that was in essence his own" (115).) Interesting, too, is his quip about an aspect of the style of fiction Pamela helped pioneer that is far less likely to be acknowledged by any critic in our day wishing to retain their mainstream respectability. ("[T]he direction of the plot . . . outrageously flatters the imagination of the readers of one sex and severely disciplines that of the other" (153-154)--the female and male sexes respectively here.)

Indeed, the breadth of his vision and scope of his interest are such that some of his more interesting remarks have nothing to do with eighteenth century fiction at all. (As Watt observes, Shakespeare, the "inventor of the human," is less a modern than a Medieval--similarly different in his thought about time and causality, so that these come across as loosely handled, and different in his thought about language, too, so that the propensity for the purple prevails over the comprehensible in his poetry--some of the reasons why he is less accessible and compelling to most of us than we think he is "supposed to be.") Watt is attentive, too, to the trade-offs that writers have to make when they actually create something--like that between plot and character (the one as a practical matter attended to in "inverse proportion" to the other (279))--and to many of the pitfalls of literary criticism in his time, and our own. ("Coleridge's enthusiasm" as critic, he remarks in one instance, "may . . . serve to remind us of the danger . . . of seeing too much" in a work (120).)

As the cited passages indicate, Watt, for all his richness in insight, is also extraordinarily accessible, partly because, in comparison with the pretentiously and trivially theoretical, jargon-laden work of our contemporaries, he is clearly focused on his subject and straightforward in his communication, and partly because he is exceptionally gifted as a wordsmith himself. Altogether this makes the book a key work for anyone trying to understand not just eighteenth century literature, or the novel that remains the central fictional form of today, but literature in general.
