Wednesday, April 10, 2019

A Note on Richard Hofstadter's Anti-Intellectualism in American Life

In his classic Anti-Intellectualism in American Life, Richard Hofstadter pointed to four principal sources of anti-intellectualism--business, organized religion, "Jacksonian democracy" and "progressive education."

It was easy enough to understand business, religion, and even Jacksonian democracy being listed, but I couldn't help wondering, why put progressive education on a level with them? Certainly no one would consider what he meant by progressive education (the Rousseau-Dewey tradition) as at all comparable in its effect to those other forces even on education, let alone American life.

A significant clue lay in the ideological characterization of these four forces. Hofstadter identified two of them with the right--business and religion--and the other two with the left--Jacksonian democracy and progressive education. Such an identification seems to me awkward. Others (myself included) read Jacksonian democracy as right-wing populism, not the other kind. Additionally, the progressive education of Hofstadter's time was not without conservative roots--not least in its accommodation of students' career expectations to an economic system which cannot give everyone "the good job."

The strain involved in treating progressive education as somehow equivalent to the others, together with the problematic characterization of these four forces, gives me the impression that, in line with the liberal centrism toward which he was tending by this time, Hofstadter wished to create an image of ideological balance. He ended up with a false balance, as the centrist version of balance so often tends to be, obscuring the reality that anti-intellectualism, at least where it has counted, has overwhelmingly come from the right.

Sunday, April 7, 2019

Review: Armada, by Ernest Cline

Ernest Cline followed up the success of Ready Player One with Armada, in which he clearly meant to produce another work of the same kind without too ostentatiously repeating himself. And so once more we receive a young adult novel about an IRL loser who goes on a virtual action-adventure that might change everything for him, and everyone else, drenched in '80s geek culture nostalgia-flavored pop cultural references (often, the same references), about which our hero's seemingly useless knowledge proves genuinely helpful to him and the rest of the good guys.

Still, he takes considerable pains to differentiate the assembly of those elements. Instead of an ill-treated, impoverished orphan in a post-apocalyptic dystopia, the protagonist, Zack Lightman, is a present-day suburban kid raised by a loving mother who, despite being a single parent working as a nurse, cut off from her own family, is well enough off that he doesn't have to worry about paying for college. Meanwhile he identifies closely with his deceased father, the mystery of whose death obsesses him and proves central to the story's course.1 (It seems that dad might have been onto something just before Zack lost him . . .) Zack is also not the overweight, virginal social outcast of the first book. He has real-life friends, and has even had a girlfriend with whom he has been intimate. (No wondering if doing it with a robot counts as a first time, here.) Instead of a struggle of the little guy against corporate power it is humanity vs. the aliens, and rather than a scavenger hunt across a virtual world, he, and all of us, prove to have been fighting in the real world all along.

As one might guess from what has been changed and how, and from the fact that it appeared in 2015 rather than 2011, Armada is the product of what, at least to go by what one saw in the mainstream, was a more complacent time, with the recession declared officially over and the public shrugging off the energy shock--and it is a complacent book.2 Discovering what was really going on, Zack thinks to himself that humanity's seeming disregard for all its grave environmental problems--the exhaustion of the fossil fuel supplies, the wrecking of the climate and the rest--was not self-destructiveness after all, but the necessary price of a grand plan to meet an alien menace, a thought that, he says, "filled me with a strange new sense of pride in my own species." (I repeat: forty years of neoliberal-neoconservative-postmodernist disaster and all the grief and despair they have wrought were really just us getting those nasty aliens right where we wanted them!3 The irresponsibility of making such a claim at a moment like this cannot be overstated.)

As one might guess from all this, too, it is a less original work, the premise making for a less fluid integration of pop cultural homage and action-adventure. Ready Player One's mix of nostalgia and adventure was rooted in the reality that its quest was a monument to its maker's obsessions. There is an extent to which Armada's nostalgia is connected with the mystery of Zack's father's passing, and what he became part of, but this is not a puzzle the hero does much to put together himself, rather a collection of clues that remain just clues until others reveal the truth to him, and which are quickly overshadowed by that other, far bigger mystery--"Just what do the aliens want?" In all that the book seems less evocative of its pop cultural references than derivative of them (not least, Wargames and The Last Starfighter), while the freighting of the dialogue with generally more obvious references makes it sound like a geekier version of the writing for Gilmore Girls, and no less grating (at perhaps its most painful in the first meeting of Zack and Alex).

Alas, even on the action-adventure level it does not quite equal the first book. While Armada's space battles are consistently well-written--fast-paced, dense, coherent--their elements seem generic rather than evocative, nothing coming close to the wacky extravagance of the climax of Ready Player One, or its intensity; all the more so because of the enemy's facelessness, and the point made early on that it was not all as it seemed, which made it clear to me that this was not really going to be settled by fighting. Indeed, reading the battle comprising the last quarter of the book I was simply impatient for the story to get on with the explanation of what was really going on instead.

Ultimately the answers Zack gets raise still more questions, opening the door to a sequel. A story that followed Zack's pursuit of those answers has its potential, but I suspect that making it really work--making it ascend above the serviceable but generally generic book he produced this time around--will not be easily reconcilable with a desire to give the reader more Ready Player One. Perhaps it is for that reason that a direct sequel to his 2011 hit is reportedly what we can next expect from Cline.

1. Yes, the protagonist was named after the hero of Wargames, because apparently Cline didn't give us enough of it the last time, and yes, this is not wholly irrelevant given the decision that has to be made at the climax.
2. Alas, watching politics grow increasingly radicalized in the three years since, it is clearer than ever that the complacency was a luxury of the privileged and sheltered.
3. This aspect of the story was a reminder that, despite Ready Player One's post-energy shock, post-Great Recession angst and appeal to the masses against a corporate villain, it was not a sophisticated political critique, actually being quite conventional and conformist in many a respect, not least in the biography of James Halliday. In the early '80s fewer than a tenth of American households had personal computers. That the son of a machine operator and a waitress had one, and so many of the day's other electronic goodies, is not impossible, but it was certainly less common than one might think. That a kid from a blue collar Rust Belt family, far from the tech hubs of the day, without social connections or capital or even a high school education, managed to launch a gaming empire at that relatively late date (long past the moment when Gates and Jobs were getting their start) is little more than the fantasy of tech-entrepreneurship so beloved of libertarians and their fellow travelers pushed to its outermost limit--pure right-wing economic propaganda.

An Anorak's Thoughts On Ready Player One (Film Adaptation)

MILD SPOILERS BELOW

As a bestselling pop culture-soaked and action-packed young adult sci-fi novel set in a safely near future on Earth, Ernest Cline's Ready Player One at first glance looks like obvious material for a Hollywood blockbuster. On closer inspection, however, one sees that it lends itself less easily to adaptation to a blockbuster movie than one might guess.

The book's looseness is certainly a factor, with its numerous incidents, the sometimes significant lapse of time between key events, and general sprawl. The way the book is written is a still greater factor. Narrated by the protagonist (Wade Watts), the story is substantially told rather than shown (as is so often the case when the first-person point-of-view is employed), not a minor point given the significance of the characters' back stories and the world-building. There is the associated fact that we spend so much time in Wade's head, sharing his considerable angst, and watching him work his way through the intricate puzzles--by way of which we indirectly share OASIS creator James Halliday's own angst as well. All this is also bound up with two key traits of the book that make it problematic from the standpoint of content as well as form: its bleakness, and its gleeful, almost Baroque geekiness of certain very distinct flavors--the challenges Watts has to overcome being deeply based in '80s-era geek culture.

The thick detailing of all this gives the story its emotional charge, and none of it translates easily to the screen, especially within the framework of a fast-paced action film (even one drawn out to a 140-minute length). Much of the content is not the sort of thing a major studio picture would convey even if it could. Bleakness did not stop the Hunger Games series from being one of the most successful franchises of all time, but the mass audience's appetite for dystopia is finite, and was already waning well before shooting started on this movie.

It is worth noting, too, that the Hunger Games saga offered a more grounded vision, rather lighter on the science fiction trappings, arguably easing much of the audience into absorption in the events on screen. (Indeed, this seems to me sufficiently the case that I used it as a reference point in discussing alienation effects in Cyberpunk, Steampunk and Wizardry--or more precisely, discussing the minimization of such effects for the sake of dramatic interest.) The extent to which the book is utterly saturated with (often obscure) geek nostalgia, and how central the minutiae of that nostalgia are to the puzzles the hero spends so much of the book unraveling, poses an even greater obstacle. Halliday was an anorak par excellence, with the quest he laid out for his successor itself an extreme realization of the geek fantasy of others' coming to share one's obsessions rather than treating one as an awkward bore because of them (and along with it, the fantasy of one's "useless" knowledge changing and even saving the world), and it is hard to picture a commercial film treating anything like that faithfully. It is harder still to picture a major film having the hero play through one retro video game after another, reenact one old classic of geek cinema after another--dramatizing to any extent the portion where Wade Watts has to act out the role of David Lightman in a Wargames simulation, completing the task by remembering every beat and line of the movie.

Naturally it is implausible that it could have been a purist's film, and indeed it was not. It seems predictable enough that the quest was scaled down to fit conveniently into a two-hour movie, and the events of the narrative played out over days rather than months.

It is only slightly less predictable that the back story and world-building are also trimmed--in this case, to the bare minimum necessary for coherence, with the inevitable voice-over confined to an expository beginning and the conclusion. The film downplays the bleakness of the scenario considerably in the process. The ruin of the world is only briefly and vaguely alluded to. The combination of energy scarcity, environmental collapse and economic crash, the Mad Max-ish condition of much of the country, are not mentioned at all. Our first glimpse of the Stacks, so vividly horrible in their poverty and ugliness and harrowing danger in the book, instead presents them in images so bright and vibrant and colorful that they actually look like a fun place to visit, even if one would not want to live there--and despite putting more images of graffiti-covered urban decay on the screen than any comparable film of this century, the film practically shrugs it off. Likewise the villainy of Innovative Online Industries--its ambitions for OASIS, its reduction of human beings to serfs, the resentment and opposition it engenders--is treated only slightly. (Even the more personal aspects of the characters' anguish are downplayed, with the aunt Wade lives with made to seem less horrible, and the tensions of Halliday's home life when he was growing up in the '80s elided.)

The same goes for the geekiness. In the challenges Wade has to face the film emphasizes grandiose action-adventure set pieces over puzzle-solving. In what remains of the mystery the film privileges a Citizen Kane-like pursuit of "Rosebud" (the characters actually use the term more than once in the film)--which is, alas, Halliday's failed romance--over the pop cultural obsessions in which Halliday so immersed himself. All of this helps the film keep the pop cultural references easy or marginal to the mechanics of the quest, and usually both, while being less '80s geek-oriented. (We get, for instance, a Saturday Night Fever-themed dance scene at one point, quite a divergence from the matter of the original.)

Ultimately the sparing use of the information so abundant in the narrative--the strategic changes so critical to turning a cult book into a mass-appeal entertainment--took its toll not just on the book's sense of immersion in the virtual world of the OASIS and in the pop culture of a bygone time, but on that more conventional matter, characterization. Simply put, Wade, Halliday and the rest are less troubled and less interesting figures. Art3mis' motivation is reduced to simple revenge for the destruction of her father (about which we know only from a few lines), a less compelling motive than her original, grander aspirations--and less compelling, too, than the revenge narrative of the character who originally acted on such motives in the novel. For all the lush CGI the film does not quite do justice to the vastness and variety of the OASIS as conceived in the book, while the "real world" outside it seems insubstantial, along with the stakes in the competition--which, after the revision of the puzzles and the compression of the time frame, felt easier than it ought to have done.

The movie does improve as it goes on. At the climax the film becomes more faithful to the source material--not quite attaining the visually spectacular hyper-geekiness of the book's final battle, the movie version being scaled down by comparison, but still very recognizable, and about as satisfying as one might hope for from the Hollywood version. As one who enjoyed the book on its own terms I found this some compensation. Still, while Ready Player One is considerably better than average as blockbusters go these days, not only proving fast and slick and visually flashy but a bit more original and with a bit more sense of fun than most, it falls well short of matching the oddity and extravagance, the texture and tension, and the epic feel of the original print version of Wade's adventure.

Reading Ready Player One: A Few Thoughts

When Ernest Cline's Ready Player One first came out (2011) I was pretty much burned out on science fiction. (This was about the time when I put together the first edition of After the New Wave, thinking of it as an act of closure, which it really proved to be for a while.)

And even before I decided to take a break from it I was paying little attention to the young adult-oriented work that was getting so much press then, because my glances at it gave me the impression (accurate, I think) that it did not offer those things that most attracted me to science fiction--conceptual originality and audacity, elaborate world-building, political teeth. And as Cline's book got rather less media attention than, for example, Suzanne Collins' The Hunger Games saga (or even Veronica Roth's Divergent), it was easy for me to forget about it until the movie came out and got people talking about it again.

Yet, looking back it seems to me that Ready Player One is a quintessential piece of early twenty-first century fiction, because of the way it draws together so many of the prevailing tendencies of the period so far.

There is post-cyberpunk--by which I mean that what we have come to think of as cyberpunk tropes (extremes of wealth and poverty in a decaying society, super-corporations, exploding information technology) were presented without the ostentatiously avant garde narration and prose style once associated with the original, '80s-era cyberpunk fiction, making for a more straightforward and accessible (and for most of us, more appealing) narrative.

However, reflecting that it was 2011 and not 2003 or 1997, this particular post-cyberpunk work's bleakness is informed by the experience of the 2006-2008 energy crisis and the Great Recession, and by the element of apocalypse and dystopia so fashionable in those years; while, just as fashionably, the protagonist is a young adult.

And of course, there is the sense of nostalgia, the density of pop cultural reference, that had so come to suffuse popular culture in general and science fiction in particular by that point, with, in both cases, an emphasis on the things of childhood in the '80s, and especially the geekier things of childhood, like video games.

There is even a nostalgic element in the centrality of virtual reality in Cline's world that those with short memories may find difficult to appreciate today. In the '90s, after all, we were all promised that immersive virtual reality, and indeed immersive virtual reality as our primary manner of engagement with the Internet, was just around the corner--but then it wasn't, and when Cline wrote his book it still wasn't, instead looking like a flying car of the digital age, one of many. (Only years after, circa 2014, did VR seem to get its second wind, and even with the number of VR users in the U.S. expected to approach 40 million this year, it remains far from certain that the technology has truly arrived.)

Reading the book I found that even where the pop cultural invocations had only a faint charge for me, or none at all, Cline's enthusiasm came through, and it could be infectious. Cline's book, however, impressed me as more than just a reflection of the pop cultural and science fiction zeitgeist of the time. Some aspects of his future are fairly well thought-out, especially but not solely its VR technology. (Just what will happen to hotels in a world where no one travels, for example?) The book also impressed me with its sense of humor, containing as it did some laugh-out-loud funny bits. And after the mid-point I had to force myself to put it down when I needed to, something I haven't been able to say about a work of fiction in far too long.

Thursday, April 4, 2019

Toby Young, Again

Initially I was surprised to find that Toby Young was a real person--in the sense of the film How to Lose Friends and Alienate People having been (loosely) based on the true story he recounted in his memoir and its sequel. I happened to read it, and found Young an amusing, even mildly interesting figure given his particular set of quirky experiences on the fringes of wealth and celebrity, and his candid discussion of the hungers that drove him, rare enough that it seemed worth noting. (Who doesn't want a good time? Who doesn't know that a good time is mainly had by the rich in our profoundly unequal social order, and that for scrawny, nerdy-looking types like Young, getting rich and famous is pretty much their only shot at that? Being an intellectual--even to the slight degree to which he could claim that--doesn't mean one is without such desires.) His sociological eye was no match for his famous father's (indeed, he was surprisingly clueless on some points, not least just how extraordinarily privileged he has been), but that he had one at all seemed something.

Reading those works it was clear that Young had opted for a different path from his accomplished parents' extraordinary lives of public service and intellectual inquiry, favoring a crass, hedonistic individualism instead. At the end of How to Lose Friends, though, it seemed he had recognized that path's shallowness (as well as its futility for him) and moved on.

The years since have made it clear that he has not--and in a most unfortunate way. Instead of a life devoted to chasing wealth and glamour and sex, he has continued his rebellion against mom and dad by endlessly and often viciously championing the most backward, reactionary politics around--attacking wheelchair ramps, mocking the appearances of the working-class kids who defy the odds to go to Oxbridge, championing eugenics--and, like pampered idiots from the beginning of time who have no idea how good they have it, whining about how "You can't find good help these days," while being awfully smug about being "politically incorrect" in that way that has become so tiresomely standard for the pathetic yet utterly loathsome people who become alt-right trolls (and bullies in general).

Monday, March 25, 2019

"Debate Me!" Screamed the Troll

The Internet troll's demand that you "debate" them is a demand that you submit to trial at a moment of their choosing in a court where they are judge, jury and executioner as well as prosecutor; with only yourself for an advocate, no time or opportunity to prepare your case, and, on Twitter, a 140-character limit on each and every statement for the defense.

Some debate, that.

Remember, friends--no one is obliged to stop everything they're doing and defend their opinions to an anonymous stranger at any and all times. This is all the more the case if the person called on to defend their opinions is a private individual unaccountable to the public, and absolutely the case when the person making the request shows every sign of doing so in bad faith.

Saturday, March 23, 2019

The Laziness and Cynicism of "Subjectivity"

"It's subjective," people so often say.

Most of those who hear that word nod and grunt like Tim Taylor at Good Neighbor Wilson's utterance of Big Thinks from behind the fence.

But what does the word really mean?

Not what people think it does, usually.

Often what we're talking about is quite objective. (After all, those subjective feelings are usually in relation to some concrete thing, place, person, incident.) But we cannot explain what we want to convey in a satisfactory way.

Sometimes this is a matter of simply not having thought the matter through properly.

This may be because doing so simply does exceed our capabilities. In which case not so much can be done about it (at least, by the person so frustrated).

But often it is a matter of being too lazy to try and think the matter through.

Other times we say "It's subjective" because we have thought things out--and we know that we don't really have an argument, we're really being unfair and we know that we'll come off badly when we try to explain our untenable position, but damn it, we want what we want, and we're willing to play dirty to get it.

When something is really important, be very careful of people who try to fob you off with "It's subjective."


Review: Greenhouse Summer, by Norman Spinrad

The climate crisis has become the subject of rather a large body of science fiction over the years. Where that body of work is concerned it seems to me that Norman Spinrad's contributions, rarely mentioned, are unjustly overlooked; that he was ahead of the curve in addressing the matter when he did, and remains, even decades later, far more successful in doing so than many of the attempts seen since. The theme entered some of his prior works in a minor though not uninsightful way (like 1991's Russian Spring), and before the turn of the century he devoted a full novel to it, Greenhouse Summer (1999).

Reading the book I was less satisfied with it than with Russian Spring overall, its patches of brilliance marred by creakier portions (like its treatment of computer technology, and the thematically appropriate but dramatically flat conclusion to the central intrigue). However, by and large it gets the nuances right. Spinrad gets, for example, that the phenomenon will be experienced differently in different regions, generally as a disaster though not always so (we have the equatorial "Lands of the Lost," but also "Siberia the Golden"); that climate change has no clear end point, so that an already damaged world might have to face worse still in the absence of radical, remedial action, with much the same uncertainty and political wrangling we see now (his characters confronting the complete, life-eradicating catastrophe of "Condition Venus"); and that this will surely have consequences for society, the structure of which will, in turn, determine the possibilities for facing the problem (the syndicalists have inherited the Earth in his scenario, and it seems to him that therein lies such hope as exists in it). From a more literary standpoint, Greenhouse Summer impressed me in that, while Spinrad did not shrink from the horror suffered by the worst-hit places, he managed to be colorful, even satirically humorous (not least, in the airship tour of the Lands of the Lost conducted by a Colonel Qaddafi very, very displeased about having been cheated by tricky foreigners during the construction of the Great Man-Made River, or its Paris reinvented as New Orleans). Indeed, when I reviewed Gordon Van Gelder's original anthology Welcome to the Greenhouse some years ago, and more recently the "solarpunk" anthology Glass and Gardens, I found much to like, but was confirmed time and again in the sense that such virtues, so much on display in Spinrad's book, were all too rare in their stories, and in climate fiction generally.

Thursday, February 28, 2019

The Cowardice of Consumer-Bashing; or, Neoliberal Environmentalism

The environmental movement has taken numerous forms, and indeed, just about every conceivable form with regard to ideology, so much so that one must be careful in generalizing about it. But it seems safe to say that mainstream environmentalism, like mainstream everything else today, emerged in the neoliberal era, and accommodated itself to that era. Despite the unavoidable clash between the demands of environmentalism and the prerogatives of capitalism (not least, the impossibility of infinite economic expansion on the basis of a finite resource stock), it has been neurotic about appearing even mildly critical of the socioeconomic system, let alone engaging in radical critique and proposing large solutions to large problems.

Instead it has preferred to couch its criticism of society's thrust in terms of a vague "we"--as in "We failed to heed the warnings," or "We went on with our wasteful ways"--that blurs together all of humanity, drawing no distinction between the chief executive officers of ExxonMobil and BP, and starving children in the Sahel. This asserts the "sociological nonsense and political irresponsibility" that "we all possess equal powers to make history," making those who speak this way accessories, often quite knowing accessories, to the irresponsibility of those who actually hold the levers of power.1 Only the most obtuse, ill-informed or shamelessly dishonest can claim that the key political decisions regarding energy and climate, for example, were made by the public, or even represented it--when the institutions actually in charge went to such great lengths to lie to the public, to confuse and distract it, to combat even the principle of its having a say (what neoliberalism, after all, has been about in the end), and then, when that public voted for sane policies, overrode it. (In 2008 Americans voted for a President who promised an end to fossil fuel subsidies, cap-and-trade, a Green Jobs Corps. Instead they got the "all-of-the-above" energy policy that put into practice his opponent's running mate's cry of "Drill, baby, drill!")

Indeed, where it has been more targeted in its criticisms, environmentalism has emphasized the subjective and individual, talking up the carbon footprint of seven billion individuals while looking away from the fact that over seventy percent of greenhouse gas emissions are generated by the operations of just one hundred corporations. Attending to those corporations not only takes us far closer to the root of the problem, given their practical control over what gets made and how; their size, organization and resources also make them better able than anyone else to change all that, while the small number of organizations involved offers an excellent focus for thought and action about solutions. It is, mainstream environmentalists make very clear, incumbent on the individual consumer to sacrifice by choosing carefully and paying more--rather than incumbent on the manufacturer to provide the greenest products available at any given price (the thought of regulation to that end being anathema)--a stance that ignores the realities of power and equity. The consumer has already lost a major option when, the political system having failed them by refusing to provide adequate public transport, they are forced to buy a car and drive many, many miles in it just to have a job. When buying that car they are restricted to what they can afford from among the models that an oligopolistic car industry is prepared to market--something it has done in line with its preference for selling not just old-fashioned gas-burners, but more vehicle per customer. Yet it is the consumer that mainstream environmentalism lambastes.

In it all one can see a retreat of environmentalism into an austere, misanthropic religiosity which thunders against the consumer, "You have had it too good for too long!" (especially naked where it seems to positively gloat over the idea of civilization's crashing down, with a Great Die-Off taking most of humanity and throwing the rest back to the Dark Ages). The attack on the consumer can also appear a sort of compensation for environmentalists' failure to influence the genuinely powerful--or, shabbier still, a taking out of their frustrations on those in no position to resist. It bespeaks real failure. Hopefully the movement will, rather than merely bespeaking that failure, admit it, abandon what has not worked, and think about what might work--showing intellectual and moral courage rather than cowardice, in a readiness to admit that realizing its goals may make the ultra-rich unhappy, and a preparedness to think big. I, for one, am convinced that at this late stage nothing less than a 100 percent-renewable-energy-and-large-scale-geoengineering moonshot can save us from catastrophe, and the sooner we see the obvious taken for granted, and properly acted upon, the better.

1. The words are from C. Wright Mills' The Power Elite.

Why I Am Sick of Hearing About Cowspiracies

It seems that today meat-eaters can hardly go a day without being subjected to a moral harangue about how their dietary preferences, and nothing else, are dooming the planet.

Anthropogenic climate change is an indisputable fact; so is the rapidity with which it has been unfolding; and the same goes for the necessity of serious action on it. Additionally, no reasonable person denies that meat production is a less efficient use of our resources than other forms of food production, or that it contributes to the accumulation of greenhouse gases. Yet the actual scale of meat production's contribution to the problem, and the range of options for redressing that contribution, are wide open to question. Where the oft-cited documentary Cowspiracy claims that meat production alone is responsible for over half of greenhouse gas emissions, agriculture altogether (of which meat-raising is but a component) is rated by the Environmental Protection Agency as responsible for under a tenth of the total.

Someone is clearly wrong here. But surprisingly few people seem to be taking the trouble to work out who it is, critical assessments that might establish the validity or invalidity of the Cowspiracy analysis being strangely lacking--as a Google search, and still more a Google Scholar search, will demonstrate. It appears there are only a handful of pieces from comparatively marginal sources out there, underlining the fact that outright climate change deniers have infinitely more media access than those who question the allegations of Cowspiracy.

The result is that the intellectual basis for the assertions of Cowspiracy seems far from firm--even as those haranguing the meat-eaters automatically go to the most extreme solution. Virtually no one claims that we must respond to the problem raised by the greenhouse gas emissions transportation generates by abolishing the movement of people and things. Few even suggest doing very much to rationalize our use of transportation, most promoting alternative technologies that would permit people to more or less go on living as they do (like the electrification of vehicle fleets). However, abolition is exactly what those placing the stress on meat production, without preamble, insist upon--apparently uninterested in the reality that not all meat production is equally problematic (a substantial difference existing between, for example, the environmental impact of chicken and beef), and that not all methods of meat production are equally problematic either (there being some evidence that grass-fed beef may actually offset other emissions). Indeed, where one might expect that those most insistent on the destructive effects of meat production would champion investment in, for example, the cellular agriculture methods beginning to show real promise, just as they champion renewable energy, they do not speak of it at all.

It is a remarkably categorical attitude, which can easily give the skeptic the impression that vegans have simply found outsized claims about meat production's contribution to climate change a useful addition to their list of arguments. That, perhaps, the fossil fuel industry that has fought so long, hard, dishonestly and successfully against curbs on its activity, or even the reduction of the colossal subsidies it enjoys ($5 trillion a year, if one counts externalities!), finds it convenient to have the heat turned on someone else . . .

And of course, that all this is simply another case of, in typically bourgeois fashion, trivializing every aspect of economic life into a "lifestyle choice," and what is more, casting aside any regard for equity, as mainstream environmentalism has too often and too long done. This sort of argument tells the well-off that it is not their mansions and Hummers and jet travel that are the problem (implying that they can go on indulging in all this without guilt), but the beleaguered prole who at the end of the day finds a burger more tempting than the plate of beans they want to hand him (not incidentally, in a country where it is the "elites" who turn up their noses at red meat-eating as gauche)--lower class types whom they regard as so backward and reactionary as to justify any callousness or contempt.1

Rather than turning the fight against the real issue of climate change (and contemporary agriculture's genuine contribution to it) into an attack on the dietary choices of have-nots, one ought to acknowledge that our food production (of which meat production is a big part, but even there, still not the totality) is one of many dimensions to the problem, while as in other areas of life, the emphasis should be on meeting people's needs (I must admit that I am not convinced that a vegan diet really is best for each and all through the entirety of their life spans) in sustainable ways. Even if one accepts the extravagant claims about meat production's contribution to climate change (and as yet, there seems to me much room for doubt), the imposition of veganism on seven billion people in response is a last resort, not Plan A.

1. For a discussion of this sensibility, see Owen Jones' interesting book, Chavs: The Demonization of the Working Class.

Review: Chavs: The Demonization of the Working Class, by Owen Jones

London: Verso, 2011, pp. 352.

As the rather charged and frankly offensive title of Owen Jones' book, Chavs, implies, his concern is with the recent social history of the British working class, and the ways in which that class has been depicted and understood by British society at large. In particular Jones concerns himself with the alleged disappearance of the "respectable" working class--the image of working class people as gainfully employed, law-abiding, functional contributors to the society in which they lived--and its replacement by the image of working class Britons as "chavs"--vulgar, bigoted, anti-social louts, apt to be drunk, violent, criminal.

In considering this transformation Jones, to his credit, gives due weight to quite concrete facts of social history, with the catastrophe of Thatcherism playing its part in the story--for the neoliberal turn she pioneered, and each and every one of her successors continued to promulgate (not least the "New Labour" she regarded as her greatest achievement), did in a manner of speaking wipe out the old working class. The collapse of the British industrial base amid the worldwide Volcker recession and homegrown monetarism, and the privatization of the housing stock, and all that accompanied and followed them, eliminated the enterprises and institutions that provided the working class with both steady, remunerative work and the benefits of a community--the factories and coal mines, the labor unions that organized those who worked in them, the council estates. What remained in their place--casual, temp, ill-paid and atomized work in the "service" economy, and at the same time fantasies of soaring to the top in a world where celebrities and the like are richer and more exalted than they ever were before (for instance, becoming a soccer player like superstar "chav" David Beckham)--was no substitute in either monetary or social terms. All of this did undeniable, considerable damage to that class's members individually and collectively, as many were turned from, one might say, proletarian to lumpenproletarian.

However, as Jones declares in his extensive overview of the treatment of this stratum by the typically reactionary British media, the image of these people as chavs was pure neoliberal propaganda, the declaration that "We are all middle class now" promoting the illusion that the worthier element of the working class had "moved on up," leaving behind only the incorrigibles who had only themselves to blame for their misfortunes. Underscoring this was the lavish, sensationalized attention paid to the evidences of the dysfunction of the non-respectable working class that remained. Much of this was superficial--complaints about their dress. ("What ever happened to working class men wearing suits and ties?" right-wing social critics whined.) Some of it was more serious. (The alleged propensity of the working class to neglect its children is an obvious example.)1

As always, it is comforting for the privileged to think that the comfort they enjoy is wholly and unimpeachably earned and deserved by them and not enjoyed at the expense of anyone else; that those who are not so privileged are, when one cannot shrug off their deprivation so easily, the authors of their own miseries. It is more comforting still to think of them as not contributing at all, but as parasites, scrounging off the taxes of "respectable people," dragging down the economy, while debasing the quality of urban life with their anti-social behavior--for one can still less make claims on behalf of a parasite than on behalf of people who merely made a muck of their own lives.

In line with all this it is comforting for them to make much of working class bigotry. Comforting to say that the more affluent--enlightened--segments of society are "post-racial," and that the only reason society as a whole may not be so is the lower orders. Their bigotry serves as convenient, additional proof of the crudity and backwardness that condemn them to their place at the bottom of the heap, compounding their obvious and severe dysfunction--while being yet another excuse for deflecting any claim made on their behalf. That they think they should be living better--why, that is mere "entitlement," and a racist entitlement at that. Besides, when they so clearly have no empathy for others, why should anyone care about them?

Reading it all I found Jones' case to be as robust and lucid as it was depressing. And as tends to be the case with worthwhile books, it left me thinking a good deal about its implications--not least, how a shallow, sanctimonious and commonly hypocritical parody of anti-racism becomes an excuse for a vicious classism that, ironically, has worsened the problem of racism. Because when right-wing populists (read: fascists) court the votes of the working people abused and exploited, snubbed and insulted, by the mainstream parties of the liberal/left as well as the right, the centrists can say: no, the populists' appeal has nothing to do with our abandonment of working class people and their interests in favor of catering to the super-rich, and everything to do with those people's racism. So that it is only right and fair that we continue along the path of woke neoliberalism--which only worsens the problem in a vicious cycle. In that, it strikes me, Jones' book offers insight into much more than just Britain's working class at the time, but into the trends in British life generally since then (e.g. Brexit), and the destructive ascendancy of right-wing populism across the Western world and beyond.

1. Altogether, reading the criticisms of this stratum, I got the impression that those who traffic in the "chav" stereotype simply took an exhaustive list of racist stereotypes about inner city African-Americans in the United States, scratched out whatever epithet the list-maker used for African-Americans, and wrote in "English working class" instead.

Thursday, January 10, 2019

Entering a Post-Scarcity Age in Fiction?

Back in 1931 Astounding Science Fiction paid its writers two cents a word.

This does not sound like very much. But one has to take inflation into account. According to the United States' Bureau of Labor Statistics' Consumer Price Index, two cents in 1931 was equivalent to thirty-two cents in 2018. To put all this more fully into perspective one might think in terms not of official inflation figures, but of the economic life of the time. Just before the onset of the Great Depression (1929), America's per capita Gross Domestic Product was around $850. Today it is $57,500--almost seventy times as high. Consequently, from the standpoint of people's actual incomes, two cents circa 1930 is more like a dollar and forty cents per word. The Science Fiction and Fantasy Writers of America has just recently raised the minimum per-word compensation for a professional magazine from five to six cents. According to the guidelines on its web site the rebranded Astounding, Analog, pays six to ten cents per word for fiction--a relatively generous rate, and yet just a small fraction of what it once afforded its authors.

Of course, one may think that magazines publishing short-form fiction are unrepresentative, given how their place has receded in contemporary life. By and large, when they look for something to read, people today tend to reach for novels. But the rate does not seem much better with those novels. Today a novel is expected to be at least a hundred thousand words long, and the advances that the major publishers offer for them tend to be in the five to ten thousand dollar range, which works out to . . . five to ten cents per word yet again.
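As a rough check, here is a minimal sketch, in Python, of the arithmetic above. It uses only the figures quoted in this post (the CPI multiple is inferred from the two-cents-to-thirty-two-cents conversion); the numbers, and the output, are illustrative rather than authoritative.

```python
# A rough check on the per-word arithmetic above, using only figures
# quoted in the post. Treat the output as illustrative, not authoritative.

PULP_RATE_1931 = 0.02  # Astounding's 1931 rate, in dollars per word

# CPI adjustment: the post's two-cents-to-thirty-two-cents conversion
# implies a CPI multiple of roughly 16x between 1931 and 2018.
CPI_MULTIPLE = 16
print(f"CPI-adjusted 1931 rate: ${PULP_RATE_1931 * CPI_MULTIPLE:.2f}/word")  # ~$0.32

# Income-share adjustment: per capita GDP of ~$850 (1929) vs. ~$57,500 (2018).
gdp_multiple = 57_500 / 850  # ~67.6x, the post's "almost seventy times"
income_adjusted = PULP_RATE_1931 * gdp_multiple
print(f"Income-adjusted 1931 rate: ${income_adjusted:.2f}/word")  # ~$1.35

# Today's novels: $5,000-$10,000 advances on a 100,000-word manuscript.
WORDS_PER_NOVEL = 100_000
for advance in (5_000, 10_000):
    cents = 100 * advance / WORDS_PER_NOVEL
    print(f"${advance:,} advance -> {cents:.0f} cents/word")

# How far the income-adjusted pulp rate exceeds today's 5-10 cent range:
print(f"Decline: {income_adjusted / 0.10:.0f}x to {income_adjusted / 0.05:.0f}x")
```

Run as written, this reproduces the post's figures: roughly $1.35 a word then against five to ten cents now, a fourteen- to twenty-seven-fold gap consistent with the "tenth or even less" estimate below.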

In short, writers may be making a tenth or even less of what they used to in per word terms according to this one yardstick, which falls far short of taking everything into account. If, for example, one argues that publishers now have a higher technical standard than they did in the days of the pulps, necessitating a lengthier, more difficult training and time-consuming craftsmanship; if one argues that writers are now expected to bear a heavier burden of such activities as publicity in addition to their actual writing in return for their pay; then just looking at pay rates understates the drop in their earnings.

The financial return on effort aside, consider how publishers behave toward those who wish to submit work to them--doing just about everything in their power to keep them at bay. The major publisher open to the unsolicited, unagented submissions that are the only recourse of all but the very privileged few with "platforms" and industry "connections" (those few favored rather than disfavored by celebrity and nepotism) is a rarity today.

Some literary agents have slush piles, of course. However, when speaking of them they discuss the slush pile not as a normal, routine procedure by which they come by an appreciable portion of the wares with which they supply the market, but as something they bother with mainly on the off chance that one of the ten thousand cold queries they get year in, year out will somehow, in a way not reducible to rational, objective, verbalizable criteria, commend itself to their business judgment as likely to be profitable for their agency. The marginality of this is further reflected in their tendency to hand the job of looking out for those special finds in the slush pile not to the practiced eyes of veteran agents, who have more "important" things to do than keep up such a lookout, but to unpaid interns--who will years later write articles in places like Salon and the Guardian where they take the frustrations of their time on the publishing industry's lowest rung out on those who have no rung at all, gleefully trolling the unwashed plebs who committed the crime of sending in the unasked-for queries and manuscripts they were obliged to read.

This means that just as the pay has plummeted, so has it become more difficult to find anyone willing to offer even these sharply reduced rates. Meanwhile, just as publishers and agents make themselves ever less approachable to the aspiring writer (indeed, many an aspiring writer would say, treat any approach with extreme hostility to the point of abusiveness), the TV viewer is barraged with commercials for self-publishing services hoping to get not the writers' work, but their money. Someone who has self-published a book might even get cold calls from such companies at home, offering them editing, design, publicity and other such services. (By contrast, could you imagine one of the Big Five publishers--a Random House, a HarperCollins--buying TV time to put out a call for submissions? Or phoning the author of a self-published book that sold in the dozens to ask them to send them their book?)

All this bespeaks an extraordinary collapse of the market for fiction.

One might conclude from it that the falling price of words reflects burgeoning supply, sharply declining demand, or indeed a combination of the two.

It is not at all hard to come up with a list of the reasons. A major one is what one might call the "vertical integration" of publishing. Just as steel mill owners might buy up iron and coal mines to bring the sources of the inputs they want for their operations under their control, so have publishers substantially gone from relying on what writers happen to send them to ensuring that they are supplied with the sorts of thing they desire most. Thus we get an ever-rising volume of books from a handful of already bestselling authors, through their "working with co-authors," or turning their book-writing into a family business where their close relatives churn out work under their name (think James Patterson, think Stephen Ambrose); of books "by" reality TV "stars" and the like (the Kardashians are published novelists); of tie-ins and sequels and prequels and series that go on and on and on, even when the name on the cover actually is the name of the person who wrote the thing.

Meanwhile the publicity machines make it ever more likely that those who have become bestsellers stay bestsellers, ever more prolific bestsellers taking bigger bites out of the market for longer periods as the lists get to look static from year to year, decade to decade. (Even death hasn't stopped Robert Ludlum's and Tom Clancy's names from appearing on the covers of new bestsellers every year decades after their passing, while a half century after his own passing the adventures of Ian Fleming's James Bond continue ever more creakily and pointlessly.) Naturally the premium on what any writer coming to this machinery as an outsider has to offer is far, far lower--not merely in terms of compensation, but in their chances of finding any place at all for their own, original ideas in a traditional publisher's schedule.

However, even that is far from being the whole story now. What is also the case today, and may prove most important over the long run, is that there has never been so much content out there, so easily available. Never mind the favored Establishment answer that anything and everything that may be wrong with publishing today and undermining writers' earnings is due solely to the refusal of some to fully respect the claims of intellectual property-holders. The simple reality is that e-readers mean that anything without a copyright on it can be had for free, entirely legally--which includes virtually every classic, and non-classic, from the beginning of history down to the early twentieth century.

Then there is the vast quantity of fiction produced and distributed without any expectation of commercial gain. Fan fiction may be questionable from the standpoint of copyright, but nonetheless it does conform to this pattern, and here, at least, statistics are conveniently available--the numbers for single categories of the stuff staggering. Over at FanFiction.net there are eight hundred thousand stories set within the Harry Potter universe alone, more than the most ardent fan could read in many lifetimes. And given that once it is put up very little is necessary to keep it up, this body of material is growing all the time.

There is, too, the response of aspiring commercial authors to Big Publishing's scorn of them, and their inability to compete with its colossal publicity machinery. Many sell their novels at ninety-nine cents per copy, and give away much of their material totally for free. Few of them find the audiences for which they hope, let alone a chance to make a real living as an author (even when the work is up for free, writers on sites like Wattpad are expected to spend months persuading the site's readers to simply view it), but the practice is unlikely to stop anytime soon, so that this body of material, too, is growing all the time.

Altogether this means that no one merely looking for something to read will ever have to pay for it out of pocket again--and that even those willing to pay have far, far more options at prices nowhere near what traditional publishers demand. Meanwhile, there have never been more alternatives to reading as a use of leisure time than now, with those tablets and laptops and phones making the full range of entertainment choices almost entirely portable. People who would once have had to make do with a book on the bus, the train, the plane can now enjoy a movie or a TV show or a video game or social media instead--and are perhaps more inclined to do so, as attested by the mountain of statistics testifying to less reading, and even less literacy.

Of course, for now it is still the case that if George R.R. Martin's The Winds of Winter ever actually does come out, people will rush out to pay full price for their copies. However, for all Martin's considerable virtues as a writer we should not forget that he would not be such a major mainstream success were it not for a blood-soaked, sex-and-nudity-filled, endlessly controversial nighttime soap opera adaptation of his series--many of the copies he sold creditable to the success of the TV show rather than purely literary success. It is also the case that a writer who stands today where Martin stood four decades ago, when he was just starting out, is apt to find that they cannot even give their book away.

It does not seem too much to say that we are looking at the beginnings of a post-scarcity age in fiction, and perhaps a post-scarcity era for the written word more generally; and that with each year we are entering into it more deeply. This is not the sort of post-scarcity a sane person would have preferred. (How about clean, green energy giving us electricity too cheap to meter instead?) Alas, we are living with it all the same, and if those who tell us that we are looking at an age of more general "abundance" are right, then how we deal with this surprising consequence of technological advance may well be an early indicator of how we cope with other, bigger problems as well. So far the record has not been inspiring. But then, however painful the situation has become for those who wish to be authors (and however dangerous a situation that serves them so poorly may be for culture in general), this is all still new, and for the time being scarcely recognized. But perhaps the fact that we are still only at the beginning of this strange new period is in itself grounds for hope that we will be able to combine the indisputable benefits of the new technologies with conditions that permit writers to write and yet live.

Perhaps someone should write a story about that.

Sunday, December 23, 2018

Review: Bullshit Jobs: A Theory, by David Graeber

New York: Simon & Schuster, 2018, pp. 368.

Back in 2013 David Graeber penned an article for Strike! Magazine regarding what seemed to him the explosion of pointless, economically irrational employment, which he colorfully termed "Bullshit Jobs."

In an extraordinarily rare case of an item with actual intellectual content going viral, it became something of an international sensation, leading to a broader research project and a book-length treatment of the issue five years later. The book, like his earlier Debt, presents a large, intricate and conventional wisdom-smashing argument, but one that keeps the densely written and documented macrohistory to a minimum while in more sprightly fashion making a case rooted in anecdotal stories of present-day individuals. In particular he relies on his collection of employees' self-reports about their work being "bullshit," defining the bullshit job as
a form of paid employment that is so completely pointless, unnecessary, or pernicious that even the employee cannot justify its existence even though, as part of the conditions of employment, the employee feels obliged to pretend that this is not the case (9-10).
In line with this emphasis on the subjective, Graeber is also more concerned with the "moral and spiritual damage" done by the situation, the reasons why this is happening, and what might be done about it.

As one might guess, bullshit jobs consist substantially of administrative and bureaucratic functions of questionable value, especially at the middle management level. What may be more surprising to adherents of the "conventional wisdom" is that these are every bit as present in the private sector as in the public (in case it needed to be done, Graeber once more debunks the rightist/libertarian stereotype of bureaucracy as unique to government). Also surprising to those of the conventional turn of mind is the extent to which this has been bound up with the growth of those sectors that cheerleaders for neoliberalism endlessly exalt, and which have tended to shape other sectors in their turn--namely finance and information technology, whose ways have penetrated such non-profit institutions as the university. Thus everywhere one looks one sees an abundance of "flunkies" (whose job is to make the boss "look important," as a receptionist often does); "goons" (whose essential task is to enable their employers' exploitation of the public, and who often exist only because their competitors have them, as is often the case with telemarketers, lobbyists, corporate lawyers, public relations specialists); "duct tapers" (whose job consists of patching over quite fixable problems the bosses are too lazy, incompetent or short term-minded to resolve properly, as is often the case in computer programming); "box tickers" (whose task is to "allow an organization to claim it is doing something it is in fact not doing," like going through the motions on due diligence); and "taskmasters" (who invent additional unnecessary work for all of the unfortunate underlings named above).

Graeber also notes the ways in which "real" jobs may be infiltrated by bullshit, or even prove an outgrowth of it. As examples Graeber presents members of the teaching profession, who perform an essential function but find their time increasingly taken up by administrative tasks, and researchers and their institutions, which devote increasing amounts of time to chasing grants (ironically squandering billions that could be used to fund a vast amount of research).1 Janitors likewise remain essential, but janitorial work may be classifiable as a bullshit task when the workplace being cleaned serves only a bullshit purpose--such as a PR firm. Given polling data indicating that as many as 40 percent of employees feel their jobs fall into the category he describes, and given the other jobs which increasingly consist of bullshit activities or service such activities in some respect or other, Graeber estimates that 50 percent of all work done today qualifies as bullshit "in the broadest sense of the term."

Graeber contends that this situation, in which half the country and the developed world give their waking hours to bullshit jobs, entails enormous "spiritual" violence. The reason is that even if human beings are not naturally suited to the routine of intense and continuous toil from nine to five (or longer) demanded by their employers, they do have a significant need for purposeful (and even altruistic) activity. The make-work, the parody of work, that is a bullshit job denies them that, and indeed harms them by inflicting on them the opposite, the anguish of pointless activity, while subjecting them to many of the worst features of employment--like the loss of personal freedom to the will of a boss who is often arbitrary and abusive, and more generally the alienations of which Karl Marx and C. Wright Mills wrote. Where the job pays relatively well, there is guilt and shame at the remuneration being unearned; where the job actually makes the world a worse place, guilt and shame at having helped make it so. (Indeed, many of those whose testimonials Graeber recounts quit well-paying, prestigious "bullshit" jobs to take less remunerative but more meaningful work.)

The absurdity, and tragedy, of the situation appear all the greater for the contrast with, and ramifications for, those doing productive, non-bullshit jobs, whose conditions deteriorate and whose compensation declines (turning theirs into "shit" jobs--work making a genuine contribution, but unpleasant to do). Graeber explains this partly in terms of know-nothing, corporatized administrators subjecting those workers to speed-ups, pay cuts and the like (there being actual productivity to squeeze further here), and partly in terms of right-wing populists stoking public hostility to such groups as teachers, nurses, and even auto workers. Indeed, Graeber goes so far as to contend that envy of those who are genuinely productive is part of the political game being played. ("You get to do something meaningful with your life, and you expect a middle class standard of living too? How dare you!")

How could this perverse situation have been permitted to come about in a system that may not be respectful of workers, but at least prides itself on efficiency? For Graeber an important part of the story is that mass technological unemployment, rather than being a fear for the future, is something that has already happened to us, leaving us with a "real" unemployment rate of 50 to 60 percent. The gap, however, was "stopp[ed] . . . by adding dummy jobs that are effectively made up." Indeed, Graeber argues (as he did in his essay "Flying Cars and the Declining Rate of Profit") that post-1960s neoliberalism is not a project for maximizing economic efficiency and growth as it claims to be (objects it has signally failed to deliver), but for preserving the existing order--capitalist property and labor relations (an object at which it has been rather more successful).

Graeber admits that this Orwellian notion sounds like "conspiracy theory," but regards the situation as an emergent phenomenon, resulting from the failings of the system and from our misconceptions about what work is and what it ought to be. As he notes, the tendency toward large numbers of flunkies and duct tapers and box tickers, with taskmasters ordering them all about, has been a result of the way financialization (and IT) led to a more ruthless, more conflictual, more alienating workplace overseen by managers increasingly divorced from and ignorant of the actual productive task, and in greater need of surveilling their workforce. (Indeed, it was this unfamiliarity of managers with actual work that led to the blooming of the "consulting" industry.)

However, Graeber devotes more of his attention to the deeply distorted perceptions of work we carry around in our heads. The most basic would seem to be the identification of work with the "production" of material goods--when, he argues, most work has not been about production but about "caring" for people (child-rearing, housework, education, medical services, etc.) and things (maintenance and repair), and this has been overlooked. (As the list demonstrates, caring work has largely been women's work, and so less valued--and indeed, Graeber credits leftist feminists with developing the concept.)

Hardly less important is the tendency, already emergent in medieval northern European culture and reinforced by the "Puritan" ethos (and capitalism, of course), to equate work with "alienated," mind-and-body-destroying labor done for something other than the work's intrinsic value to the doer, under the supervision of others, all day, every day--and indeed, with the misery normally attendant on the situation.2 All other activity is "idleness." Work is deemed virtuous, idleness sinful, so that being miserable in one's work marks one as "deserving," while someone who is not working, or not made miserable by their work, is "undeserving" even of the means to live.

Consequently, as production became more automated, caring work was not acknowledged as an alternative locus for labor; and there was a timidity about, or hostility to, changes in the social system that would reduce workloads (a shorter work week) or detach income from work, productive, caring or otherwise (through income floors, for example). Public well-being was instead identified with the maintenance of high levels of employment, particularly "full-time" employment, to the extent that this could be invoked as an excuse for wasteful or destructive social policies--Graeber citing the way in which the preservation of millions of jobs directly connected with the private health care bureaucracy has been given as a justification for not shifting to a more efficient alternative, not least by President Barack Obama himself.3 (Indeed, social safety nets themselves drove the growth of bullshit employment, through a massive bureaucracy devoted substantially to making the lives of qualified applicants seeking benefits more difficult through means testing and refusals.)

As a result, despite his aversion to the demand that social critics serve up policy solutions as the price of opening their mouths (which he observes is a way of distorting or suppressing dissent), Graeber argues for a Universal Basic Income as a solution to the problem, on several grounds. Such an income, he contends, would affirm the right of human beings to live, and, by delinking income from labor, aid our moving past the perverse understanding of work described above. It would also eliminate the problem of means-testing and the associated welfare state bureaucracy. (The problem of how those made redundant by the change are to get on is, of course, resolved by their getting such an income too.) It would eliminate, too, the incentive to do bullshit jobs--a quick way of forcing employers to do without positions that are often unnecessary and wasteful even for their own firms--and more generally enable employees to walk away from a raw deal, likely improving the pay and conditions of work for all. This would be all the more significant as he expects that the great majority of people would still work in one way or another, at tasks that may well prove more beneficial not just to themselves but to society and the world as a whole (precisely because so much of the work being done was bullshit, and so much potential creativity and ingenuity wasted in the process). Graeber also contends that this could contribute to resolving other major problems, noting that past experiments with such income policies (in India, for example) also reduced the social distinctions, prejudices and ills to which the "rat race" and all the rest contribute so much, while pointing out that perhaps the single best thing we could do for the environment is scale back our wasteful working.

In discussing what Graeber has to say it seems only fair to admit that I have long been convinced of the truth of what many will regard as his most radical claims--the immense wastefulness of contemporary capitalism, and the reality that technological unemployment has already happened (a conviction I came to after picking up Jeremy Rifkin's The End of Work back when it was still current). Perhaps because of that I had reservations about just how closely he stuck to his emphasis on subjective perceptions, to the exclusion of the other examinations of the issue proffered by heterodox economists under the heading of "waste" for over a century (generally in more objective, materialist ways)--matters like built-in obsolescence, or marketing efforts like advertising and "product differentiation." In fact it seemed to me that engaging with that tradition could have strengthened his argument considerably, and perhaps helped him avoid certain exaggerations that came from his relying on a narrower basis for approaching the issue than he might have used--like going too far in downplaying productive labor in his (justified) correction of the tendency to overlook caring labor, or trying too hard to explain the widespread hostility to particular occupational groups in terms of his theory when others have at least as much to offer. (While he seems to me entirely right to say that the public begrudges artists a living because they seem to be avoiding their share of the common misery, it is also the case that artists are not generally respected as "useful" people; while Galbraith's concept of "convenient social virtue" seems to me to account for much of the disrespect constantly shown members of the teaching and nursing professions.4)

Still, these are comparative quibbles with a case that is as compelling as it is unconventional. Materialist, hard data-and-statistics grubber that I am, I consistently found his argumentation persuasive regarding not just the existence of the problem but its discoverability through his particular approach, with the same going for his treatment of its causes and implications. At the same time my interest in those other approaches was, on the whole, not a matter of skepticism but of my finding his argument so compelling that I could not but think of the possibilities for a synthesis of Graeber's argument with that longstanding tradition of writing on economic waste. (What, for instance, would it take to produce a pie chart showing the share of bullshit in Gross World Product? How about time series of GDP and GWP presenting figures adjusted and unadjusted for the bullshit component side by side?) It strikes me, too, that even if Graeber is less concerned with the macroeconomics of the solutions he recommends than he might have been, his case for a guaranteed minimum income is one of the more robust I have seen in recent years. Indeed, in a market full of books which are really just extensions of banal magazine articles (especially among those produced by major commercial publishers for a general audience), this book, like Graeber's earlier Debt, is a rare gem.
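As a toy illustration of that last thought experiment--and emphatically a sketch of my own, not anything drawn from Graeber's book--a few lines of Python would suffice to print a hypothetical GDP series adjusted and unadjusted for an assumed bullshit share side by side (both the figures and the flat 50 percent share are invented placeholders):

# Toy sketch: a hypothetical GDP series (trillions of dollars) printed
# alongside a "bullshit-adjusted" counterpart. All figures are invented
# for illustration, and the flat 50 percent share is an assumption, not
# a calculation from Graeber's data.
gdp_series = {2000: 10.3, 2005: 13.0, 2010: 15.0, 2015: 18.2}
BULLSHIT_SHARE = 0.5  # the "broadest sense" estimate, applied crudely

print(f"{'Year':<6}{'GDP':>8}{'Adjusted':>10}")
for year, gdp in sorted(gdp_series.items()):
    adjusted = gdp * (1 - BULLSHIT_SHARE)  # strip out the bullshit component
    print(f"{year:<6}{gdp:>8.1f}{adjusted:>10.1f}")

Of course, the real work would lie in estimating the bullshit share sector by sector and year by year rather than applying one flat figure, which is exactly where a synthesis with the older literature on waste would earn its keep.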

1. See Richard Harris' Rigor Mortis (reviewed here) for an in-depth discussion of this in the medical sector.
2. Graeber cites Alfred Marshall's definition of labor in Principles of Economics as work done "with a view to some good other than the pleasure derived from the work."
3. I was dismayed to see James K. Galbraith make a similar case in his otherwise excellent The Predator State.
4. As he put it in Economics and the Public Purpose, this "ascribes merit to any pattern of behaviour, however uncomfortable or unnatural for the individual involved, that serves the comfort or well-being of, or is otherwise advantageous for, the more powerful members of the community. The moral commendation of the community for convenient and therefore virtuous behaviour then serves as a substitute for pecuniary compensation. Inconvenient behaviour becomes deviant behaviour and is subject to the righteous disapproval or sanction of the community." It seems noteworthy that Graeber has over the years cited the works of the elder Galbraith a number of times, but also that while four of Galbraith's mid-century classics are listed in his bibliography, including American Capitalism, The Affluent Society and The New Industrial State, Economics and the Public Purpose (the work that capped that line of research) is not.

"Are the Arts Just For the Rich?"

Watching Patrick Stewart in William Shatner's documentary The Captains' Close-Up talk about his childhood, I was surprised by his working-class Yorkshire upbringing. And afterward I couldn't help thinking that where the current generation of British actors is concerned, none seem to have a background remotely like that. Look them up, and you are far more likely to find that they are a graduate of Eton or an alumnus of the Dragon School.

Naturally it was with interest that I read a series of pieces in The Guardian on the phenomenon--some dealing with actors specifically, others with the arts more generally. The pieces were far from as incisive as they could have been--not least in their use of the term "middle class." A famously fuzzy term, it can be used to deliberately cloud the issue--frankly, to downplay the extent to which a person from an extremely privileged background actually comes from an extremely privileged background--and often is so used, not least by The Guardian when discussing artists. Consider this extract from its typical puff-piece profile of Killing Eve creator Phoebe Waller-Bridge:
Waller-Bridge grew up in an upper-middle-class family with baronets on both sides. Her father co-founded Tradepoint, the first fully electronic stock market . . . and her mother works for the Ironmongers' Company in the City.
On what planet is someone with "baronets on both sides" of the family, someone whose father cofounded a company with its own Wikipedia page (Tradepoint, later SWX Europe), merely "upper-middle-class"?

Even in a culture so mendaciously pretending that "we're all middle class now," this is too, too much.

All the same, if the term is used in a way so loose as to be obfuscating (at best), the pieces do offer a robust round-up of a sizable body of anecdotal evidence (Helen Mirren, Judi Dench and other luminaries observing that their chances would have been very different today), and of a growing body of solid statistical study, confirming what David Graeber (writing of America rather than Britain, though there is no reason to think it less applicable) remarked when looking at the social and class realities of intellectual and creative life: that "if your aim is to pursue any sort of value [but money]--whether that be truth . . . beauty . . . justice . . . charity, and so forth . . . [and] be paid a living wage for it"--those "jobs where one can live well, and still feel one is serving some higher purpose"--only those with "a certain degree of family wealth, social networks, and cultural capital" have much of a shot, while for the rest "there's simply no way in" (253).

In spite of which they are typically told by the "successful," and sometimes simply by "everyone," that their failures are entirely their own fault--that they simply were not "good enough."

Which bullshit--the philosopher Harry Frankfurt's term certainly seems applicable here--is the conventional wisdom in an era that has never had much patience with artists' claims to support, or even sympathy.

The attitude shows up the illusion that the contemporary world is a meritocracy, makes a mockery of its slogans about opportunity, and diminishes not only the lives of those cut off from such careers solely for having made the mistake of not being born rich and well-connected, but also the arts themselves, and through them, society as a whole.

Bullshit Jobs and the New Hollywood

What one might call the "myth" of the "New Hollywood" of the 1960s and 1970s is that the artists' hubris led to the alienation of key partners, allies and sponsors, to personal antagonisms, and to artistic and commercial flops that destroyed their careers; that the limited appetite for artistically daring and politically radical content had been exhausted, as risque content became mainstreamed and as the politics of the country moved right, eliminating both inspiration and audience; and that the rush to imitate the success of Steven Spielberg's Jaws and George Lucas' Star Wars finished the movement off.

However, reading the relevant history I found myself struck by the changing business conditions above all. The New Hollywood happened in a period of sharp changes in media and entertainment (above all, the rise of TV), but also an interregnum between business models--the studio system was dying, but the hyper-financialized multinational multimedia corporation, with its mission of barraging global audiences with "high concept" content, had not yet established itself. The real, significant but still very limited margin of freedom that the New Hollywood artists enjoyed in between was inconceivable outside that interregnum, and came to an end with its close.

Interestingly, this is one of the many smaller subjects David Graeber takes up in Bullshit Jobs, where his emphasis is on the aftermath of that closure, described as "a corporatization far more stifling than anything that had come before" (186). In the book's discussion the key element is the complication of the process of "development," which has seen vast numbers of executives successively signing off on a project while getting in their two cents--a case of too many cooks spoiling the broth, the more so because none of them knows a thing about cooking. By and large they know nothing of film, are often just trying to justify their existence, and by any reasonable measure epitomize "bullshit" in the sense Graeber writes of.

As it happens, contemporary Hollywood comes up a second time in the book, this time for its closed nature. As Graeber remarks,
Look at a list of the lead actors of a major motion picture nowadays and you are likely to find barely a single one that can't boast at least two generations of Hollywood actors, writers, producers, and directors in their family tree. The film industry has come to be dominated by an in-marrying caste (252).
The combination of the industry's high-concept, global-market imperative with a production process that looks like a recipe for incoherence or worse, and with the bizarre situation in which the film industry is dominated by a closed "aristocracy" (his term), does not seem at all irrelevant to the artlessness, and extraordinary divorce from lived reality, of what passes for "cinema" in our time.
