Thursday, February 28, 2019

The Cowardice of Consumer-Bashing; or, Neoliberal Environmentalism

The environmental movement has taken numerous forms, and indeed, just about every conceivable form with regard to ideology, so much so that one must be careful in generalizing about it. But it seems safe to say that mainstream environmentalism, like mainstream everything else today, emerged in the neoliberal era, and accommodated itself to that era. Despite the unavoidable clash between the demands of environmentalism and the prerogatives of capitalism (not least, the impossibility of infinite economic expansion on the basis of a finite resource stock), it has been neurotic about appearing even mildly critical of the socioeconomic system, let alone engaging in radical critique and proposing large solutions to large problems.

Instead it has preferred to couch its criticism of society's thrust in terms of a vague "we"--as in "We failed to heed the warnings," or "We went on with our wasteful ways"--that blurs together all of humanity, drawing no distinction between the chief executive officers of ExxonMobil and BP, and starving children in the Sahel. This amounts to asserting the "sociological nonsense and political irresponsibility" that "we all possess equal powers to make history," and it makes those who talk this way accessories, often quite knowing accessories, to the irresponsibility of those who actually hold the levers of power.1 Only the most obtuse, ill-informed or shamelessly dishonest can claim that the key political decisions regarding energy and climate, for example, were made by the public, or even represented it--when the holders of those levers went to such great lengths to lie to the public, to confuse and distract it, to combat even the principle of its having a say (which is what neoliberalism, after all, has been about in the end), and then, when that public voted for sane policies, overrode it. (In 2008 Americans voted for a President who promised an end to fossil fuel subsidies, cap-and-trade, a Green Jobs Corps. Instead they got the "all-of-the-above" energy policy that put into practice his opponent's running mate's cry of "Drill, baby, drill!")

Indeed, where it has been more targeted in its criticisms, environmentalism has emphasized the subjective and individual, preferring to talk up the carbon footprint of seven billion individuals while looking away from the fact that over seventy percent of greenhouse gas emissions are generated by the operations of just one hundred corporations--a fact that takes us far closer to the root of the problem, given those corporations' practical control over what gets made and how; that points to the actors whose size, organization and resources make them better able than anyone else to change all that; and that, in the small number of organizations involved, offers an excellent focus for thought and action about solutions. It is, they make very clear, incumbent on the individual consumer to sacrifice by choosing carefully and paying more--rather than incumbent on the manufacturer to provide the greenest products available at any given price (the thought of regulation to that end being anathema)--a stance that ignores the realities of power and equity. The consumer has already lost a major option when, the political system having failed them by refusing to provide adequate public transport, they are forced to buy a car and drive many, many miles in it just to have a job. When buying that car they are restricted to what they can afford from among the models that an oligopolistic car industry is prepared to market--an industry that prefers selling not just old-fashioned gas-burners, but more vehicle per customer. Yet it is the consumer that mainstream environmentalism lambastes.

In it all one can see a retreat of environmentalism into an austere, misanthropic religiosity which thunders against the consumer, "You have had it too good for too long!"--a tendency especially naked where it seems to positively gloat over the idea of civilization crashing down and a Great Die-Off taking most of humanity with it and throwing the rest back to the Dark Ages. The attack on the consumer can also appear a sort of compensation for environmentalists' failure to influence the genuinely powerful--or, shabbier still, a taking out of their frustrations on those in no position to resist. It bespeaks real failure. Hopefully the movement will do more than bespeak that failure: it will admit it, abandon what has not worked, and think about what might--intellectual and moral courage rather than cowardice, a readiness to admit that realizing its goals may make the ultra-rich unhappy, and a preparedness to think big. I, for one, am convinced that at this late stage nothing less than a 100 percent-renewable-energy-and-large-scale-geoengineering moonshot can save us from catastrophe, and the sooner we see the obvious taken for granted, and properly acted upon, the better.

1. The words are from C. Wright Mills' The Power Elite.

Why I Am Sick of Hearing About Cowspiracies

It seems that today meat-eaters can hardly go a day without being subjected to a moral harangue about how their dietary preferences, and nothing else, are dooming the planet.

Anthropogenic climate change is an indisputable fact; so is the rapidity with which it has been unfolding; and the same goes for the necessity of serious action on it. Additionally, no reasonable person denies that meat production is a less efficient use of our resources than other forms of food production, or that it contributes to the accumulation of greenhouse gases. Yet the actual scale of meat production's contribution to the problem, and the range of options for redressing that contribution, are wide open to question. Where the oft-cited documentary Cowspiracy claims that animal agriculture alone is responsible for over half of greenhouse gas emissions, the Environmental Protection Agency rates agriculture as a whole (of which meat-raising is but a component) as responsible for under a tenth of the U.S. total.

Someone is clearly wrong here. But surprisingly few people seem to be taking the trouble to work out who, with critical assessments that might establish the validity or invalidity of the Cowspiracy analysis strangely lacking--as a Google search, and still more a Google Scholar search, will demonstrate. There appear to be only a handful of pieces from comparatively marginal sources out there, underlining the fact that outright climate change deniers have infinitely more media access than those who question the claims of Cowspiracy.

The result is that the intellectual basis for the assertions of Cowspiracy seems far from firm--even as those haranguing the meat-eaters jump straight to the most extreme solution. Virtually no one claims that we must respond to the problem raised by the greenhouse gas emissions transportation generates by abolishing the movement of people and things. Few even suggest doing very much to rationalize our use of transportation, most promoting alternative technologies that would permit people to more or less go on living as they do (like the electrification of vehicle fleets). Yet abolition is exactly what those placing the stress on meat production, without preamble, insist upon--apparently uninterested in the reality that not all meat production is equally problematic (a substantial difference exists between, for example, the environmental impact of chicken and that of beef), and that not all methods of meat production are equally problematic either (there is evidence that grass-fed beef may actually offset other emissions). Indeed, where one might expect that those most insistent on the destructive effects of meat production would champion investment in, for example, the cellular agriculture methods beginning to show real promise, just as they champion renewable energy, they do not speak of it at all.

It is a remarkably categorical attitude, which can easily give the skeptic the impression that vegans have simply found outsized claims about meat production's contribution to climate change a useful addition to their list of arguments. That, perhaps, the fossil fuel industry that has fought so long, hard, dishonestly and successfully against curbs on its activity, or even the reduction of the colossal subsidies it enjoys ($5 trillion a year, if one counts externalities!), finds it convenient to have the heat turned on someone else . . .

And of course, that all this is simply another case of trivializing, in typically bourgeois fashion, every aspect of economic life into a "lifestyle choice," and, what is more, of casting aside any regard for equity, as mainstream environmentalism has too often and too long done. This sort of argument tells the well-off that it is not their mansions and Hummers and jet travel that are the problem (implying that they can go on indulging in all this without guilt), but the beleaguered prole who at the end of the day finds a burger more tempting than the plate of beans they want to hand him (not incidentally, in a country where it is the "elites" who turn up their noses at red meat-eating as gauche)--lower class types they regard as so backward and reactionary as to justify any callousness or contempt.1

Rather than turning the fight against the real issue of climate change (and contemporary agriculture's genuine contribution to it) into an attack on the dietary choices of have-nots, one ought to acknowledge that our food production (of which meat production is a big part, though not the totality) is one of many dimensions of the problem, and that here, as in other areas of life, the emphasis should be on meeting people's needs in sustainable ways (I must admit that I am not convinced that a vegan diet really is best for each and all through the entirety of their life spans). Even if one accepts the extravagant claims about meat production's contribution to climate change (and as yet there seems to me much room for doubt), the imposition of veganism on seven billion people in response is a last resort, not Plan A.

1. For a discussion of this sensibility, see Owen Jones' interesting book, Chavs: The Demonization of the Working Class.

Review: Chavs: The Demonization of the Working Class, by Owen Jones

London: Verso, 2011, pp. 352.

As the rather charged and frankly offensive title of Owen Jones' book, Chavs, implies, his concern is with the recent social history of the British working class, and the ways in which that class has been depicted and understood by British society at large. In particular Jones concerns himself with the alleged disappearance of the "respectable" working class--the image of working class people as gainfully employed, law-abiding, functional contributors to the society in which they lived--and its replacement by the image of working class Britons as "chavs"--vulgar, bigoted, anti-social louts, apt to be drunk, violent, criminal.

In considering this transformation Jones, to his credit, gives due weight to quite concrete facts of social history, with the catastrophe of Thatcherism playing its part in the story--for the neoliberal turn she pioneered, and each and every one of her successors continued to promulgate (not least the "New Labour" she regarded as her greatest achievement), did in a manner of speaking wipe out the old working class. The collapse of the British industrial base amid the worldwide Volcker recession and homegrown monetarism, the privatization of the housing stock, and all that accompanied and followed them eliminated the enterprises and institutions that provided the working class with both steady, remunerative work and the benefits of a community--the factories and coal mines, the labor unions that organized those who worked in them, the council estates. What remained in their place--casual, temp, ill-paid and atomized work in the "service" economy, alongside fantasies of soaring to the top in a world where celebrities and the like are richer and more exalted than ever before (becoming, for instance, a soccer player like superstar "chav" David Beckham)--was no substitute in either monetary or social terms. All of this did undeniable, considerable damage to that class's members individually and collectively, as many were turned, one might say, from proletarian to lumpenproletarian.

However, as Jones demonstrates in his extensive overview of the treatment of this stratum by the typically reactionary British media, the image of these people as chavs was pure neoliberal propaganda, the declaration that "We are all middle class now" promoting the illusion that the worthier element of the working class had "moved on up," leaving behind only incorrigibles who had no one but themselves to blame for their misfortunes. Underscoring this was the lavish, sensationalized attention paid to the evidence of dysfunction in the non-respectable working class that remained. Much of this was superficial--complaints about their dress. ("Whatever happened to working class men wearing suits and ties?" right-wing social critics whined.) Some of it was more serious. (The alleged propensity of the working class to neglect its children is an obvious example.)1

As always, it is comforting for the privileged to think that the comfort they enjoy is wholly and unimpeachably earned and deserved, and not enjoyed at the expense of anyone else; and that those who are not so privileged are, when their deprivation cannot simply be shrugged off, the authors of their own miseries. It is more comforting still to think of them as not contributing at all, but as parasites, scrounging off the taxes of "respectable people," dragging down the economy, while debasing the quality of urban life with their anti-social behavior--for one can still less make claims on behalf of parasites than on behalf of people who have merely made a muck of their own lives.

In line with all this it is comforting for them to make much of working class bigotry. Comforting to say that the more affluent--enlightened--segments of society are "post-racial," and that the only reason society as a whole may not be so is the lower orders. That bigotry serves as convenient, additional proof of the crudity and backwardness that condemn them to their place at the bottom of the heap, compounding the discomfort their obvious and severe dysfunction already provokes--while being yet another excuse for deflecting any claim on their behalf. That they think they should be living better--why, that is mere "entitlement," and a racist entitlement at that. Besides, when they so clearly have no empathy for others, why should anyone care about them?

Reading it all I found Jones' case to be as robust and lucid as it was depressing. And as tends to be the case with worthwhile books, it left me thinking a good deal about its implications--not least, how a shallow, sanctimonious and commonly hypocritical parody of anti-racism becomes an excuse for a vicious classism that, ironically, has worsened the problem of racism. For when right-wing populists (read: fascists) court the votes of the working people abused and exploited, snubbed and insulted, by the mainstream parties of the liberal/left as well as the right, those parties can say: no, the populists' appeal has nothing to do with our abandonment of working class people and their interests in favor of catering to the super-rich, and everything to do with working class racism--so that it is only right and fair that we continue along the path of woke neoliberalism. Which only worsens the problem in a vicious cycle. In that, it strikes me, Jones' book offers insight into much more than just Britain's working class at the time: into the trends in British life generally since then (e.g. Brexit), and into the destructive ascendancy of right-wing populism across the Western world and beyond.

1. Altogether, reading the criticisms of this stratum, I got the impression that those who traffic in the "chav" stereotype simply took an exhaustive list of racist stereotypes about inner-city African-Americans in the United States, scratched out whatever epithet the list-maker used for African-Americans, and wrote in "English working class" instead.

Thursday, January 10, 2019

Entering a Post-Scarcity Age in Fiction?

Back in 1931 Astounding Science Fiction paid its writers two cents a word.

This does not sound like very much. But one has to take inflation into account. According to the United States Bureau of Labor Statistics' Consumer Price Index, two cents in 1931 was equivalent to thirty-two cents in 2018. To put all this even more properly into perspective one might think not in terms of official inflation figures, but of the economic life of the time. Just before the onset of the Great Depression (1929), America's per capita Gross Domestic Product was around $850. Today it is $57,500--almost seventy times higher. Consequently, from the standpoint of people's actual incomes, two cents circa 1930 is more like a dollar and forty cents per word. The Science Fiction and Fantasy Writers of America has only recently raised the minimum per-word compensation for a professional magazine from five to six cents. According to the guidelines on its web site the rebranded Astounding, Analog, pays six to ten cents per word for fiction--a relatively generous rate, and yet just a small fraction of what it once afforded its authors.

Of course, one may think that magazines publishing short-form fiction are unrepresentative, given how their place has receded in contemporary life. By and large, when they look for something to read, people today tend to reach for novels. But the rate does not seem much better with those novels. Today a novel is expected to be at least a hundred thousand words long, and the advances that the major publishers offer for them tend to be in the five to ten thousand dollar range, which works out to . . . five to ten cents per word yet again.
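For readers who like to see the arithmetic laid out, here is a minimal sketch in Python reproducing the back-of-envelope figures above. The CPI multiplier and per capita GDP values are simply the rounded numbers quoted in the text, not independently sourced data, and the variable names are mine, purely for illustration.

# A back-of-envelope check of the per-word figures quoted above. All inputs are
# the rounded numbers cited in the text, so treat the outputs as illustrations
# of the reasoning rather than precise estimates.

RATE_1931 = 0.02       # dollars per word paid by Astounding in 1931
CPI_FACTOR = 16.0      # rough 1931-to-2018 CPI multiplier implied by $0.02 -> ~$0.32
GDP_PC_1929 = 850      # approximate US per capita GDP in 1929, in dollars
GDP_PC_2018 = 57_500   # per capita GDP figure used in the text, in dollars

cpi_adjusted = RATE_1931 * CPI_FACTOR
income_adjusted = RATE_1931 * (GDP_PC_2018 / GDP_PC_1929)
print(f"CPI-adjusted 1931 rate:    ${cpi_adjusted:.2f} per word")     # about $0.32
print(f"Income-adjusted 1931 rate: ${income_adjusted:.2f} per word")  # about $1.35

# The novel math: a $5,000-$10,000 advance against a 100,000-word manuscript.
for advance in (5_000, 10_000):
    cents_per_word = advance / 100_000 * 100
    print(f"${advance:,} advance / 100,000 words = {cents_per_word:.0f} cents per word")

Running it simply confirms the figures used in the text: roughly thirty-two cents per word by the CPI yardstick, about a dollar and thirty-five cents by the income yardstick, and five to ten cents per word on a typical advance.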

In short, writers may be making a tenth or even less of what they used to in per-word terms--and this one yardstick falls far short of taking everything into account. If, for example, one argues that publishers now hold writers to a higher technical standard than they did in the days of the pulps, necessitating lengthier, more difficult training and more time-consuming craftsmanship; or that writers are now expected to bear a heavier burden of such activities as publicity in addition to their actual writing in return for their pay; then just looking at pay rates understates the drop in their earnings.

The financial return on effort aside, consider how publishers behave toward those who wish to submit work to them--doing just about everything in their power to keep them at bay. The major publisher open to the unsolicited, unagented submissions that are the only recourse of all but the very privileged few with "platforms" and industry "connections" (those few favored rather than disfavored by celebrity and nepotism) is a rarity today.

Some literary agents have slush piles, of course. However, when speaking of them they discuss the slush pile not as a normal, routine procedure by which they come by an appreciable portion of the wares with which they supply the market, but as something they bother with mainly on the off chance that one of the ten thousand cold queries they get year in, year out will somehow--in a way not reducible to rational, objective, verbalizable criteria--commend itself to their business judgment as likely to be profitable for their agency. The marginality of this is further reflected in their tendency to hand the job of looking out for those special finds in the slush pile not to the practiced eyes of veteran agents, who have more "important" things to do than keep up such a lookout, but to unpaid interns--who years later write articles in places like Salon and the Guardian in which they take the frustrations of their time on the publishing industry's lowest rung out on those who have no rung at all, gleefully trolling the unwashed plebs who committed the crime of sending in the unasked-for queries and manuscripts they were obliged to read.

This means that just as the pay has plummeted, so has it become more difficult to find anyone willing to offer even these sharply reduced rates. Meanwhile, just as publishers and agents make themselves ever less approachable to the aspiring writer (indeed, many an aspiring writer would say, treat any approach with extreme hostility to the point of abusiveness), the TV viewer is barraged with commercials for self-publishing services hoping to get not the writer's work, but their money. Someone who has self-published a book might even get cold calls from such companies at home, offering editing, design, publicity and other such services. (By contrast, could you imagine one of the Big Five publishers--a Random House, a HarperCollins--buying TV time to put out a call for submissions? Or phoning the author of a self-published book that sold in the dozens to ask them to send it in?)

All this bespeaks an extraordinary collapse of the market for fiction.

One might conclude from it that the falling price of words reflects either burgeoning supply or sharply declining demand--or, indeed, a combination of the two.

It is not at all hard to come up with a list of the reasons. A major one is what one might call the "vertical integration" of publishing. Just as steel mill owners might buy up iron and coal mines to bring the sources of the inputs they want for their operations under their control, so have publishers substantially gone from relying on what writers happen to send them to ensuring that they are supplied with the sorts of thing they desire most. Thus we get an ever-rising volume of books from a handful of already bestselling authors, through their "working with co-authors," or turning their book-writing into a family business where their close relatives churn out work under their name (think James Patterson, think Stephen Ambrose); of books "by" reality TV "stars" and the like (the Kardashians are published novelists); of tie-ins and sequels and prequels and series that go on and on and on even when the name on the cover is actually the name of the person who wrote the thing.

Meanwhile the publicity machines make it ever more likely that those who have become bestsellers stay bestsellers--ever more prolific bestsellers taking bigger bites out of the market for longer periods as the lists come to look static from year to year, decade to decade. (Even death hasn't stopped Robert Ludlum's and Tom Clancy's names from appearing on the covers of new bestsellers year after year since their passing, while a half century after his own death the adventures of Ian Fleming's James Bond continue, ever more creakily and pointlessly.) Naturally the premium on what any writer coming to this machinery as an outsider has to offer is far, far lower--not merely in terms of compensation, but in their chances of finding any place at all for their own, original ideas in a traditional publisher's schedule.

However, even that is far from being the whole story now. What is also the case today, and may prove most important over the long run, is that there has never been so much content out there, so easily available. Never mind the favored Establishment answer that anything and everything that may be wrong with publishing today and undermining writers' earnings is due solely to the refusal of some to fully respect the claims of intellectual property-holders. The simple reality is that e-readers mean that anything without a copyright on it can be had for free, entirely legally--which includes virtually every classic, and non-classic, from the beginning of history down to the early twentieth century.

Then there is the vast quantity of fiction produced and distributed without any expectation of commercial gain. Fan fiction may be questionable from the standpoint of copyright, but it nonetheless conforms to this pattern, and here, at least, statistics are conveniently available, with single categories of the stuff staggering in their scale. Over at FanFiction.net there are eight hundred thousand stories set within the Harry Potter universe alone, more than the most ardent fan could read in many lifetimes. And given that once it is put up very little is necessary to keep it up, this body of material is growing all the time.

There is, too, the response of aspiring commercial authors to Big Publishing's scorn of them, and to their inability to compete with its colossal publicity machinery. Many sell their novels at ninety-nine cents per copy, and give away much of their material totally for free. Few of them find the audiences for which they hope, let alone a chance to make a real living as an author (even when the work is up for free, writers on sites like Wattpad are expected to spend months persuading the site's readers to simply view it), but the practice is unlikely to stop anytime soon, so that this body of material, too, is growing all the time.

Altogether this means that no one merely looking for something to read will ever have to pay for it out of pocket again--and that even those prepared to pay have far, far more options at nothing close to the prices traditional publishers demand. Meanwhile, there have never been more alternatives to reading as a use of leisure time than there are now, with tablets and laptops and phones making the full range of entertainment choices almost entirely portable. People who would once have had to make do with a book on the bus, the train, the plane can now enjoy a movie or a TV show or a video game or social media instead--and are perhaps more inclined to do so, as the mountain of statistics testifying to less reading, and even less literacy, suggests.

Of course, for now it is still the case that if George R.R. Martin's The Winds of Winter ever actually does come out, people will rush out to pay full price for their copies. However, for all Martin's considerable virtues as a writer we should not forget that he would not be such a major mainstream success were it not for a blood-soaked, sex-and-nudity-filled, endlessly controversial nighttime soap opera adaptation of his series--many of the copies he sold creditable to the success of the TV show rather than purely literary success. It is also the case that a writer who stands today where Martin stood four decades ago, when he was just starting out, is apt to find that they cannot even give their book away.

It does not seem too much to say that we are looking at the beginnings of a post-scarcity age in fiction, and perhaps a post-scarcity era for the written word more generally; and that with each year we are entering into it more deeply. This is not the sort of post-scarcity a sane person would have preferred. (How about clean, green energy giving us electricity too cheap to meter instead?) Alas, we are living with it all the same, and if those who tell us that we are looking at an age of more general "abundance" are right, then how we deal with this surprising consequence of technological advance may well be an early indicator of how we cope with other, bigger problems as well. So far the record has not been inspiring. But then, however painful the situation has become for those who wish to be authors (and however dangerous a situation that serves them so poorly may be for culture in general), this is all still new, and for the time being scarcely recognized. Perhaps the fact that we are still only at the beginning of this strange new period is in itself grounds for hope that we will be able to combine the indisputable benefits of the new technologies with conditions that permit writers to write and yet live.

Perhaps someone should write a story about that.

Sunday, December 23, 2018

Review: Bullshit Jobs: A Theory, by David Graeber

New York: Simon & Schuster, 2018, pp. 368.

Back in 2013 David Graeber penned an article for Strike! Magazine regarding what seemed to him the explosion of pointless, economically irrational employment, which he colorfully termed "Bullshit Jobs."

In an extraordinarily rare case of an item with actual intellectual content going viral, it became something of an international sensation, leading to a broader research project and a book-length treatment of the issue five years later. The book, like his earlier Debt, presents a large, intricate and conventional wisdom-smashing argument, but one that keeps the densely written and documented macrohistory to a minimum while in more sprightly fashion making a case rooted in anecdotal stories of present-day individuals. In particular he relies on his collection of employees' self-reports about their work being "bullshit," defining the bullshit job as
a form of paid employment that is so completely pointless, unnecessary, or pernicious that even the employee cannot justify its existence even though, as part of the conditions of employment, the employee feels obliged to pretend that this is not the case (9-10).
In line with this emphasis on the subjective, Graeber is also more concerned with the "moral and spiritual damage" done by the situation; the reasons why this is happening; and what might be done about it.

As one might guess, bullshit jobs consist substantially of administrative and bureaucratic functions of questionable value, especially at the middle management level. What may be more surprising to adherents of the "conventional wisdom" is that these are every bit as present in the private sector as in the public (in case it still needed doing, Graeber once more debunks the rightist/libertarian stereotype of bureaucracy as unique to government). Also surprising to those of a conventional turn of mind is the extent to which this has been bound up with the growth of those sectors that cheerleaders for neoliberalism endlessly exalt, and which have tended to shape other sectors in their turn--namely finance and information technology, whose ways have penetrated such non-profit institutions as the university. Thus everywhere one looks one sees an abundance of "flunkies" (whose job is to make the boss "look important," as a receptionist often does); "goons" (whose essential task is to enable their employers' exploitation of the public, and who often exist only because their competitors have them, as is often the case with telemarketers, lobbyists, corporate lawyers and public relations specialists); "duct tapers" (whose job consists of patching over quite fixable problems the bosses are too lazy, incompetent or short term-minded to resolve properly, as is often the case in computer programming); "box tickers" (whose task is to "allow an organization to claim it is doing something it is in fact not doing," like going through the motions on due diligence); and "taskmasters" (who invent additional unnecessary work for all of the unfortunate underlings named above).

Graeber also notes the ways in which "real" jobs (rather than the "bullshit" kind) may be infiltrated by, or even be an outgrowth of, bullshit. As an example he presents members of the teaching profession, who perform an essential function but find their time increasingly taken up by administrative tasks; or researchers and their institutions, which devote increasing amounts of time to chasing grants (ironically squandering billions that could be used to fund a vast amount of research).1 Janitors likewise remain essential, but janitorial work may be classifiable as a bullshit task when the workplace being cleaned serves only a bullshit purpose--such as a PR firm. Given polling data indicating that as many as 40 percent of employees feel their jobs fall into the category he describes, and given the other jobs which increasingly consist of bullshit activities or service such activities in some respect or other, Graeber estimates that 50 percent of all work done today qualifies as bullshit "in the broadest sense of the term."

Graeber contends that this situation, in which half the people of this country and the rest of the developed world are giving their waking hours to bullshit jobs, entails enormous "spiritual" violence. The reason is that even if human beings are not naturals for the routine of intense and continuous toil from nine to five (or longer) demanded by their employers, they do have a significant need for purposeful (and even altruistic) activity. The make-work, the parody of work, that is a bullshit job denies them that--indeed harms them by inflicting the opposite, the anguish of pointless activity, while subjecting them to many of the worst features of employment, like the loss of personal freedom to the will of a boss who is often arbitrary and abusive, and more generally the alienations of which Karl Marx and C. Wright Mills wrote. Where the job pays relatively well, there is guilt and shame at remuneration unearned; where the job makes the world an actually worse place, guilt and shame at having contributed to making it so. (Indeed, many of those whose testimonials Graeber recounts quit well-paying, prestigious "bullshit" jobs to take less remunerative but more meaningful work.)

The absurdity, and tragedy, of the situation appear all the greater for the contrast with, and ramifications for, those doing productive, non-bullshit jobs, whose conditions deteriorate and whose compensation declines (turning them into "shit" jobs--jobs that make a genuine contribution but are unpleasant to do). Graeber explains this partly in terms of know-nothing, corporatized administrators subjecting those workers to speed-ups, pay cuts and the like (there being actual productivity to squeeze further here), and partly in terms of right-wing populists stoking public hostility to such groups as teachers, nurses, and even auto workers. Indeed, Graeber goes so far as to contend that envy of those who are genuinely productive is part of the political game being played. ("You get to do something meaningful with your life, and you expect a middle class standard of living too? How dare you!")

How could this perverse situation have been permitted to come about in a system that may not be respectful of workers, but at least prides itself on efficiency? For Graeber an important part of the story is that mass technological unemployment, rather than being a fear for the future, is something that has already happened to us, leaving us with a "real" unemployment rate of 50 to 60 percent. What happened, however, was that the gap was "stopp[ed] . . . by adding dummy jobs that are effectively made up." Indeed, Graeber argues (as he did in his essay "Of Flying Cars and the Declining Rate of Profit") that post-1960s neoliberalism is not a project for maximizing economic efficiency and growth as it claims to be (objects it has signally failed to deliver), but for preserving the existing order--capitalist property and labor relations (an object at which it has been rather more successful).

Graeber admits that this sounds like "conspiracy theory"--an Orwellian notion--but regards it as an emergent phenomenon, resulting from the failings of the system and from our misconceptions about what work is and what it ought to be. As he notes, the tendency toward large numbers of flunkies and duct tapers and box tickers, and taskmasters ordering them all about, has been a result of the way financialization (and IT) led to a more ruthless, more conflictual, more alienating workplace overseen by managers increasingly divorced from and ignorant of the actual productive task, and in greater need of surveilling their work force. (Indeed, it was this unfamiliarity of managers with actual work that led to the blooming of the "consulting" industry.)

However, Graeber devotes more of his attention to the deeply distorted perceptions of work we carry around in our heads. The most basic would seem to be the identification of work with the "production" of material goods--when, he argues, most work has not been about production, but rather about "caring" for people (child-rearing, housework, education, medical services, etc.) and things (maintenance and repair), work which has been overlooked. (As the above list demonstrates, caring work has largely been women's work, and so less valued--and indeed, Graeber credits leftist feminists with developing the concept.)

Hardly less important is the tendency, already emergent in Medieval Northern European culture, and reinforced by the "Puritan" ethos (and capitalism, of course), to equate work with "alienated," mind-and-body-destroying labor done for something other than the work's intrinsic value to the doer, under the supervision of others, all day, every day--and indeed, with the misery normally attendant on the situation.2 All other activity is "idleness." Work is deemed virtuous and idleness sinful, so that being miserable in one's work marks a person as "deserving," while someone who is not working, and so not made miserable by work, is "undeserving" even of the means to live.

Consequently, as production became more automated, caring work was not acknowledged as an alternative locus for labor; and there was a timidity about or hostility to changes in the social system that would reduce workloads (a shorter work week) or detach income from work, productive, caring or otherwise (through income floors, for example). Public well-being was instead identified with the maintenance of high levels of employment, particularly "full-time" employment, to the extent that this could be invoked as an excuse for wasteful or destructive social policies, Graeber citing the way in which the preservation of millions of jobs directly connected with the private health care bureaucracy has been given as a justification for not shifting to a more efficient alternative--not least by President Barack Obama himself.3 (Indeed, social safety nets themselves drove the growth of bullshit employment, through a massive bureaucracy devoted substantially to making the lives of qualified applicants seeking benefits more difficult through means testing and refusals.)

As a result, despite his aversion to the demand that social critics serve up policy solutions as the price of opening their mouths (which he observes is a way of distorting or suppressing dissent), Graeber argues for a Universal Basic Income as a solution to the problem, on several grounds. Such an income, he contends, would affirm the right of human beings to live, and aid our moving past our flawed conceptions of income's connection with a perverse understanding of work by delinking income from labor. It would also eliminate the problem of means-testing and the associated welfare state bureaucracy. (The problem of how those made redundant by the change are to get on is, of course, resolved by their getting such an income too.) It would eliminate, too, the incentive to do bullshit jobs (a quick way of forcing employers to do without positions that are often unnecessary and wasteful even for their own firms), and more generally enable employees to walk away from a raw deal, likely improving the pay and conditions of work for all. This would be all the more significant because he expects that the great majority of people would still work in one way or another, at tasks that may well prove more beneficial not just to themselves but to society and the world as a whole (precisely because so much of the work being done now is bullshit, and so much potential creativity and ingenuity wasted in the process). Graeber also contends that a basic income could contribute to resolving other major problems, noting that experiments with such income policies in the past (he cites India as an example) also reduced the social distinctions, prejudices and ills to which the "rat race" and all the rest contribute so much, while pointing out that perhaps the single best thing we could do for the environment is to scale back our wasteful working.

In discussing what Graeber has to say it seems only fair to admit that I have long been convinced of the truth of what many will regard as his most radical claims--the immense wastefulness of contemporary capitalism, and that technological unemployment has already happened (a conviction I came to after picking up Jeremy Rifkin's The End of Work back when it was still current). Perhaps because of that I had reservations about just how closely he stuck to his emphasis on subjective perceptions, to the exclusion of the other examinations of the issue proffered by heterodox economists under the heading of "waste" for over a century (generally in more objective, materialist ways)--matters like built-in obsolescence, or marketing efforts like advertising and "product differentiation." In fact it seemed to me that drawing on these could have strengthened his argument considerably, and perhaps helped him avoid what seemed to me certain exaggerations that came from his reliance on a narrower basis for approaching the issue than he might have used--like going too far in downplaying productive labor in his (justified) correction of the tendency to overlook caring labor, or trying too hard to explain the widespread hostility to particular occupational groups in terms of his theory when others have at least as much to offer. (While he seems to me entirely right to say that the public begrudges artists a living because they seem to be avoiding their share of the common misery, it is also the case that artists are not generally respected as "useful" people; while Galbraith's concept of the "convenient social virtue" seems to me to account for much of the disrespect constantly shown members of the teaching and nursing professions.4)

Still, these are comparative quibbles with a case that is as compelling as it is unconventional. Materialist, hard-data-and-statistics grubber that I am, I consistently found his argumentation persuasive regarding not just the existence of the problem, but its discoverability through his particular approach, and the same goes for his treatment of its causes and implications. At the same time, my interest in those other approaches was, on the whole, not a matter of skepticism but of my finding his argument so compelling that I could not but think of the possibilities for a synthesis of Graeber's argument with that longstanding tradition of writing on economic waste. (What, for instance, would it take to produce a pie chart showing the share of bullshit in Gross World Product? How about time series of GDP and GWP presenting figures adjusted and unadjusted for the bullshit component side by side?) It strikes me, too, that even if Graeber is less concerned with the macroeconomics of the solutions he recommends than he might have been, his case for a guaranteed minimum income is one of the more robust I have seen in recent years. Indeed, in a market full of books which are really just extensions of banal magazine articles (especially when we look at those produced by major commercial publishers for a general audience), this book, like Graeber's earlier Debt, is a rare gem.

1. See Richard Harris' Rigor Mortis (reviewed here) for an in-depth discussion of this in the medical sector.
2. Graeber cites Alfred Marshall's definition of labor in Principles of Economics as work done "with a view to some good other than the pleasure derived from the work."
3. I was dismayed to see James K. Galbraith make a similar case in his otherwise excellent The Predator State.
4. As he put it in Economics and the Public Purpose, this "ascribes merit to any pattern of behaviour, however uncomfortable or unnatural for the individual involved, that serves the comfort or well-being of, or is otherwise advantageous for, the more powerful members of the community. The moral commendation of the community for convenient and therefore virtuous behaviour then serves as a substitute for pecuniary compensation. Inconvenient behaviour becomes deviant behaviour and is subject to the righteous disapproval or sanction of the community." It seems noteworthy that Graeber has over the years cited the works of the elder Galbraith a number of times, but also that while four of Galbraith's mid-century classics are listed in his bibliography, including American Capitalism, The Affluent Society and The New Industrial State, Economics and the Public Purpose (the work that capped that line of research) is not.

"Are the Arts Just For the Rich?"

Watching Patrick Stewart talk about his childhood in William Shatner's documentary The Captains' Close-Up, I was surprised by his working-class Yorkshire upbringing. And afterward I couldn't help thinking that where the current generation of British actors is concerned, none seem to have a background remotely like that. Look them up, and you are far more likely to find a graduate of Eton, an alumnus of the Dragon School.

Naturally it was with interest that I read a series of pieces in The Guardian on the phenomenon--some dealing with actors specifically, others with the arts more generally. The pieces were far from as incisive as they could have been--not least in their use of the term "middle class." A famously fuzzy term, it can be used to deliberately cloud the issue--frankly, to downplay the extent to which a person from an extremely privileged background actually comes from an extremely privileged background--and often is so used, not least by The Guardian itself when discussing artists. Consider this extract from its typical puff-piece profile of Killing Eve creator Phoebe Waller-Bridge:
Waller-Bridge grew up in an upper-middle-class family with baronets on both sides. Her father co-founded Tradepoint, the first fully electronic stock market . . . and her mother works for the Ironmongers' Company in the City.
On what planet is someone with "baronets on both sides" of the family, someone whose father cofounded a company with its own Wikipedia page (Tradepoint, later SWX Europe), merely "upper middle class"?

Even in a culture so mendaciously pretending that "we're all middle class now," this is too, too much.

All the same, if the term is used in a way so loose as to be obfuscating (at best), the pieces offer a robust round-up of a sizable body of anecdotal evidence (Helen Mirren, Judi Dench and other luminaries observing that their chances would have been very different today), and a growing body of solid statistical study, confirming what David Graeber (writing of America rather than Britain, though there is no reason to think this less applicable to it) remarked when looking at the social and class realities of intellectual and creative life: that "if your aim is to pursue any sort of value [but money]--whether that be truth . . . beauty . . . justice . . . charity, and so forth . . . [and] be paid a living wage for it"--those "jobs where one can live well, and still feel one is serving some higher purpose"--only those with "a certain degree of family wealth, social networks, and cultural capital" have much of a shot, while for the rest "there's simply no way in" (253).

In spite of which they are typically told by the "successful," and sometimes simply "everyone," that their failures are entirely their own fault, that they simply were not "good enough."

Which bullshit--the philosopher Harry Frankfurt's term certainly seems applicable here--is the conventional wisdom in an era that has never had much patience with the claims of artists to support or even sympathy.

The attitude shows up the illusion that the contemporary world is a meritocracy, makes a mockery of its slogans about opportunity, and diminishes not only the lives of those cut off from such careers solely for having made the mistake of not being born rich and well-connected, but the arts themselves, and through them, society as a whole.

Bullshit Jobs and the New Hollywood

What one might call the "myth" of the "New Hollywood" of the 1960s and 1970s is that the artists' hubris led to the alienation of key partners, allies and sponsors, to personal antagonisms, and to artistic and commercial flops that destroyed their careers; that the limited appetite for artistically daring and politically radical content had been exhausted, as risque content became mainstreamed and as the politics of the country moved right, eliminating both inspiration and audience; and that the rush to imitate the success of Steven Spielberg's Jaws and George Lucas' Star Wars finished the job.

However, reading the relevant history I found myself struck above all by the changing business conditions. The New Hollywood happened in a period of sharp changes in media and entertainment (above all, the rise of TV), but also in an interregnum between business models--the studio system was dying, but the hyper-financialized multinational multimedia corporation, with its mission of barraging global audiences with "high concept" content, had not yet established itself. The real, significant but still very limited margin of freedom that the New Hollywood artists enjoyed in between was inconceivable except in that interregnum, and came to an end with its close.

Interestingly, it is one of the many smaller subjects David Graeber takes up in Bullshit Jobs, where his emphasis is on the aftermath of that closure, described as "a corporatization far more stifling than anything that had come before" (186). In the book's discussion the key element is the complication of the process of "development," which has seen vast numbers of executives having to successively sign off on a project while getting in their two cents--a case of too many cooks spoiling the broth, the more so because none of them knows a thing about cooking, and all are often just trying to justify their existence (because, by and large, they know nothing of film, and by any reasonable measure epitomize "bullshit" in the sense Graeber writes of).

As it happens, contemporary Hollywood comes up a second time in the book, in a discussion of its closed nature. As Graeber remarks,
Look at a list of the lead actors of a major motion picture nowadays and you are likely to find barely a single one that can't boast at least two generations of Hollywood actors, writers, producers, and directors in their family tree. The film industry has come to be dominated by an in-marrying caste (252).
The combination of the industry's high-concept, global-market imperative with a production process that looks like a recipe for incoherence or worse, and with the bizarre situation in which the film industry is dominated by a closed "aristocracy" (his term), does not seem at all irrelevant to the combination of artlessness and extraordinary divorce from lived reality in what passes for "cinema" in our time.

Sunday, December 16, 2018

Are They Trolling Me? A Test

I have already written a blog post about spotting trolls, but thought I'd present some of those thoughts again in a somewhat more systematized, handier fashion--in the form of a test to which people can reasonably put an interlocutor's statements. It is admittedly based on limited personal experience; I don't claim that it is scientific or pretend that it is foolproof. But I find the essentials handy, think others might find them handy too, and hope that feedback from others might help develop this into something better. So here it goes--a set of questions with differing weights.

1. Did they butt into a conversation already ongoing between other people? (10 points)

2. If they butted in, was it into the conversation of people who clearly had a very different attitude toward the subject matter? (For example, a #MAGA butting into a conversation between two #Resist in a thread started by another #Resist account's Retweeting an item from a progressive media outlet?) (10 points)

3. Are they just using this as an occasion to inflict talking points on people they seem well aware have no interest in hearing them (rather than engaging others in a manner indicative of genuine interest in their thoughts and opinions)? (20 points)

4. Are they demanding that other people provide lavish defenses for their opinions? (10 points)

5. Are they trying to press the other person into agreeing with them? For example, do they keep repeating their talking points in a manner suggestive of pressuring them, or asking the same questions over and over again after the object of their questioning has given them all the answer they have to offer? (20 points)

6. Do they use insults or make personal attacks? (30 points)

Bonus Question: Does their profile flaunt a troll's attitude? Trolls are often proud of their activity, such that I have seen some declare that they are trolls in their Twitter profile, while others, only slightly less subtle, boast in transgressive ways about their mean-spiritedness and sadism to shock, offend or intimidate, in their imagery as well as in their words. (This can include reference to or glorification of violent acts in their handles and profile pictures; it may also include the celebration of much maligned figures or evocations of their ideology as studied provocations.) (40 points--and just 40 points because even a troll might not be a troll all the time.)

Ask each of these questions in regard to the suspected troll, and add up the points warranted by each "Yes." Someone who scores a 30 is suspect as a troll, someone who scores a 40 is likely to be one, and someone who scores a 50 or more almost certainly is. (For those who would like to see the tally mechanized, a small sketch follows the scoring notes below.)

The first six questions allow for a score of 100, while adding in the bonus question permits a maximum score of 140.
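As a purely illustrative exercise, here is a minimal Python sketch of the test above: answer each question True or False, tally the weighted points, and map the total to the rough verdicts just given. The question texts are paraphrased, the function and variable names are mine, and the weights and thresholds are exactly those stated in the list--the script is no more scientific than the test itself.

# A minimal sketch of the troll test above. The question texts are paraphrased;
# the weights (10-40 points) and the 30/40/50 thresholds are the ones given in the post.

QUESTIONS = [
    ("Butted into a conversation already ongoing between other people?", 10),
    ("Butted into a conversation among people of a clearly different attitude?", 10),
    ("Using the occasion to inflict talking points on an unwilling audience?", 20),
    ("Demanding lavish defenses of other people's opinions?", 10),
    ("Pressuring others to agree (repeating points, re-asking answered questions)?", 20),
    ("Using insults or making personal attacks?", 30),
    ("Bonus: does their profile flaunt a troll's attitude?", 40),
]

def troll_score(answers):
    """answers: one boolean per question, in the order listed above."""
    return sum(weight for (_, weight), yes in zip(QUESTIONS, answers) if yes)

def verdict(score):
    if score >= 50:
        return "probably a troll"
    if score >= 40:
        return "likely a troll"
    if score >= 30:
        return "suspect"
    return "probably just someone you disagree with"

# Example: butted in, into a thread hostile to their views, pushing talking points; nothing else.
example = [True, True, True, False, False, False, False]
score = troll_score(example)
print(score, "->", verdict(score))  # 40 -> likely a troll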

As that suggests, trolls, by nature unsubtle, tend to give themselves away very early on in the game. I think it best to take the hint, and best that you also do the same. Do not fall for the lie that you are being a thin-skinned "snowflake" or intolerant of "free speech." Accusing people of being thin-skinned for simply not accepting abuse is what a bully does. ("What, can't take a joke? Don't you have any sense of humor? You're no fun.") And respect for freedom of speech does not mean that you are personally required to spend every waking moment of the rest of your life being an audience for people who disagree with you--and still less for people who are promoting ideas that are genuinely loathsome. (I'm making a value judgment here, and not apologizing for it.) Feel free to cut a troll off as soon as you have become convinced of their purpose. Mute, block and REPORT as seems appropriate to you.

Sunday, December 9, 2018

What Makes a Troll a Troll?

We all know that the Internet, and especially social media, are inundated with trolls--pathetic, repugnant, dark triad-afflicted losers (narcissistic, Machiavellian, psychopathic) whose idea of a good time is ruining those spaces for everyone else.

Many will acknowledge that it is a bad idea to respond to a troll. Trolls are uninterested in anything you have to say; they are interested only in distracting you from what you were doing, knocking the conversation off track, making you waste your time and effort dealing with them, and causing whatever harm they can. This may be so much the case that they will say things they don't mean simply to satisfy their desire to disrupt and to hurt--though much of the time, maybe even most of the time, they believe what they say, if not always wholeheartedly.

But how does one tell the difference between trolling and an opinion one simply does not like?

I find that trolls tend to butt into ongoing conversations among other people--usually, people of quite a different sensibility than themselves. (To choose an admittedly non-neutral example, one might find, for example, a few people who might all be considered left-of-center responding to an item to which such persons might be expected to be more attentive than their counterparts on the right when a right-winger suddenly turns up.)

I also find that when they do butt in they do one of three things:

1. Fling insult and abuse. (All they can do is call the participants in the conversation stupid--which often betrays that this is exactly what they are.)

2. Rub their opinions in the faces of people they expect to be repulsed by those opinions, rather than try to actually discuss anything. It is a little harder to be sure of this than of, for example, outright insult. Still, there are giveaways. Such opinions are typically canned, often by someone else (they don't actually do much thinking for themselves, or they'd have better things to do than this), and commonly irrelevant to the conversation into which they have entered. (I recall a thread where people discussed the President's reply to the Thanksgiving Day question "What are you thankful for?"--and only that--and someone felt the need to inject the claim that "Socialism killed 100 million people.") When others react, they commonly display satisfaction, perhaps repeating the action.

In some cases I have had the impression that they are like exhibitionists, deriving satisfaction from others' disgusted reactions. In others, they seem to delight in lobbing a grenade into a crowd of bystanders. In still others, they seem more purposeful--intent on diverting a discussion, for example. (I recall the comments thread of a news story about the Netherlands' police training eagles to catch drones, and seeing it from the start diverted into an attack on solar energy, with the hundreds of comments that followed caught up in the ensuing flame war. Alas, renewable energy and sane climate policies are very common troll targets.)

3. In the rare case that they are able to actually interact with others regarding the subject, they behave in very unreasonable, bullying fashion. They do not ask for an explanation of another person's opinion; they demand that the person defend it, and raise the bar for such a defense very high--asking for the equivalent of a doctoral dissertation, complete with footnotes, as the price of their having opened their mouth. Even after the person has given such explanation as they have to offer, the questioner refuses to accept that the other person has said their piece and keeps coming at them, as if intent on making them recant, on converting them to their cause. (I have, unfortunately, found myself having such talks with proponents of atomic energy who want me to "admit" that renewable energy can never be the foundation of our energy base.)

Can normally decent people find themselves acting in ways similar to this? It's not impossible. Annoyance, a foul mood, and they slip into some bad behavior. But I think one can go too far with the "Everyone can be a troll" line, and certainly anyone who has doubts about a particular interlocutor can, on Twitter at least, just look at their account, see the way the suspected troll has chosen to present themselves to the online world, see the things they choose to post and share.

Some I have seen proudly and not at all ironically write the word "Troll" in their bio. And I find it best to take them at their word.

When I run into someone fitting the profile described here, I don't mute the conversation, I block them. Permanently.

Yesterday, one of my comments proved to be pure troll-bait, alas, inciting dozens of attacks from people who responded in exactly these ways. I have blocked each and every one, and at the time of this writing, find myself continuing to block them--setting a one-day record, which is, I suppose, why I buckled down and wrote up this post.

I strongly urge everyone else to similarly block those who conduct themselves in such a manner. Trolls, especially as defined here, may have the right to speak, but you have no obligation to listen to them, let alone answer them. They have no right to take your time and attention, let alone inflict harm upon your mental well-being.

Do not let them do it.

Wednesday, October 31, 2018

Yes, Tax Breaks ARE Subsidies

Many have encountered the claim, typically made by right-wing and especially libertarian commentators, that tax breaks are somehow not subsidies.

As is usually the case when libertarians and other champions of the most hardline economic orthodoxy make such "educational" pronouncements, the assertion is, all at once, utterly contrary to any conventional understanding of economic reality; Olympian in its arrogance and contempt toward anyone who would presume differently; and opaque in its reasoning.

After all, Investopedia, no bastion of left-wing, communistic thought, defines a subsidy as:
a benefit given to an individual, business or institution, usually by the government. It is usually in the form of a cash payment or a tax reduction. The subsidy is typically given to remove some type of burden, and it is often considered to be in the overall interest of the public, given to promote a social good or an economic policy.
Note that "tax reduction" is explicitly included in the definition they provide.

Indeed, one does not find the essentials contested even in the explanation of what a subsidy is provided by the Mises Institute article that usually appears at or near the top of Google's search results when anyone researches the topic, "No, Tax Breaks are Not Subsidies." While the article does assert an "economic and ethical" difference between a cash payment and a tax break (the difference between keeping what you have and being given something taken from someone else), this does not in and of itself make a tax break not a subsidy. In fact, the article acknowledges that "entrepreneurs who take advantage of tax breaks will incur fewer costs than entrepreneurs who don't," and that such breaks are indeed "beneficial to those who claim them"--which is totally in line with the idea that, in conferring advantage on some business at public cost (in this case, forgone tax income, the burden of which is shifted to other parties, who accordingly bear a greater part of the tax burden or the burden of forgoing such expenditures), a tax break has the same practical costs and benefits as a subsidy.
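To make the equivalence concrete, here is a minimal sketch in Python, using purely hypothetical numbers of my own choosing (not figures from the article): whether the government writes a firm a check or simply collects that much less tax from it, the firm's bottom line and the treasury's position come out exactly the same.

# A minimal sketch, with hypothetical numbers: a $100 cash grant and a
# $100 tax break leave the firm and the treasury in identical positions.

PRE_TAX_PROFIT = 1_000
TAX_RATE = 0.30

def cash_subsidy(grant=100):
    """The firm pays its full tax bill, then receives a cash payment."""
    tax_paid = PRE_TAX_PROFIT * TAX_RATE
    firm_net = PRE_TAX_PROFIT - tax_paid + grant
    treasury_net = tax_paid - grant
    return firm_net, treasury_net

def tax_break(reduction=100):
    """The firm's tax bill is simply reduced by the same amount."""
    tax_paid = PRE_TAX_PROFIT * TAX_RATE - reduction
    firm_net = PRE_TAX_PROFIT - tax_paid
    treasury_net = tax_paid
    return firm_net, treasury_net

# Either way the firm nets 800.0 and the treasury nets 200.0.
assert cash_subsidy() == tax_break()

Either way, the $100 the firm keeps is $100 the public does not have--which is the whole point.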

So where, one might wonder, is the argument?

The real objection of the piece's author to the term's use in this case appears later, when the article continues with the explanation that government is "not a wealth creator," but only a taker of others' incomes. Because of this, such seemingly special advantages are really just reprieves from taxation on business, which the article compares in its various metaphors to a "life jacket in a sea of wealth redistribution," with capitalism only able "to breathe" through such loopholes. Indeed, the web site followed up this piece with another article titled "The Answer to 'Unfair' Tax Breaks is More Tax Breaks."

In short, the "tax breaks are not subsidies" argument assumes that governments have no business taxing business, so that any such taxation is illegitimate (rather than a We-the-people argument for "No taxation without representation," their stance is a pre-1789 French aristocrat's arrogant "No taxes on us, ever!"), and accordingly any respite from such taxation is only just, natural, appropriate, and not at all a show of special favor.

None of this is self-evident to most. And when explained to them, most will flatly (and justly) reject the premise of the argument.

Moreover, it will appear the case not only that a subsidy IS a subsidy IS a subsidy, but that, as is so often the case, the arrogance and opacity are mere cover for unbelievably shoddy reasoning in the name of the self-serving ideas to which persons and institutions of this type are so prone. Like the claim that one can "never run out of" a given resource, or that involuntary unemployment "cannot exist," the insistence that "tax breaks are not subsidies" is simply a matter of market fundamentalist theologians posing as social scientists, in the way that orthodox economists have been doing for at least a century now; of the staff of right-wing think tanks and the like pretending to be public intellectuals interested in explaining the actual world rather than PR hacks for the interests that pay their salaries, utilizing theories formed and held without interest in or regard for the real world because they conveniently legitimize their political positions; of the inmates in the asylum insisting that they are really sane while it is everyone else who is crazy.

Saturday, October 27, 2018

George Orwell's Nineteen Eighty-Four: A Note

Some time ago a poll in Britain identified George Orwell's Nineteen Eighty-Four as the book that people most often lie about having read.

Naturally the ways in which we hear the book discussed often seem to miss the point. In line with the prejudices fostered by Cold War orthodoxy, they also identify it with socialism--an identification without which the book might not be nearly so widely cited and promoted as it is today.

Alas, the latter conveniently overlooks the fact that Orwell, while an anti-Stalinist, was still a socialist; and what he described in his book was not meant to be thought of as socialism but rather a pseudo-socialism intended to prevent the genuine kind from ever arriving. Put another way, it was intended to preserve the existence of inequality, a power elite, the mechanisms of oppression as ends in themselves. The key tool was the militarization of society--keeping it in a state of perpetual war.

I direct the reader--you know, the one actually reading the book rather than saying they did--to Chapter III of the book-within-a-book that is Emmanuel Goldstein's The Theory and Practice of Oligarchical Collectivism, "War is Peace." There, as Goldstein observes, the rising productivity of machine-equipped industrialized society was making it possible to eliminate "hunger, overwork, dirt, illiteracy, and disease" before very long, and indeed a "vision of a future society unbelievably rich, leisured, orderly, and efficient"--what we call the capital "F" Future of which the flying car is the symbol--"was part of the consciousness of nearly every literate person." Those in power saw that this would spell the end of any need or justification for "human inequality," and indeed for "hierarchical society"--a society in which distinction was conferred by wealth and "POWER . . . remained in the hands of a small privileged caste"--with this "privileged minority," now functionless, to be swept away by the well-fed, literate, thinking lower orders who no longer had use for them, and knew it.

Indeed, Goldstein declares that "[i]n the long run, a hierarchical society was only possible on a basis of poverty and ignorance." Deliberate restriction of economic output to prevent this proved politically unworkable (as the post-World War I stagnation and the Great Depression showed), but perpetual war afforded an option for "us[ing] up the products of the machine without raising the general standard of living," which indeed was the principal function of the eternal war amongst Oceania, Eurasia and Eastasia. After all, there is nothing like the cry "We're at war!" to make people stomach austerity, deprivation, repression; to make them think the stifling of dissent justified--with even liberal societies seeing war time and again prove the end of progressive hopes.1

This commitment to inequality and oppression for their own sake, and the extremes to which an elite fearful of its position might go to resist a movement toward a more egalitarian order of things; the recognition of how eternal emergency can be an excuse for such a state--these were arguably Orwell's greatest insights, and warnings we have ignored at our very great peril.2

1. Chris Hedges wrote about this in The Death of the Liberal Class. John K. Galbraith wrote about his own experience of this in his A Journey Through Economic Time: A Firsthand View.
2. In the closing essay of his collection On History, Eric Hobsbawm reflects on the sheer bloodiness and brutality of the twentieth century and suggests that it was as horrible as it was because it was, at bottom, a "century of counterrevolution," and no one can be more vicious than the privileged when they feel that their selfishness is threatened.

Friday, October 26, 2018

Review: On History, by Eric Hobsbawm

New York: New Press, 1997, pp. 305.

As the title of his book implies, historian Eric Hobsbawm is here writing less of history than of historiography. Academic as this subject may sound, however, this is not a collection of minutely academic articles for ultra-specialists. Over half of the pieces (eleven) are lectures given not in the classroom but at special events at various institutions around the world; two more are conference papers; and two of the pieces originally written for publication were book reviews in non-academic forums--the Times Literary Supplement and the New York Review of Books. It is the case, too, that Hobsbawm is an unfashionably staunch, forceful and persuasive defender of reason and the Enlightenment, and of the value of history and historians as well.

Consequently, despite the methodological emphasis of the book, and the fact that most of his pieces raise more questions than answers, he does not retreat into quasi-metaphysical abstraction, but keeps close to actual practice, with much to say about the investigation of specific historical problems, from social history, history-from-below and urban studies to the historiography of the Russian Revolution. His two-part "Historians and Economists," despite his protestations of being out of his depth in the subject at the start, presents an extremely well-informed and incisive critical intellectual history of the economics profession, and especially its longtime failure at, and even disinterest in, grappling with a historical reality that time and again demonstrates the utter worthlessness of their models and their apologia (fitting them, as he remarks, for the "dog-collar of the (lay) theologian").

Of course, more than two decades have passed since the publication of this collection, and even at the time many of the pieces were already a generation old. Still, if in places they show their age by treating as new what are now old phenomena (as when he writes of the arrival of Marxism in the Academy, Walt Whitman Rostow's modernization theory, cliometrics, and the break-up of Yugoslavia), they retain their interest because what is past is not past, and in some cases more present than ever--and often, more pervasive, more dominant, more damaging than before. Certainly this is the case with his critique of orthodox economists' pieties, but this goes, too, for what he has to say of other unhealthy tendencies of the late twentieth century, like postmodernism and identity politics. As he memorably writes regarding the latter, "few relativists have the full courage of their convictions, at least when it comes to deciding such questions as whether Hitler's Holocaust took place or not," while at any rate, "relativism will not do in history any more than in law courts." There, "[w]hether the accused in a murder trial is or is not guilty depends on the assessment of old-fashioned positivist evidence," and any of his readers who find themselves wrongly placed "in the dock will do well to appeal to it. It is the lawyers for the guilty ones who fall back on postmodern lines of defence."

Indeed, reading Hobsbawm I find myself remembering another book by another great of the past century who "writes of the Enlightenment without a sneer," C. Wright Mills' The Sociological Imagination. Mills was a sociologist who called on social scientists (or "social students?") to be more historically minded, while here Hobsbawm, as a historian, makes a not dissimilar case. Both were entirely right, and that they have been so little heeded has impoverished the lines of inquiry with which they were concerned, and our collective understanding of our own world.

Saturday, September 8, 2018

Shelley, Verne and Wells--especially Wells

When we recount the history of science fiction, it is common to point to some early figure as its founder--for instance, Mary Shelley, Jules Verne or H.G. Wells.

The emphasis on identifying a single, founding work by a single author strikes me as unsatisfying for numerous reasons, only one of which I want to go into here--that it makes much more sense to credit them with each defining a crucial strand of science fiction, already in place before it coalesced into a genre (in the '20s, on the watch of Hugo Gernsback).

Note that I write defining here, not founding, because it would be excessive to claim that they did something that had never been done before to any degree. Rather, they did what others may have done before, but did it in a certain way, got noticed for it, and in the process became a model for those who followed.

The Gothic writer Shelley defined the science-based horror story--where the scientific endeavor goes very, very bad. Verne defined the science-based adventure--where a discovery or invention sets us up for a thrill ride to the center of the Earth or the moon or for twenty thousand leagues under the sea which might teach us something along the way, most likely in the way of scientific facts. And Wells defined the science-based story not about some hero or anti-hero, but where, to borrow Isaac Asimov's phrase, humanity as a whole is the protagonist; the story which uses science to think about society, and considers how science, by way of ideas and technology, might change it; the story about a future that is clearly not the same as today, broadly, deeply and densely imagined; the science fiction story which can criticize things as they are, and bear the hope of progress.

Of these the last perhaps accounts for a smaller part of the genre's overall output than the other, older strands. Certainly it is less present in the more popular work than the influences of Shelley or Verne. (Horror stories, adventure stories, are the easier sell.) Still, it is Wells' strand that seems to me to not only have been far and away the most intellectually interesting, but to have given science fiction a cultural role beyond mere entertainment, or in some general way getting people excited about science; to have enabled science fiction to really matter. And it seems to me that the vigor of that particular tradition is, more than any other factor, the determinant of the health of the genre as a whole.

Are Books Too Long These Days?

Are books too long these days?

I will say up front that many of the novels that have most impressed, most affected, most influenced me were thousand-pagers. Fyodor Dostoyevsky's The Karamazov Brothers, for example. (I can't imagine Tales From the Singularity without that one.) Or Anthony Trollope's The Way We Live Now. (Which is still in a lot of ways The Way We Live Now in the twenty-first century.) Or Theodore Dreiser's An American Tragedy. (Has anything equally ambitious, sweeping, worthwhile been written about American life since?)

And reading my way through the classics, I encountered a good many that don't have a membership in that pantheon, but where I could appreciate what they were going for, and why doing it took half a million words (as it did for Victor Hugo in his national epic of France, Les Miserables, and for Leo Tolstoy in War and Peace).

Still, not every book needs to be so long as that. Not every story requires so much sheer mass. Most are better off without it. And in general I think the books that even that small minority which still reads tends to actually read--the romances and thrillers and romantic thrillers--are ill-served by the demand for doorstops. What might be a brisk entertainment instead ends up bloated and slow, and often pretentious, and I find myself nostalgic for the quick and dirty writing of a half century ago, and the still older pulps. Reading Dirk Pitt at the series' best was a lot of fun, but there is a lot to be said for those who came before him, not least that other Clark-from-New-York-with-a-Fortress-of-Solitude, Doc Savage.

The Summer 2018 Box Office: Solo and The Meg

I haven't done an overview of the summer box office in quite a while, in part because it has become so damned repetitive, in its commercial successes--and perhaps even more consistently, its artistic failures. Every year Hollywood laments its earnings, every year the more critical critics decry the shallowness and sameness and staleness of it all, every year we hear promises that Hollywood will change, and every year, it demonstrates that not only has it not done so, but that its lack of memory is utter and total, the promise unremembered.

So I'll restrict my comments on the whole tedious thing to the two films I felt I actually had something to say about: Solo and The Meg.

After over three months of release in which it has had long play in every major market (even Japan got it before the end of June), Solo remains short of the $400 million global mark, as I guessed it would after what, only in the context of the money poured into it, was regarded as a dismal weekend. What it will mean for the franchise remains very much a matter of rumor and speculation. But the shock seems undeniable.

By contrast The Meg proved that rarity, a movie that performs above expectations rather than below them, and that still greater rarity, the seemingly written-off dump-month release that proves a blockbuster. (The predictions were $10-20 million for its opening weekend in North America, but it actually pulled in more than twice the high end of that range, $45 million.) A somewhat more modest production with much more modest expectations, The Meg has already outgrossed Solo globally (the China market has helped a lot), and if the figures discussed earlier are to be believed (a $400 million break-even point, due to the advantages it enjoys as a Chinese coproduction in that market), it is already well into the black, and still raking it in (in its fourth weekend of U.S. release, still at #2).

A follow-up is not a sure thing (given the nine-figure budgets and equally hefty promotional bills, the hefty competition and the terms of The Meg's success as a bit of goofy fun, the margins are not exactly vast), but it still looks quite likely. We might even see other Steve Alten works finding their way to the big screen as a result, in what looks like at least a partial redemption of the hopes dashed way back when we first heard of the project.

From the standpoint of the business, The Meg's success falls far short of balancing out Solo's underperformance, and all it represents (the ultimate in the Hollywood franchise mentality, by way of the franchise that did more than any other to establish the blockbuster as we know it). Still, looking at the two trajectories together, I suppose there's a certain symmetry in them.
