Sunday, June 30, 2013

An Alternate World War II: H.G. Wells' The Shape of Things to Come

Reading Wells' The Shape of Things to Come from the standpoint of eight decades on, one is, of course, struck by how far the history of the twentieth and twenty-first centuries diverged from his anticipations. Certainly his expectations about the conflicts that became World War II are no exception; rather than proving the last gasp of the nation-state, those conflicts led to the birth of far more states (as a result of decolonization), and to the dominance of international politics--certainly the European politics on which he focused--by two states of unprecedented strength (the U.S. and the Soviet Union). Yet I was also struck by the number of parallels between his anticipations and what actually happened in the war--what he got right as well as wrong. Below is a list of the parallels that caught my eye.

* The war is ongoing by 1940.
* Despite widespread nationalistic hysteria, there is little enthusiasm among the civilian populations of the countries marching off to war.
* Germany's rearmament is an open process as well as a secret one.
* The war begins with fighting between Germany and Poland over Danzig.
* While war rages in northeastern Europe, Italy expands in the Balkans.
* Despite its commitments to its East European allies, France hesitates to fully enter the fray.
* While fighting Germany in the west, Poland finds itself attacked in the east by Soviet action.
* Germany and Austria are unified.
* There is a confluence between the initially separate aggressions of Germany and Italy into a common conflict between these powers on one side, and their enemies on the other, that extends "from the Pyrenees to Siberia."
* Hungary and Bulgaria fight on the same side as the German-Italian "Axis," entering the action through their participation in the invasion and occupation of Yugoslavia.
* The Axis entry into Yugoslavia is challenged by a significant guerrilla campaign.
* In East Asia, Japan invades "China proper," leading to open warfare between it and the United States.
* The war continues for several years in both Europe and Asia.
* Aerial warfare is a significant part of the conflict, with the belligerents relying on it heavily to achieve their aims. It is also highly indecisive--while still killing millions.
* The war contributes powerfully to the collapse of Europe's dominion over the rest of the world.
* The war sees the Soviet Union expand inside Europe, gaining territory at Poland's expense, and also establishing a new Soviet republic on territory taken from Romania, while more broadly consolidating a sphere of influence in the eastern, predominantly Slavic, part of the continent.

Especially given that the book was published in 1933, six thoroughly eventful years before the outbreak of open warfare among Europe's great powers, this is an impressive list of good calls, which comprise a substantial part of the war's outline as well as numerous particular details of its course--and testify to his real insight into the events of his time. Still, Wells got a great deal wrong, and it does not do to gloss over his errors, least of all those attributes of the conflict that may have been foreseeable. Two errors are particularly striking (certainly where the larger shape of the European war is concerned). The first is that he expected fewer major powers to take part as belligerents than actually did so, particularly in Europe--his scenario excluding Britain and the United States from the fighting on the continent, and keeping the Soviet Union and Germany from ever coming to blows. The second is that he anticipated a "limited" effort on the part of those actors that did fight, waging a technologically sophisticated but economically constrained war.

Nonetheless, his scenario may not have been so unlikely as it now appears. While Wells characterized Hitler as "the voice of Germany losing control," and viewed the country as having been maddened by the situation of Danzig, he anticipated a weak Germany--weak enough, in fact, that in the conflict over Danzig, it is Poland which strikes the first large-scale blow (with an air raid on Berlin), and then holds its own in the subsequent war, even with a Soviet-backed rebellion in Polish Ukraine. At the same time, he also anticipated a much weaker United States, much less able to intervene in a European war.

This was, in part, a reflection of his expectation that the Depression represented the final collapse of capitalism. Wells was aware of the American New Deal, and regarded it as a step in the right direction, but predicted that it would prove unsuccessful--not because of any flaw in the economic theory underlying it, but because he thought that American political culture, and American inexperience in such matters, would prevent the Federal government from successfully administering such a program. However, the New Deal actually returned the U.S. economy's output to 1929 levels by 1936, and left it 20 percent larger in 1940. Germany, too, achieved a significant reduction of unemployment, and the recovery of its national output, through "Keynesian" methods.

And of course, each nation was to translate that recovery into a stronger military position, and build on it, with a "military Keynesianism" of a kind Wells simply seems not to have anticipated (identifying military spending solely with economic strain and privation as he did). Germany followed this course first, in 1936, and (aided by its appearance of comparative dynamism) also attained a number of political successes that enabled it to further strengthen itself--the Anglo-German Naval Agreement of 1935, and more dramatically, other nations' acquiescence in its union with Austria in 1938 and its absorption of Czechoslovakia in 1939--all of which substantially improved its strategic and military position when hostilities actually began. So did the German army's capacity for high-speed and flexible movement, a function in part of its exploitation of the possibilities of the tank in ways the Allies had failed to match during the '30s.1 The result was not only to make a Polish attack on Germany implausible, but to make Germany's war against Poland short and successful. Indeed, it enabled Germany to turn west and win similar victories over the Low Countries and France.

Had Germany not posed that kind of threat, had it in fact been attacked by Poland, and been unable to achieve better than a draw in that fight, while France dithered about supporting its ally, one could indeed imagine Britain remaining aloof from the conflict, especially if it were distracted by colonial affairs--and the United States being that much less inclined to enter the fray, while the Soviet Union remained content with shoring up its position in eastern Europe. As it was, Germany's growing power and record of expansionism in the interwar period assured British involvement in a German-Polish war on some level. Subsequently, Germany's successes early in the war meant increasing U.S. involvement through its support of Britain, and after December 1941, its own open, direct, full-scale involvement. That involvement, in turn, greatly enlarged the U.S.'s power to act through an extraordinary spurt of war-driven economic growth, American GDP expanding over 70 percent between 1940 and 1945.

This combination of an economically stronger and expanded Germany capable of threatening all of Europe, and an economically more powerful United States capable of a major transoceanic intervention, is at the root of the other, larger error Wells made about the character of the war, namely its totalistic nature.

Wells' expectation, in line with the fashionable military futurology of the time, was that "the next war" would not be a "total" clash of multimillion-man armies and navies, but a conflict of small, technically oriented establishments emphasizing "aero-chemical" power above their other arms. On that score he, and a great many others, were wrong (not least in expecting a casual attitude on the part of the belligerents toward the use of chemical weaponry). Still, Wells was more astute than many of his contemporaries in appreciating that strategic bombing would not win the quick, decisive, cheap victory enthusiasts of the technique promised--and not for lack of trying, with the U.S. and Britain in particular relying very heavily on this approach.2

Equally, he was right about the European powers trying to avoid a totalistic conflict--as they did in the earlier period of the war, haunted as their leaders were by memories of the domestic upheavals to which World War I led. Britain and France certainly refrained from such an effort during the "phony war" phase, after which France was knocked out of the game, and Britain made a more strenuous effort only following the French collapse--an effort which exhausted its foreign exchange reserves by early 1941 (after which vast foreign, and especially American, economic aid was indispensable). Germany, too, was leery of total war, avoiding full-scale mobilization until 1943, when it was already embroiled in a losing fight against an alliance that included a fully committed United States and Soviet Union--its run of conquests instead massively subsidizing its efforts.

Putting it another way, it was a stronger Germany's series of swift political and military triumphs over its initial continental opponents (which let Germany go on the offensive so successfully as to leave Britain fighting for its life, and clear the way for an invasion of the Soviet Union), and the economic recovery, and boom, of the United States (which enabled it to not just intervene, but intervene massively), that made the European war truly and lengthily totalistic (the U.S. effort subsidizing Britain's, and along with the strains of the Axis-Soviet war to which Operation Barbarossa led, contributing to Germany's reacting in kind). All of this resulted in a conflict that was shorter than Wells anticipated (ending in 1945 rather than 1949), but far more decisive (with the Allies securing the formal unconditional surrender of the Axis countries, then proceeding to occupy those states and give them entirely new political institutions rather than merely suspending hostilities), and by some measures also more horrific (perhaps 30 million dying on the Eastern Front alone).

By contrast, had the war remained a continental affair among relatively anemic military actors, history might have gone rather more like Wells' anticipation. Indeed, like many a well-reasoned vision of the future which did not actually come to pass, his scenario retains a great deal of interest as a counterfactual long after it has ceased to interest us as an anticipation of "things to come."

1. This failure is ironic given that Wells himself offered an influential presentation of the concept of the tank in the 1903 short story "The Land Ironclads."
2. Giulio Douhet, one of the foremost such thinkers, argued in his classic The Command of the Air that a squadron of twenty planes would suffice to "break up the whole social structure of the enemy in less than a week, no matter what his army and navy may do." (This quotation is taken from the Dino Ferrari translation published by Coward-McCann in 1942, where the remark appears on page 142.)

Reflections on William Gibson's Distrust That Particular Flavor

On first contact I found William Gibson's fiction deeply frustrating. One reason for this was stylistic. (In the pieces with which I first came into contact, like the short story "Johnny Mnemonic," clarity and flow were sacrificed to the principle of showing as flashily as possible--and this was certainly one writer who could have told a good deal more than he did.) However, the content of his stories was also an issue. His Sprawl stories and novels seemed to touch on innumerable major issues, from the Cold War to corporate power to the widening gap between the classes, but he never really said anything about these things.

Gibson was, in short, a postmodernist. Since then I have increasingly thought of him--not just today's Gibson, but the Gibson we've always had--as a postmodernist first and a science fiction writer second, with his remarks in interviews reinforcing the impression.1 Reading Distrust That Particular Flavor, the first-ever collection of the small but noted body of nonfiction he has accumulated over the years, has only confirmed it.

As those who have made its acquaintance before know, Gibson is not a scholar or a journalist. Rather what he does in his nonfiction is offer reflections on a handful of experiences--particularly through pieces of travelogue (most of it Japan-set, though his feather-ruffling account of his famous visit to Singapore, "Disneyland With the Death Penalty," is included), and accounts of his contacts with various bits of information technology and media (like "William Gibson's Filmless Festival," in which he shares a number of movies shot on video with his daughter).

At their best, these pieces can be read as a collection of glittering bits, offering razor-sharp insight in razor-sharp prose, and on occasion even engaging on a human level (as in his tribute to actor Takeshi Kitano, "The Baddest Dude on Earth"). At their worst they seem hopelessly self-indulgent, like "My Obsession," his piece about a period of addiction to shopping for mechanical watches on eBay (which Gibson himself admits went on way too long). Fortunately most pieces come closer to the former than the latter, and while the book is not all that I once hoped Gibson's first work of nonfiction would be, I still found it a worthwhile read.

1. To put it bluntly, he always seemed to speak at length about trivia (like using eBay), rather than the Big Issues about which I always hoped he would say something.

Monday, June 24, 2013

Bad Teacher 2?

WARNING: MILD SPOILERS AHEAD

I rarely expect much from Hollywood comedies anymore, and 2011's Bad Teacher was no exception. Teachers, after all, are a shamefully easy, "safe" target, on the receiving end of several of American society's animuses--toward intellectuals, toward government and government workers, toward organized labor (and especially public sector unions), toward liberals (which many a conservative detractor incorrectly assumes all teachers to be). And because of their vulnerability (as a not merely maligned, but ill-paid and relatively unprestigious profession), they are prone to be blamed for every failing of American education, real or merely perceived.

Still, there is no denying that the education system, like any other major institution, is a fit subject for satire. Our debates over curriculum and educational methods, from the math wars to standardized testing; the tension between the "convenient social virtue" society is prone to demand of teachers, and their needs and rights as human beings and professionals; the sociology of the classroom, the educational bureaucracy and the broader conditions under which educators work; the inequality between the schooling given the poorest and the most privileged--all this can plausibly make for a movie about something.1

Bad Teacher does not go this route, however.2 The failings of the titular figure, Elizabeth Halsey, are purely individual and anomalous. She is simply a lazy, conniving "bad girl" who somehow got hired as a teacher, and somehow never got weeded out from among the flaky, naive do-gooders who are her colleagues--not that she has any desire to continue in this career. Her real aspiration is to marry money so that she can give up teaching and live in comfortable leisure. This leads to a pair of overlapping love triangles--the first involving Elizabeth, substitute teacher Scott Delacorte and Elizabeth's colleague Amy Squirrel, the second Elizabeth, Scott and gym teacher Russell Gettis--which have Elizabeth scheming to keep Scott and Amy apart, and to buy herself the breast augmentation surgery she thinks will let her win over Scott (whose principal attraction is his coming from a wealthy family).

Of course, love triangles, ill-conceived money-making schemes and general bad behavior are standard comedic material, but even as "turn your brain off" entertainment I found the results underwhelming. The characters were one-note and often annoying (particularly Delacorte and Squirrel), the jokes mostly flat (with Halsey's un-p.c. remarks merely seeming incongruous rather than funny, sounding as they did less like the un-p.c. remarks of her generation than like her cranky old grandfather's). And to top it all off, the film, wholly dependent on Elizabeth's consistent greed and mean-spiritedness for its few laughs, has her undergo a third-act change of heart so unconvincing that it feels like a parody of the kind of commercial hokum director Jake Kasdan sent up in his earlier The TV Set (2006).

Nonetheless, the film was a hit as these things go, pulling in over $100 million domestically and another $115 million overseas (no Hangover, but still decent given the $20 million budget), and finding enough enthusiasm in Hollywood for a continuation that CBS has a sitcom based on the material premiering in the fall, and Sony has now announced a sequel.

I wonder about these decisions. It is far from clear to me that the concept has a hundred episodes in it, especially with the sexuality and language unavoidably toned down for network television.3 Equally, the sequel seems a risky proposition, because it is bound to cost more--but not necessarily bound to sell more tickets. The margin between budget and gross is not all that high here, and I am not at all sure that this is an idea which left viewers begging for more (indeed, the audience's affection does not seem to have been overwhelming). Also unhelpful is the conclusion of the first film, which had a (somewhat) reformed Halsey give up teaching to become the school's guidance counselor, so that she is no longer so bad as before, nor a teacher, complicating any continuation. Certainly it is difficult to picture the sequel doing well at the box office while the sitcom of the same name is airing--a potential competitor rather than complement, and perhaps a drag on the film if it is not well-received.

If Hollywood really does mean to turn this one into a large, profitable franchise, those least powerful of Tinseltown's creative people, the writers, will have a lot of difficult work ahead of them.

1. As John Kenneth Galbraith put it in his classic Economics and the Public Purpose, the "convenient social virtue ascribes merit to any pattern of behavior, however uncomfortable or unnatural for the individual involved, that serves the comfort or well-being of, or is otherwise advantageous for, the more powerful members of the community. The moral commendation of the community for convenient and therefore virtuous behavior then serves as a substitute for pecuniary compensation." This is "widely important for inducing people to perform unpleasant services." Nurses, for example, are expected to accept such commendation "as a partial substitute for compensation," while "such merit was never deemed a wholly satisfactory substitute for remuneration in the case of physicians."
2. In short, Bad Teacher proves to be pure Bucket Brigade, saying as much about education as Anchorman says about journalism and broadcasting, The 40 Year Old Virgin says about celibacy, or Step Brothers says about why people continue living with their parents well into adulthood--which is to say that not only does it say nothing, but it does not even appear to have been interested in saying anything. Instead the film is a showcase for the writers' and actors' trademark lines and gags (like the bizarre exclamations labeled "Ron Burgundyisms" on Urban Dictionary).
3. Premium cable, of course, would be another matter.

H.G. Wells' The Shape of Things to Come: Eighty Years On

For much of his career H.G. Wells was a founding proponent of what Patrick Parrinder termed "the scientific world-view"--a sense that the nineteenth-century liberal order (a balance of power among nation-states preserving the peace, a capitalist world economy) was dead, and that the way forward for the species lay in rationalism, socialism and international organization. Wells advanced this view in both his nonfiction and his fiction, across a range of genres, but gave it its fullest treatment in the 1933 book The Shape of Things to Come. He presents Shape as the "dream-book" of Philip Raven, a British civil servant who recorded dream visions he had of a twenty-second century history textbook which tells of the founding of a future "World State."

Its story is, in a way, the story of all human history, the book looking back to antiquity for the origins of the idea of the World-State, and finding them in the universalism of the religions which emerged in the first millennium BCE, and in the dissent of prophets taken for insane or inspired, and always inconvenient (with Plato's Republic a crucial moment in its depiction of an elite committed to service rather than self-aggrandizement in implementing a social order). These forces continued to develop, until by the seventeenth century an increasingly critical turn of mind was evident, and with it the beginnings of modern scientific rationality and criticism of existing institutions--monarchy, aristocracy, religion, even property.

Alongside these important cultural and intellectual changes, advancing technology and growing organization made what was previously an ideal into a practical possibility and necessity by the early twentieth century. By this point, Wells' textbook argued, capitalism was translating increases in productivity into unemployment, weakened consumption and slumps--a self-defeating pattern which appeared to have culminated in the Great Depression--while avoidable poverty continued to stunt human life. Meanwhile advancing military technology and the increasingly totalistic character of warfare (demonstrated in World War I) made armed conflict more costly and dangerous for human civilization, with the risk of such conflict exacerbated by nationalistic and militaristic ideas, by what we today call military-industrial complexes, by the worst of tradition and habit (religious beliefs, sexual attitudes, etc.), and even by the physically and psychologically unhealthful conditions of everyday life in the era (all the way down to bad hygiene and noise pollution).

Wells also pointed to widening recognition of the problems, and of the ways in which they could be redressed, in meaningful but flawed developments like Henry Ford's Peace Ship, the League of Nations, the London Economic Conference and Franklin Delano Roosevelt's New Deal--and in the emergence of explicit alternatives to liberalism, on the left in Soviet Communism, and on the right in fascism in Italy, Germany and elsewhere.1 Nonetheless, the limitations of these initiatives and movements mean that the world fails to recover from the economic crisis. Instead the world's economy continues to fall into greater disarray, resulting in declining living standards, deteriorating public services, and rising crime and insecurity of all kinds, extending even to a resurgence of maritime piracy. Moreover, the worst in these new movements comes to the fore when Nazi Germany's dispute with Poland over Danzig escalates into shooting, sparking a general European war that, along with the ongoing conflict in east Asia, saps what remains of the world's economic vitality, and leaves the world wide open to a plague that kills off half the planet's population in the 1950s.2

This collapse of the old order creates a space in which a new one can emerge, the groundwork for which is laid by a practical-minded technical-administrative elite centered on the control of the world's long-distance communications. This "Air and Sea Control," the members of which recognize that the more traditional objects of such careers have been mooted by the new realities, works instead for the revival of the world economy and the building of a World State, a process formalized in a 1965 conference at Basra. This dictatorship, up against the remnants of the old order of capitalist enterprise and sovereign states (remnants which actually draw strength from the revival of the world economy), manages to pragmatically, patiently and consistently outmaneuver its foes, only rarely having to employ any significant force to get its way. (Indeed, it manages to circumscribe privately owned business and let it fade away rather than outlawing it.)

Of course, after this point there are continued difficulties as the World-State goes about the immense practical work of achieving material plenty for all, eradicating disease, and educating its citizens for life in the new world, while still facing attempts at subversion from those opposed to it, resulting in thousands of political murders and executions into the 2030s (during which the conduct of the security organs was not always unimpeachable). Later the World-State itself became a problem, under the administration of a stifling (if benevolent) "Puritan Tyranny" that was so "consumed by an overwhelming fear of leisure both for [itself] and others" in a world where so many of the old constraints on life had been eliminated that it bowdlerized literature and the arts, and "INVENTED work for the Fellowship and all the world."

Nonetheless, this too gives way as the World-State withers, the Central Council increasingly disregarded by the subordinate Controls overseeing various aspects of human life (education, health, etc.)--an administration of things rather than people. The process is formalized when the Council is officially retired in the 2059 Declaration of Mégève, which declares "the Martyrdom of Man . . . at an end" and "the sunrise of the hour" of united humanity in a world without superstition, tribalism and want, freer, happier--and saner--than the people of previous centuries had even dreamed it could be.

As might be expected from Wells' earlier success as a popular historian, his historiography is impressive--thorough and lucid and accessible. As might also be expected given his brand of science fiction, his interweaving of the events of the past, and his present, with his record of things to come, feels seamless even today, enough so that even someone well-read in the history of the time can have a hard time figuring out just when he stopped writing as a historian and started writing as a scenarist. A grand testament to both his vast imagination and underappreciated technical skill, it endows what he describes with an astonishing verisimilitude.

Nonetheless, we know that history did not take the turn he anticipated. Instead the Second World War proved to be a far more intense conflict than he expected, which led not to the dissolution of the nation-state system, but to its reorganization at a higher level--the old circle of Great Powers mostly folded into the alliance systems of the U.S. and the Soviet Union. Moreover, the Keynesian policies he underestimated helped lay the foundation for the American economic recovery that proved crucial to the Allied victory in World War II, and for the unprecedented global boom that followed the conflict.

Of course, that order, too, did not last, the world in the 1970s turning to neoliberalism. Since then the world economy has been increasingly crisis-ridden and anemic, prone to widening inequality, and ecologically strained in ways he did not imagine, with the problem raised by nuclear weapons in the 1940s (sooner than he feared) unresolved. It is a perilous situation, leaving many once again convinced that this state of things cannot last, but there is little enthusiasm for the sorts of solutions he proposed, and the idea of a future remotely like the one he described seems as implausible now as it has ever been.

Even the attempt to imagine something like it reflects this sensibility, as W. Warren Wagar's A Short History of the Future demonstrates. Modeled on Wells' scenario, but updated for our times, it envisages the world's economic and political crises leading to strategic nuclear warfare on a far more crowded planet; a much more painful rebuilding process, up against far more violent resistance; and much more problematic consequences, full of survivals of the older irrationalism, with anything like a new humanistic consensus beyond reach--postmodernism apparently here to stay.

Some of this is an understandable reaction to what now appears to be Wells' overoptimism about some things, like his confidence in Science as well as the sciences, the idea that these have to be on the same side. (I think of George Orwell's response to this attitude with the charge that in comparison with Britain the Nazis were both more barbaric and more scientific.) Certainly his confidence in technocrats, and in the "plasticity" of human consciousness and behavior, far exceeds anything one is likely to encounter today. Indeed, so much is this the case that Wagar himself confesses his version of events has a look of "Augustinian pessimism" about it--and that is the difference between our troubled times and the period in which Wells wrote his classic book.

1. The Soviet Union, in his view, was a promising beginning distorted when the "infinitely practical" Lenin died and was succeeded by the dogmatic Stalin, and with him, the primacy of politics over technical rationality, and nationalism over internationalism, so that it failed to rise to the role it should have played.
2. Interestingly, Wells made no direct reference to the influenza epidemic which followed World War I and took a far heavier toll of human lives than the actual fighting.

Wednesday, June 19, 2013

These Cyberpunk Times: The Travails of Detroit

Watching the city of Detroit subjected to a wave of privatization by an unelected emergency manager, I cannot but think of Robocop 2--not as good a film as the original in most respects, perhaps, but certainly the more audacious social satire. And in its scenario, far more prophetic than anyone expected at the time, for all the goofiness of its presentation.

As it turns out, a good many others have had thoughts along similar lines, Aaron Foley recently writing of a proposal to erect a statue to the iconic character in that city.

Learning the Novelist's Craft

When first getting acquainted with the output of the "how-to-be-a-writer" industry, I read on a number of occasions that it commonly takes something like a half dozen manuscripts and ten years to get a book published--or at least, to write a "publishable" book.1 Such data as I have seen since then has convinced me that they were not all that far off the mark.

Of course, the reason why this is the case is that it takes such time to hone the relevant skills to the necessary degree. After all, novel-writing not only demands the skills associated with fiction writing in general (like the ability to tell a story, describe a scene or develop a character on the page), or writing in a specific genre (like the ability to create suspense that a thriller writer needs), but their exercise on the larger scale required for a book-length story, which means a writer's managing a number of interconnected subplots, coordinating a larger cast of characters, and overall presenting the broader field of view and deeper detailing that go with the form. The differences are not just quantitative, but qualitative, and entirely new and different skills are involved.

Still, those would-be purveyors of advice never said why it should take so long to pick these up. Rather, as I slowly learned, the answer lay in what they didn't say. Search as I might in the how-to books for some tidy explanation of what exactly one has to do to produce a hundred-thousand-word narrative, I never found it--because writing is more art than science, with little lending itself to formulas, and even the apparent formulas of little use in themselves.

Compare, for instance, the position of an English student learning how to construct a functional sentence with that of another learning how to write a thesis statement. In the former case, the issue is reducible to the expression of a complete thought with a main clause containing a subject and predicate, accompanied by the appropriate punctuation and connections (e.g. capitalize the first letter of the first word, use commas to connect your clauses, finish with an end mark)--something one can execute and judge with almost mathematical precision.

Alas, no explanation of how to write a thesis statement is as compact and tidy as that. In the best of cases the student is given not a clear-cut prescription, but a theory of what constitutes a thesis, likely to be a much more complex thing (e.g. "A thesis is a claim about a topic with room for debate, based on a rational analysis of the facts"). One can, of course, judge whether or not someone has written a thesis in a reasonably objective way, but actually assimilating what it takes to produce one is a much more demanding thing, reliant on a good deal of intuition, personal observation, and learning by doing.

This is all the more the case with the writing of fiction. One typically learns to write novels by groping one's way toward an understanding of the form--an understanding which comes less from the digestion of pithy advice than from reading a good many such works--and then by actually writing novels. The sheer length of a novel means that this is a slow process, a single first draft easily the work of months or years, without counting the irregular and nonlinear pre-writing process, or the editing and revision that inevitably caps a serious effort. And of course, just as in the mastery of any other skill, several repetitions of the task are likely to be necessary before the result becomes tolerable.

Moreover, the process tends to be extended even beyond the amount of time it takes to write the requisite hundreds of thousands of words, for several reasons.

One is that the aspiring writer is unlikely to have their training financially supported. This means that they are likely working at something else for income, which reduces the number of hours--and especially fresh, energetic, clear-headed hours--they can allot to their writing. (I know of one novelist--with a day job--who writes two hundred words a day. He will necessarily be less prolific than someone who, able to commit to writing full-time, can put in two thousand.)

Another reason is that one does not necessarily jump from one novel to the next. There are inevitable gaps, just as in any other course of work--because life gets in the way, and because, contrary to the celebration of stupid persistence so congenial to the self-help culture, a writer does sometimes need to go and do something else. People can get burned out writing, just as they can get burned out doing anything else, and the problem is exacerbated by the setbacks and disappointments with which the process is fraught--especially when one is not just writing, but pursuing publication. Submitting one's own work is not just time-consuming (and remember, any time spent submitting queries is time not spent actually writing), but typically a dispiriting activity that will frequently have the aspirant licking their wounds--again and again and again, for many years.

It doesn't help that the learning process tends to be lonely. Yes, we all hear tales of people gifted with wonderfully supportive friends and family, but the reality is that people who are not writers generally do not understand what writers go through, and rarely have much sympathy for their tribulations (as the writer tends to find when they need to vent). Authors do command a measure of respect if they get rich and famous (despite which even the rich and famous have their insecurities and frustrations), and loved ones will likely put up with one who is at least paying their bills with their writing--but an author who is as yet unpublished tends to be treated with suspicion and contempt, as a malingering pseudointellectual who ought to be doing something more "useful" with their time. This doesn't help one's efficiency.

Unpublished authors also suffer in another way from that isolated position. Unlike an established writer with access to agent and editor and a slew of friends in the business, they do not get the benefit of personalized, professional advice. They may not even have anyone who can be a willing and able sounding board for their ideas, let alone offer meaningful feedback on their manuscripts. This leaves them having to figure out much more for themselves, doing much more of their own editing (another dispiriting activity), and makes them that much more likely to squander a lot of time and effort on artistically or commercially dubious ideas.

And so a decade goes by, just like that.

Of course, some writers do not need a half dozen manuscripts or a decade to reach the point at which they write "publishable" novels. Why the difference? Exceptional talent is always one possibility, though I am doubtful that talent spares anyone who actually pursues this course from having to suffer through a great deal of drudgery and frustration. A more likely possibility is that the author came to their first manuscript, or their third, with their skills relatively well-honed by other activity. They have already read more and written more than their counterparts by that point, and so are better-equipped to make the attempt. (Certainly a writer who pens their first novel-length manuscript at fifteen is likely to produce something quite different from what they would have written at twenty-five or thirty-five, especially if they have been doing other kinds of writing in the meantime.)

Some writers also get lucky, sidestepping a good many pitfalls without even knowing it, or learning more from their mistakes, or bouncing back more quickly from their failures, or having good fortune in their advisers, their personal support system, their access to privacy and quiet, or even the quantity and quality of leisure time they are able to devote to the effort. Some also get lucky in having a subject that makes the work easier, one that brings them closer than others to the ideal of "the story that writes itself." And some are fortunate in not having to go all this way on their own, but being able to get an agent or a publisher to take on a relatively raw manuscript, which they get to fix up under the close guidance of the pros, who are there for them again on the next attempt.2

Unfortunately, good luck is not a career plan.

1. Of course, publishable books often do go unpublished--while books that some might deem "unpublishable" do make it into the mill, and even reach the bestseller lists. The concern in this post is solely with the problem of writing a passable manuscript, getting published being an altogether different matter.
2. One other possibility is that the difference is not in the amount of work put in, but in the ways in which authors number their manuscripts. One writer may count the fifth massive rewrite of his first manuscript as still that first manuscript, while another might regard it as his sixth effort.

Monday, June 17, 2013

The Politics of Continuum, Part II

While the television show Continuum has drawn attention for its depiction of North America as a corporatocratic police state in the late twenty-first century, its attitude toward that milieu is what seems to have drawn most of the comment. A primary reason for this is that its protagonist (Rachel Nichols' Kiera Cameron) is a policewoman who came back in time to 2012 in pursuit of Liber8, a rebel group fighting the corporate order (led by Tony Amendola's Edouard Kagame), whose members are depicted as criminals and terrorists. One might consequently expect that Cameron's character is being presented as the hero, Kagame's group the villains--a role-reversal atypical for a genre which ordinarily has the good guys fighting dictatorships and police states rather than defending them.

The resulting ambiguity has been compounded by what the show's creators and cast have had to say about its politics. Their remarks suggest a relativistic neutrality in the conflict between Kiera Cameron and Liber8, the two sides' attitudes treated as equally valid. This is no doubt meant to seem intriguing and daring, but in practice such "equal time" treatment of conflicting understandings and experiences has often distorted rather than clarified (as journalistic coverage of issues like climate change constantly reminds us).

Indeed, there are grounds for taking issue with the intent itself. Such a stance, after all, can be taken as a very convenient way of evading the writer's responsibility to seek the truth (again, as the lamentable state of contemporary journalism regularly reminds us). One can even contend that the willingness to give "equal time" to a dictatorial, human rights-abusing corporatocracy and its opponents is problematic from an ethical perspective, while also reflecting biases so deep-rooted as to scarcely be noticed. (Consider, for instance, how the show's relativistic stance would have been received were the secret police and the dissidents butting heads in a Communist society rather than a corporatocratic one.1)

What the show itself says about the world of 2077 at the outset is also worth noting, and possibly quite telling. Where the origins of the corporatocracy are concerned we are told that business took over an insolvent government in the course of bailing it out financially, a narrative which evokes the idea of efficient, market-disciplined private firms on the one side and government made profligate by its pandering to the voting rabble on the other.2 This premise at the least suggests that the show's economic thinking is not merely of the right (whose sympathy for business and wealth does not necessarily equal approval of corporatocracy), but of those explicitly elitist and overtly anti-democratic quarters of the right where one might most expect to find sympathy for a fully privatized society.

Reinforcing this view is the fact that this understanding of economic life is never problematized, let alone challenged, at any point during the show's first season.3 Certainly I cannot think of any aspect of the show's future history which similarly reflects any critique of capitalism, corporate power or the tensions between these and political democracy (the Liber8 rebels instead having little to offer but resentment over inequity).4 This makes it at least plausible to argue that the show falls short of the "equal time" standard proclaimed by its creators, the writers taking the case for a Corporate Congress rather more seriously than they do any alternative for which the rebels might be fighting.

1. Simon Barry remarked in an interview that "I think that if you were to ask Rachel's character if she lived in an oppressive society, she would say, 'No.' And I think that's kind of the point. Our Liber8 freedom fighters/terrorists, if you will, have a different opinion."
2. It is also quite far from the reality of recent events. What we saw in 2008 was the opposite--profligate businesses run by corrupt executives bailed out by government, governments which arguably run in the red because of their pandering to business and the wealthy, exempting them from their fair share of taxes, showering these groups with giveaways of other sorts, and of course, enduring the revenue shortfalls that go along with the mediocre economic growth that has tended to follow in the wake of "pro-business" policy.
3. Indeed, it is worth noting that the scenario dramatizes at least one fear strongly associated with the populist right, namely the absorption of the United States into a North American Union.
4. Such a reading also seems to me reinforced by the presentation of farm boy and technical genius Alec Sadler as an Edisonade-hero-in-the-making (rather than a youth born to privilege, whose ascent to the uppermost strata of the North American Union is a function of his having picked his parents well), implying the meritocracy that is the justification generally given today for such extreme differences in wealth and status.

Thursday, June 13, 2013

Watching Doctor Who at Fifty

I will start off by saying that this post is not a retrospective about the series. I simply lack the context to provide that, having as I do virtually no familiarity with Doctor Who during its first twenty-six years. Instead it is a consideration of the show in its current form, which is frankly all I really know firsthand.

My first encounter with Doctor Who, after all, was in the American-made TV movie that aired on FOX in 1996. I remember little of it except what seemed to be the titular character's wandering a hospital without his memory for most of its running time. Needless to say, it left little impression on me and, I'd thought, on anyone else, and I was later surprised to find that it was regarded as canon, with Paul McGann officially counted as the Eighth Doctor. Recently reading a summary of that movie, it struck me that this may have been because I came to the film totally unfamiliar with the character's history (I'd never so much as heard of The Master), and that I might have had a more favorable impression had I come to it as a longtime fan--but in any event, I did not encounter the franchise again until Syfy Channel aired series one of the revival in early 2006.1

Once again, I didn't know what to make of the franchise.2 For one thing it seemed rather light on concept, lacking the density of world-building or intellectual play I found in shows like Babylon 5 or Lexx. Much of what the show did present seemed to me very old-fashioned (the Daleks something out of a '50s B movie), while other ideas appeared just plain silly (like the Autons the Doctor confronted in the first episode of the revival, "Rose"). For another, the Doctor's four-dimensional tourism seemed aimless and casual next to the more purposeful sagas I was used to--and surprisingly Earthbound, the hero of this space opera doing much of anything off-world for just four of season one's thirteen episodes. Instead he spent a lot of time in contemporary Britain, and when traveling to other times, tended to stick with British drama's most conventional choices (e.g. the Victorian era, World War II), which seemed rather limiting.3 Even on a visual level the production left me underwhelmed, neither lavish enough nor exotic enough to stand out from the then rather thick crowd of TV space operas.4

Still, I was a fan by the end of the first season. I suppose it just took a while for me to catch its wavelength: its penchant for Douglas Adams-like zaniness and whimsy, its faithfulness to its inheritance, its Britain-centricness.5 (I'd watched a lot of British television before, but little of it genre television, and while I suppose I'm rather more up on British history than most on this side of the Atlantic, once in a while there was a reference to something I didn't fully get, or wasn't familiar with from before.6) It took a while, too, to get to know the protagonist (who would be intolerably Mary Sue-like if he were not a nine-hundred-year-old alien of exceptional charm and generosity), and the dynamic between Doctor and Companion, and the rhythms of the story arcs. The stories got bigger, the visuals better. And while I still regard it as intellectually and dramatically lighter stuff than many past favorites, packed with implausibilities and dissonances best not examined too closely, every so often it offers a clever idea (as in "Blink"), or a genuinely moving moment (as in "Vincent and the Doctor"), while presenting its humanism with rather more conviction than the later Trek series managed, and maintaining a cheerfulness that is downright refreshing in this age of dark-and-gritty-everything-with-a-side-of-still-more-dark-and-gritty. It has all made for an appealing enough combination that I find myself looking forward to season eight in a way I would not be to the announcement of an anemic new small-screen Trek, or yet another clone of the Galactica reboot.

1. As indicated by the lengthy article devoted to it on Wikipedia, and its 5.9 score over at the Internet Movie Database, others responded rather more favorably.
2. It is worth remembering that I had long since read a large portion of the genre's print classics (a fair amount of newer stuff included), and taken in such shows as the famously idiosyncratic Lexx, so that this was not a case of a viewer being confused about what to make of a show simply because it "was not Star Trek."
3. Of course, series four of Lexx was set on contemporary Earth - but that followed three seasons of outer-space adventure.
4. For nearly two decades, from the late 1980s to the late 2000s, there was always at least one first-run North American space opera in production, and around the turn of the century, often a half dozen. However, no such show has been on the air since the cancellation of Stargate: Universe.
5. Interestingly, Adams was to be a direct influence on the show's course in a number of ways, not least through his writing of several episodes.
6. Case in point: in the second series episode "The Idiot's Lantern," the plot hinges on the coronation of Queen Elizabeth being a milestone in the history of television, something of which I had not previously been aware. Prior to that, in "The Doctor Dances," when the Doctor followed up "beat the Germans, save the world" with "don't forget the welfare state!" I had been less than certain of the connection implied between the former and the latter--not having yet read Angus Calder's outstanding history of the era, The People's War.

Sunday, June 9, 2013

Of Right and Left Transhumanism

The way we think about the possibility of transcending "the human condition" strongly reflects the way in which we think about humanity and society in general. Transhumanist thought has been no exception, evident across the political spectrum--with everything but the fundamental premise varying in line with the thinker's other political attitudes. In fact, it seems possible to speak of a transhumanism of the political right, and a transhumanism of the political left, the key difference between which is the context in which each would see technology utilized to effect fundamental changes in human beings. Rightist transhumanism seeks to achieve such change within the existing social order (e.g. capitalism), and where the alleviation of problems like poverty or pollution is concerned, anticipates that the new technologies associated with it will obviate the need for social and political change. Leftist transhumanism regards such change as part of a broader project of human liberation, likely to follow the achievement of a more equitable society.

Ray Kurzweil's Singularitarianism is a clear case of the former, positing as it does our experiencing twenty thousand years' worth of change within the space of a century--and still finding ourselves living with a capitalist economy, while the solution of our environmental problems depends on the improvement of technology in response to market imperatives, rather than social and political innovation (like changes in values or the emergence of new institutions). By contrast, Olaf Stapledon's Last and First Men and Star Maker, or W. Warren Wagar's A Short History of the Future, present the leftist version, in which the modification of the species follows the achievement of a "good society" which has moved past the inequities of our own.

Just as elsewhere in our intellectual life these past several decades, the ideas of the right have prevailed in this discussion, reducing the left to offering little but criticism of the visions of Kurzweil and company. Nonetheless, Ken MacLeod's recent piece in Aeon Magazine contending that socialism, or something like it, is all the more necessary in an age in which our already contentious identity politics have been made much more so by the addition of substantial biological diversity hints at the possibility of movement in this direction.

Saturday, June 8, 2013

Sex on Screen: "The Joylessness of Sex" on Television

I previously noted the decline of sex as a theme of blockbusters at the American box office, and the disappearance of gratuitous nudity as an element in major hits.

Of course, this has partly to do with the unprecedented abundance of sex on television. Still, as Ginia Bellafante notes over at T Magazine,
what's striking about the current depiction is how much of it just isn't sexy — how much of it is divorced from any real sense of eroticism or desire. The audience, at home in bed in need of diversion, is betrayed. What they get instead is sex that is transactional, utilitarian — the end product of a kind of twisted careerism.
This is less surprising than it may seem. Even as some applaud a new age of openness, and others decry the sexualization of culture, society remains at bottom deeply sex-negative. In those quarters where religious-traditionalist hostility to sexual imagery has weakened (or at least, sunk beneath the surface of everyday life), postmodernism and identity politics have filled its niche. Terms like "objectification," when tortured far past any useful meaning in the way that has become routine, can virtually outlaw such concepts as physical attraction, sexual fantasy and even sexuality itself.

Whether intentionally or unintentionally, proponents of such ideas have had their effect, one which has naturally been manifested in our cultural production. Sex as a set-up for degradation or self-destruction, sex as a temptation to be resisted or a display of weakness, even sex as banality and disappointment, whether treated comically or tragically or tragicomically, is broadly accepted. Sex as an occasion for embarrassment or frustration or gross-out gags (as in "raunchy" comedies), sex scenes which make the viewer's skin crawl (Ms. Bellafante describes several), are what we can expect. The idea of sex and sexuality as a source of pleasure (for the characters, or the viewer) is regarded with much more apprehension, and within the mainstream, is rather more elusive, even as depictions of sex have become more frequent and graphic and accessible.

Once again I remember Roger Vadim's remark that "Hollywood is sometimes licentious, but always puritanical."1 And that hasn't changed. If anything, that characterization seems truer of the place than ever.

1. Also consistent with this sensibility: sex as a prelude to the monster or killer striking like some cosmic punisher (as in horror movies), or sex as indiscretion or transgression for which someone will soon pay dearly (the basis of the erotic thriller and much other drama).
