Wednesday, July 3, 2013

What Would You Like to See on This Blog?

The title of this post says it all--I am asking what you, the readers of Raritania, would like to see me do here, whether that means more of something or less of something, whether you have in mind a single-post treatment of some topic or a longer-range shift in emphasis, or simply a change in the design or the line-up of feeds in the blog-list at the side of the page. I am interested in what you didn't like as well as what you did.

To get your message to me, just leave a comment below (or if you prefer, send an e-mail to the address provided on this page). I cannot assure you that I will be able to use all of your suggestions, but I do promise to answer every comment and question appearing on this page, the comments section of which will, I hope, become a forum for such ideas.

On Warehouse 13's H.G. Wells

I have recently had occasion to return to the work of H.G. Wells--and found myself thinking of Warehouse 13's inclusion of a Victorian figure by that name in the core cast.

The show's premise regarding the character is that H.G. Wells is not the famous writer we knew, but his sister--never mind that he had no sibling named Helena, and for that matter, no sister by any name in his lifetime.1 The show then goes on to claim that the man we think of as Herbert George Wells was really Charles Wells (why we thought of him as H.G. is never explained), and made his fame with Helena's ideas, for which he claimed full credit--again, never mind that Helena was bronzed in 1900, while the H.G. Wells we know not only lived another forty-six years, but was productive throughout that period, giving the world such novels as A Modern Utopia (1905), The War in the Air (1908), Tono-Bungay (1909), The History of Mr. Polly (1910), The New Machiavelli (1911), The World Set Free (1914) and The Shape of Things to Come (1933), and nonfiction works like Anticipations (1901), The Outline of History (1920) and The Science of Life (1930), an opus hardly reducible to pre-1900 inspiration.

For anyone who assumed a connection to Wells' actual biography this can only seem confusing, the liberties the show takes too great for the concept to work as a compelling secret history (not that Wells is much of a candidate for such a story anyway, given how public a figure he was for a very long time). So what, one wonders, is going on here? Are the writers of this famously concept-light show playing a postmodern game with the audience?

If this is a game, it does not seem much of one, actual use of Wells' character or ideas being very scarce, as was probably inevitable. For instance, how likely is American television to feature a character making the case for a secular, socialist world state? (The closest we come to this is Helena's expression of disappointment at the state of the world she found a century after her bronzing.) No one seems to have noticed the shocking disparity between the rationalism for which Wells so famously stood, and the magical artifacts with which the warehouse deals.2 And even superficial reference to the stuff of Wells' best-known books is rare. (We do see Helena employ a time machine of her own design at one point, but it bears no conceptual resemblance to the device of The Time Machine, or even physical resemblance to the iconic realization of it in the 1960 film version of that book, and in the end the plot of the episode looks like little more than an excuse to send Myka and Pete back to the era of Don Draper, lazily milking the Mad Men cult just like everybody else on American television.)

And so it seems less an attempt at a game than the mere spicing up of the show with a character named so as to grab a bit of unearned attention from the steampunk-inclined. On the whole I do think the show is better off for Helena's (and Jaime Murray's) presence in it, but it has to be admitted that tying her character up with H.G. Wells in the way that they have is rather lazy and cheap--and for an admirer of Wells, rather grating in its disrespect for the great author and his achievements.

1. Wells' parents in fact had only one daughter, Fanny, who died two years before Herbert George was born.
2. George Orwell famously paid tribute to Wells as the English-speaking world's prophet of reason in the first two decades of the twentieth century in "Wells, Hitler and the World-State"--in which he ironically went on to lament that "since 1920 he has squandered his talents in slaying paper dragons."

Monday, July 1, 2013

Filming John le Carré's The Honourable Schoolboy? (A Note on the Challenges)

WARNING: MAJOR SPOILERS AHEAD

Two years ago a film version of John le Carré's classic Tinker, Tailor, Soldier, Spy hit the big screen. The film was certainly not perfect from a purist's perspective. As was perhaps inevitable in any viable two-hour film, it did not quite do justice to the novel's sprawl and feel (the tangle of the back stories, the wooliness of the investigation, the shabbiness of imperialists living "after-the-empire"), and less inevitably, included some questionable additions (the dubious symbolism of Smiley swimming in a pond, bits of violence apparently intended to spice up a story lacking in action, but which seemed merely repellent and even propagandistic).1 Nonetheless, the script was an impressive feat of compression and rearrangement, rendering a book that often seems opaque not merely intelligible, but accessible, while retaining something of the original's complexity. The film was also bolstered by strong performances from the cast and skillfully edited (the closing montage justly drawing favorable comment). The result was a critical success, and on its modest terms, a commercial one as well, which has led to some talk of a sequel, focusing on Smiley's People.

Of course, Smiley's People is the third novel in the "Karla" trilogy, not the second, such a plan necessarily skipping over the series' second book, The Honourable Schoolboy--despite not only the fact that the events of Schoolboy bring about the situation we see at the start of Smiley's, but that it also seems to be the more highly praised of the latter two novels. Nonetheless, it is worth remembering that, difficult as Tinker, Tailor was to compress into a two-hour movie, Honourable (the longest of the three novels) is harder still, as it tells a larger, more complex story following two different but related and ultimately converging tracks--the maneuverings of Smiley and his people in London in the aftermath of the unmasking of Bill Haydon's treachery, and Jerry Westerby's field work in Southeast Asia on their behalf. This is compounded by the fact that what were, at the time of the novel's writing, recent events are now relatively obscure history, particularly the complex of interrelated Southeast Asian wars (Vietnam, Laos, Cambodia, Thailand) that forms a significant part of the backdrop. A great deal would have to be explained to the audience, which afterward might still have a hard time following along, easily thinking that Westerby is still in the same country, and confronted by the same conflict, even as he crosses from one country (and one war zone) to another.2

Moreover, Westerby's adventure is nothing short of an epic journey across the region at the time of the fall of Saigon--the kind of thing which would be hugely expensive to shoot faithfully. Where Tinker, Tailor mostly gave us small groups of people talking to each other in mundane-looking rooms, Honourable is packed with such spectacles as the high life at the Happy Valley Racecourse and the siege of Phnom Penh, while, uncharacteristically for a le Carré novel, there are a number of elaborate set pieces involving blazing machine guns, explosions and swooping aircraft.

I have not heard any estimates of what the budget would have to be to get it all on screen, but I would not be surprised to hear a figure upward of $100 million--in contrast with the $20 million spent on Tinker, Tailor. Of course, such sums are spent on movies all the time. (I counted at least twenty major releases with such budgets in 2012 alone.) Nonetheless, a blockbuster budget can only be raised when there is the prospect of a blockbuster gross, and one has to recall that Tinker, Tailor was a hit on a much smaller scale, earning $80 million not in the first weekend of its North American release, but over its entire global run. And the prospects of a film version of Honourable Schoolboy doing much better, let alone well enough to justify a $100 million-plus production budget, seem slim given not just the limited size of the built-in audience created by the original book and the previous film, but the source material itself, which in its structure and course is no more the stuff of blockbusters than Tinker, Tailor was.3 The fact that the Circus is here working against Chinese intelligence is also likely to be an inhibiting factor, given the leeriness of the film industry about doing anything which might seem offensive to those in command of what is now the world's second-largest movie market--fears that loom all the larger when one talks big budgets. Those particular fears are not at all allayed by the fact that a major part of the story is set in British-ruled Hong Kong, and that unlike in Tinker, Tailor (which switched its Hong Kong scenes to Istanbul), the location cannot be changed without doing considerable violence to the plot. (One might add, too, that the story is not especially flattering to the U.S. either. In fact, reading the book I had the impression that, after writing of the end of the British Empire in Tinker, Tailor, le Carré decided to take on what looked to many at the time like the end of the American empire.)

Of course, book-to-screen adaptations make compromises all the time, but the tension between art and commerce here would be considerable, and the results likely to displease fans without reaching that more general audience necessary to make the project profitable. By contrast, Smiley's People is a far easier movie to make, with a simpler, more compact story making far fewer demands on a production's resources. The Russian-set bits are easy enough to do on a sound stage, and the rest of the location shooting poses little challenge, while the large-scale spectacle and elaborate action of Schoolboy are totally left out. All of that makes this turn of events unsurprising, even if it is disappointing to fans of the trilogy who would have liked a big-screen version.

1. In particular there are two graphic killings which have the effect of identifying the Soviets with senseless, misogynistic violence.
2. I suspect that even Americans familiar with the era are scarcely aware of the war in Thailand, or that the United States continued to conduct an air war in Cambodia for several months after the withdrawal of U.S. troops from Vietnam.
3. Consider, for instance, the finale to this rather more ambiguous, personal tale: Westerby betrays the operation to save a woman with whom he has become obsessed, and is killed by his colleagues in British intelligence for it. After that the prize defector at the center of the game winds up in American hands, with the British cut out of the debriefing to follow; and Smiley himself gets pensioned, while his "people" are also squeezed out of their present jobs, reassigned when not retired. In short, Smiley does it again--despite which Smiley's people lose, hardly a crowd-pleasing finale.

Sunday, June 30, 2013

An Alternate World War II: H.G. Wells' The Shape of Things to Come

Reading Wells' The Shape of Things to Come from the standpoint of eight decades on, one is, of course, struck by how far the history of the twentieth and twenty-first centuries diverged from his anticipations. Certainly his expectations about the conflicts that led to World War II are no exception; rather than proving the last gasp of the nation-state, those conflicts led to the birth of far more states (as a result of decolonization), and to the dominance of international politics, and certainly the European politics on which he focused, by two states of unprecedented strength (the U.S. and the Soviet Union). Yet I was also struck by the number of parallels between his anticipations and what actually happened in the war--what he got right as well as wrong. Below is a list of the parallels that caught my eye.

* The war is ongoing by 1940.
* Despite widespread nationalistic hysteria, there is little enthusiasm among the civilian populations of the countries marching off to war.
* Germany's rearmament is an open process as well as a secret one.
* The war begins with fighting between Germany and Poland over Danzig.
* While war rages in northeastern Europe, Italy expands in the Balkans.
* Despite its commitments to its East European allies, France hesitates to fully enter the fray.
* While fighting Germany in the west, Poland finds itself attacked in the east by Soviet action.
* Germany and Austria are unified.
* There is a confluence between the initially separate aggressions of Germany and Italy into a common conflict between these powers on one side, and their enemies on the other, that extends "from the Pyrenees to Siberia."
* Hungary and Bulgaria fight on the same side as the German-Italian "Axis," entering the action through their participation in the invasion and occupation of Yugoslavia.
* The Axis entry into Yugoslavia is challenged by a significant guerrilla campaign.
* In East Asia, Japan invades "China proper," leading to open warfare between it and the United States.
* The war continues for several years in both Europe and Asia.
* Aerial warfare is a significant part of the conflict, with the belligerents relying on it heavily to achieve their aims. It is also highly indecisive--while still killing millions.
* The war contributes powerfully to the collapse of Europe's dominion over the rest of the world.
* The war sees the Soviet Union expand inside Europe, gaining territory at Poland's expense, and also establishing a new Soviet republic on territory taken from Romania, while more broadly consolidating a sphere of influence in the eastern, predominantly Slavic, part of the continent.

Especially given that the book was published in 1933, six thoroughly eventful years before the outbreak of open warfare among Europe's great powers, this is an impressive list of good calls, which comprise a substantial part of the war's outline as well as numerous particular details of its course--and testify to his real insight into the events of his time. Still, Wells got a great deal wrong, and it does not do to gloss over these, least of all those attributes of the conflict that may have been foreseeable. Two errors are particularly striking (certainly where the larger shape of the European war is concerned). The first is that he expected a smaller number of major powers to take part as belligerents than actually did so, particularly in Europe--his scenario excluding Britain and the United States from the fighting on the continent, while the Soviet Union and Germany also did not come to blows. The second is that he anticipated a "limited" effort on the part of those actors that did fight, waging a technologically sophisticated but economically constrained war.

Nonetheless, the more limited war he envisioned may not have been so unlikely as it now appears. While Wells characterized Hitler as "the voice of Germany losing control," and viewed the country as having been maddened by the situation of Danzig, he anticipated a weak Germany--weak enough, in fact, that in the conflict over Danzig, it is Poland which strikes the first large-scale blow (with an air raid on Berlin), and then holds its own in the subsequent war, even with a Soviet-backed rebellion in Polish Ukraine. At the same time, he also anticipated a much weaker United States, much less able to intervene in a European war.

This was, in part, a reflection of his expectation that the Depression represented the final collapse of capitalism. Wells was aware of the American New Deal, and regarded it as a step in the right direction, but predicted that it would prove unsuccessful--not because of any flaw in the economic theory underlying it, but because he thought that American political culture, and American inexperience in such matters, would prevent the Federal government from successfully administering such a program. However, the New Deal actually returned the U.S. economy's output to 1929 levels by 1936, and left it 20 percent larger in 1940. Germany, too, achieved a significant reduction of unemployment, and the recovery of its national output, through "Keynesian" methods.

And of course, each nation was to translate that recovery into a stronger military position, and build on it, with a "military Keynesianism" of a kind which Wells simply seems not to have anticipated (identifying military spending solely with economic strain and privation as he did). Germany followed this course first, from 1936 on, and (aided by its appearance of comparative dynamism) also attained a number of political successes that enabled it to strengthen itself further, like the Anglo-German Naval Agreement of 1935, and more dramatically, other nations' acquiescence in its union with Austria in 1938 and its absorption of Czechoslovakia in 1939, all of which substantially improved its strategic and military position when hostilities actually began. So did the German army's capacity for high-speed and flexible movement, a function in part of its exploitation of the possibilities of the tank in ways that the Allies had failed to match during the '30s.1 The result was not only to make a Polish attack on Germany implausible, but to make Germany's war against Poland short and successful. Indeed, it enabled Germany to turn west and win similar victories over the Low Countries and France.

Had Germany not posed that kind of threat, had it in fact been attacked by Poland, and been unable to achieve better than a draw in that fight, while France dithered about supporting its ally, one could indeed imagine Britain remaining aloof from the conflict, especially if it were distracted by colonial affairs--and the United States being that much less inclined to enter the fray, while the Soviet Union remained content with shoring up its position in eastern Europe. Nonetheless, Germany's growing power and record of expansionism in the interwar period assured British involvement in a German-Polish war on some level. Subsequently, Germany's successes early in the war meant increasing U.S. involvement through its support of Britain, and after December 1941, its own open, direct, full-scale participation. That participation, in turn, greatly enlarged the U.S.'s power to act through an extraordinary spurt of war-driven economic growth, American GDP expanding over 70 percent between 1940 and 1945.

This combination of an economically stronger and expanded Germany capable of threatening all of Europe, and an economically more powerful United States capable of a major transoceanic intervention, is at the root of the other, larger error Wells made about the character of the war, namely its totalistic nature.

Wells' expectation, in line with the fashionable military futurology of the time, was that "the next war" would not be a "total" clash of multimillion-man armies and navies, but a conflict of small, technically oriented establishments emphasizing "aero-chemical" power above their other arms. On that score he, and a great many others, were wrong (especially about the casual attitude of the belligerents toward the use of chemical weaponry). Still, Wells was more astute than many of his contemporaries in appreciating that strategic bombing would not win the quick, decisive, cheap victory enthusiasts of the technique promised--and not for lack of trying, with the U.S. and Britain in particular relying very heavily on this approach.2

Equally, he was right about the European powers trying to avoid a totalistic conflict--as they did in the earlier period of the war, haunted as their leaders were by memories of the domestic upheavals to which World War I led. Britain and France certainly refrained from total mobilization in the "phony war" phase, after which France was knocked out of the game, and Britain only made a more strenuous effort following the French collapse--an effort which exhausted its foreign exchange reserves by early 1941 (after which vast foreign, and especially American, economic aid was indispensable). Germany, too, was leery of total war, avoiding full-scale mobilization until 1943, after it was embroiled in a losing fight against an alliance that included a fully committed United States and Soviet Union--its run of conquests instead massively subsidizing its efforts.

Putting it another way, it was a stronger Germany's series of swift political and military triumphs over its initial continental opponents (which let Germany go on the offensive so successfully as to leave Britain fighting for its life, and clear the way for an invasion of the Soviet Union), and the economic recovery, and boom, of the United States (which enabled it not just to intervene, but to intervene massively), that made the European war so thoroughly and protractedly totalistic (the U.S. effort subsidizing Britain's, and, along with the strains of the Axis-Soviet war to which Operation Barbarossa led, contributing to Germany's reacting in kind). All of this resulted in a conflict that was shorter than Wells anticipated (ending in 1945 rather than 1949), but far more decisive (with the Allies securing the formal unconditional surrender of the Axis countries, then proceeding to occupy those states and give them entirely new political institutions rather than merely suspending hostilities), and by some measures also more horrific (perhaps 30 million dying on the Eastern Front alone).

By contrast, had the war remained a continental affair among relatively anemic military actors, history might have gone rather more like Wells' anticipation. Indeed, like many a well-reasoned vision of the future which did not actually come to pass, his scenario retains a great deal of interest as a counterfactual long after it has ceased to interest us as an anticipation of "things to come."

1. This failure is ironic given that Wells offered an influential presentation of the idea in the 1903 short story "The Land Ironclads."
2. Giulio Douhet, one of the foremost such thinkers, argued in his classic The Command of the Air that a squadron of twenty planes would suffice to "break up the whole social structure of the enemy in less than a week, no matter what his army and navy may do." (This quotation is taken from the Dino Ferrari translation published by Coward-McCann in 1942, where the remark appears on page 142.)

Reflections on William Gibson's Distrust That Particular Flavor

On first contact I found William Gibson's fiction deeply frustrating. One reason for this was stylistic. (In the pieces with which I first came into contact, like the short story "Johnny Mnemonic," clarity and flow were sacrificed to the principle of showing as flashily as possible--and this was certainly one writer who could have told a good deal more than he did.) However, the content of his stories was also an issue. His Sprawl stories and novels seemed to touch on innumerable major issues, from the Cold War to corporate power to the widening gap between the classes, but he never really said anything about these things.

Gibson was, in short, a postmodernist. Since then I have increasingly thought of him--not just today's Gibson, but the Gibson we've always had--as a postmodernist first and a science fiction writer second, an impression his remarks in interviews have reinforced.1 Reading Distrust That Particular Flavor, the first-ever collection of the small but noted body of nonfiction he has accumulated over the years, has only reinforced it further.

As those who have made its acquaintance before know, Gibson is not a scholar or a journalist. Rather what he does in his nonfiction is offer reflections on a handful of experiences--particularly through pieces of travelogue (most of it Japan-set, though his feather-ruffling account of his famous visit to Singapore, "Disneyland With the Death Penalty," is included), and accounts of his contacts with various bits of information technology and media (like "William Gibson's Filmless Festival," in which he shares a number of movies shot on video with his daughter).

At their best, these pieces can be read as a collection of glittering bits, offering razor-sharp insight in razor-sharp prose, and on occasion even engaging on a human level (as in his tribute to actor Takeshi Kitano, "The Baddest Dude on Earth"). At their worst they seem hopelessly self-indulgent, like "My Obsession," his piece about a period of addiction to shopping for mechanical watches on eBay (which Gibson himself admits went on way too long). Fortunately most pieces come closer to the former than the latter, and while in the end the book was not all I once hoped Gibson's first nonfiction book would be, I still found it a worthwhile read.

1. To put it bluntly, he always seemed to speak at length about trivia (like using eBay), rather than the Big Issues about which I always hoped he would say something.

Monday, June 24, 2013

Bad Teacher 2?

WARNING: MILD SPOILERS AHEAD

I rarely expect much from Hollywood comedies anymore, and 2011's Bad Teacher was no exception. Teachers, after all, are a shamefully easy, "safe" target, on the receiving end of several of American society's animuses--toward intellectuals, toward government and government workers, toward organized labor (and especially public sector unions), toward liberals (which many a conservative detractor incorrectly assumes all teachers to be). And because of their vulnerability (as a not merely maligned, but ill-paid and relatively unprestigious profession), they are prone to be blamed for every failing of American education, real or merely perceived.

Still, there is no denying that the education system, like any other major institution, is not an unfit subject for satire. Our debates over curriculum and educational methods, from the math wars to standardized testing; the tension between the "convenient social virtue" society is prone to demand of teachers, and their needs and rights as human beings and professionals; the sociology of the classroom and the educational bureaucracy and the broader conditions under which educators work; the inequality between the schooling given the poorest and the most privileged--all this can plausibly make for a movie about something.1

Bad Teacher does not go this route, however.2 The failings of the titular figure, Elizabeth Halsey, are purely individual and anomalous. She is simply a lazy, conniving "bad girl" who somehow got hired as a teacher, and somehow never got weeded out from among the flaky, naive do-gooders who are her colleagues--not that she has any desire to continue in this career. Her real aspiration is to marry money so that she can give up teaching and live in comfortable leisure. This leads to a pair of overlapping love triangles--one involving Elizabeth, substitute teacher Scott Delacorte and Elizabeth's colleague Amy Squirrel, the other Elizabeth, Scott and gym teacher Russell Gettis--which have Elizabeth scheming to keep Scott and Amy apart, and to buy herself the breast augmentation surgery which she thinks will let her win over Scott (whose principal attraction is his coming from a wealthy family).

Of course, love triangles, ill-conceived money-making schemes and general bad behavior are standard comedic material, but even as "turn your brain off" entertainment I found the results underwhelming. The characters were one-note and often annoying (particularly Delacorte and Squirrel), the jokes mostly flat (with Halsey's un-p.c. remarks merely seeming incongruous rather than funny, sounding less like the un-p.c. remarks of her own generation than like those of a cranky old grandfather). And to top it all off, the film, wholly dependent on Elizabeth's consistent greed and mean-spiritedness for its few laughs, has her undergo a third-act change of heart so unconvincing that it feels like a parody of the kind of commercial hokum that director Jake Kasdan sent up in his earlier The TV Set (2006).

Nonetheless, the film was a hit as these things go, pulling in over $100 million domestically and another $115 million overseas (no Hangover, but still decent given the $20 million budget), and finding enough enthusiasm in Hollywood for a continuation that CBS has a sitcom based on the material premiering in the fall, and Sony has now announced a sequel.

I wonder about these decisions. It is far from clear to me that the concept has a hundred episodes in it, especially with the sexuality and language unavoidably toned down for network television.3 Equally the sequel seems a risky proposition, because it is bound to cost more--but not necessarily bound to sell more tickets. The margin between budget and gross is not all that high here, and I am not at all sure that this is an idea which left viewers begging for more (and indeed, the audience's affection does not seem to have been overwhelming). Also unhelpful is the conclusion of the first film, which had a (somewhat) reformed Halsey give up teaching to be the school's guidance counselor, so that she is no longer so bad as before, nor a teacher, complicating any continuation. Certainly it is difficult to picture the sequel doing well at the box office while the sitcom of the same name is airing--a potential competitor rather than complement, and perhaps a drag if it is not well-received.

If Hollywood really does mean to turn this one into a large, profitable franchise, those least powerful of Tinseltown's creative people, the writers, will have a lot of difficult work ahead of them.

1. As John Kenneth Galbraith put it in his classic Economics and the Public Purpose, the "convenient social virtue ascribes merit to any pattern of behavior, however uncomfortable or unnatural for the individual involved, that serves the comfort or well-being of, or is otherwise advantageous for, the more powerful members of the community. The moral commendation of the community for convenient and therefore, virtuous behavior then serves as a substitute for pecuniary compensation." This is "widely important for inducing people to perform unpleasant services." Nurses, for example, are expected to accept such commendation "as a partial substitute for compensation," while "such merit was never deemed a wholly satisfactory substitute for remuneration in the case of physicians."
2. In short, Bad Teacher proves to be pure Bucket Brigade, saying as much about education as Anchorman says about journalism and broadcasting, The 40 Year Old Virgin says about celibacy, or Step Brothers says about why people continue living with their parents well into adulthood--which is to say that not only does it say nothing, but it does not even appear to have been interested in saying anything. Instead the film is a showcase for the writers' and actors' trademark lines and gags (like the bizarre exclamations labeled "Ron Burgundyisms" on Urban Dictionary).
3. Premium cable, of course, would be another matter.

H.G. Wells' The Shape of Things to Come: Eighty Years On

For much of his career H.G. Wells was a founding proponent of what Patrick Parrinder termed "the scientific world-view"--a sense that the nineteenth century liberal order (a balance of power among nation-states preserving the peace, a capitalist world economy) was dead, and that the way forward for the species lay in rationalism, socialism and international organization. Advanced in both his nonfiction and his fiction, across a range of genres, this vision received its fullest treatment in 1933's The Shape of Things to Come. Wells presents Shape as the "dream-book" of Philip Raven, a British civil servant who recorded dream visions he had of a twenty-second century history textbook telling of the founding of a future "World State."

Its story is, in a way, the story of all human history, the book looking back to antiquity for the origin of the idea of the World-State, and finding it in the universalism of the religions which emerged in the first millennium BCE, and in the dissent of prophets taken for insane or inspired, and always inconvenient (with Plato's Republic a crucial moment in its depiction of an elite committed to service rather than self-aggrandizement in implementing a new social order). These forces continued to develop, until by the seventeenth century an increasingly critical turn of mind was evident, and with it the beginnings of modern scientific rationality and of criticism of existing institutions--monarchy, aristocracy, religion, even property.

Alongside these important cultural and intellectual changes, advancing technology and growing organization made what was previously an ideal into a practical possibility and necessity by the early twentieth century. By this point, Wells' textbook argued, capitalism was translating increases in productivity into unemployment, weakened consumption and slumps--a self-defeating pattern which appeared to have culminated in the Great Depression--while avoidable poverty continued to stunt human life. Meanwhile advancing military technology and the increasingly totalistic character of warfare (demonstrated in World War I) made armed conflict more costly and dangerous for human civilization, with the risk of such conflict exacerbated by nationalistic and militaristic ideas, by what we today call military-industrial complexes, by the worst of tradition and habit (religious beliefs, sexual attitudes, etc.), and even by the physically and psychologically unhealthful conditions of everyday life in the era (all the way down to bad hygiene and noise pollution).

Wells also pointed to widening recognition of the problems, and of the ways in which they could be redressed, in meaningful but flawed developments like Henry Ford's Peace Ship, the League of Nations, the London Economic Conference, Franklin Delano Roosevelt's New Deal--and the emergence of explicit alternatives to liberalism on the left in Soviet Communism, and on the right in the fascism of Italy, Germany and elsewhere.1 Nonetheless, the limitations of these initiatives and movements mean that the world fails to recover from the economic crisis. Instead the world's economy falls into ever greater disarray, resulting in declining living standards, deteriorating public services, and rising crime and insecurity of all kinds, extending even to a resurgence of maritime piracy. Moreover, the worst in these new movements comes to the fore when Nazi Germany's dispute with Poland over Danzig escalates into shooting, sparking a general European war that, along with the ongoing conflict in east Asia, saps what remains of the world's economic vitality, and leaves the world wide open to a plague that kills off half the planet's population in the 1950s.2

This collapse of the old order creates a space in which a new one can emerge, the groundwork for which is laid by a practical-minded technical-administrative elite centered on the control of the world's long-distance communications. This "Air and Sea Control," the members of which recognize that the more traditional objects of such careers have been mooted by the new realities, works instead for the revival of the world economy and the building of a World State, a process formalized in a 1965 conference at Basra. This dictatorship, up against the remnants of the old order of capitalist enterprise and sovereign states (actually drawing strength from the revival of the world economy), manages to pragmatically, patiently and consistently outmaneuver its foes, only rarely having to employ any significant force to get its way. (Indeed, it manages to circumscribe privately owned business and let it fade away rather than outlawing it.)

Of course, after this point there are continued difficulties as the World-State goes about the immense practical work of achieving material plenty for all, eradicating disease, and educating its citizens for life in the new world, while still facing attempts at subversion from those opposed to it, resulting in thousands of political murders, and executions into the 2030s (during which the conduct of the security organs was not always unimpeachable). Later the World-State itself became a problem, under the administration of a stifling (if benevolent) "Puritan Tyranny" that was so "consumed by an overwhelming fear of leisure both for [itself] and others" in a world where so many of the old constraints on life had been eliminated that it bowdlerized literature and the arts, and "INVENTED work for the Fellowship and all the world."

Nonetheless, this too gives way as the World-State withers, the Central Council increasingly disregarded by the subordinate Controls overseeing various aspects of human life (education, health, etc.)--an administration of things rather than people. The process is formalized when the Council is officially retired in the 2059 Declaration of Mégève, which declares "the Martyrdom of Man . . . at an end" and "the sunrise of the hour" of united humanity in a world without superstition, tribalism and want, freer, happier--and saner--than the people of previous centuries had even dreamed it could be.

As might be expected from Wells' earlier success as a popular historian, his historiography is impressive--thorough and lucid and accessible. As might also be expected given his brand of science fiction, his interweaving of the events of the past, and his present, with his record of things to come, feels seamless even today, enough so that even someone well-read in the history of the time can have a hard time figuring out just when he stopped writing as a historian and started writing as a scenarist. A grand testament to both his vast imagination and underappreciated technical skill, it endows what he describes with an astonishing verisimilitude.

Nonetheless, we know that history did not take the turn he anticipated. Instead the Second World War proved to be a far more intense conflict than he expected, and it led not to the dissolution of the nation-state system, but to its reorganization at a higher level--the old circle of Great Powers mostly folded into the alliance systems of the U.S. and the Soviet Union. Moreover, the Keynesian policies he underestimated helped lay the foundation for the American economic recovery that proved crucial to the Allied victory in World War II, and for the unprecedented global boom that followed the conflict.

Of course, that order, too, did not last, the world in the 1970s turning to neoliberalism. Since then the world economy has been increasingly crisis-ridden and anemic, prone to widening inequality, and ecologically strained in ways he did not imagine, with the problem raised by nuclear weapons in the 1940s (sooner than he feared) still unresolved. It is a perilous situation, leaving many once again convinced that this state of things cannot last, but there is little enthusiasm for the sorts of solutions he proposed, and the idea of a future remotely like the one he described seems as implausible now as it has ever been.

Even the attempt to imagine something like it reflects this sensibility, as W. Warren Wagar's A Short History of the Future demonstrates. Modeled on Wells' scenario, but updated for our times, it envisages the world's economic and political crises leading to strategic nuclear warfare on a far more crowded planet; a much more painful rebuilding process, up against far more violent resistance; and much more problematic consequences, full of survivals of the older irrationalism, with anything like a new humanistic consensus beyond reach--postmodernism apparently here to stay.

Some of this is an understandable reaction to what now appears to be Wells' overoptimism about some things, like his confidence in Science as well as the sciences, and the idea that these have to be on the same side. (I think of George Orwell's response to this attitude with the charge that, in comparison with Britain, the Nazis were both more barbaric and more scientific.) Certainly his confidence in technocrats, and in the "plasticity" of human consciousness and behavior, far exceeds anything one is likely to encounter today. Indeed, it is all such that Wagar himself confesses that his version of events has a look of "Augustinian pessimism" about it, and that is the difference between our troubled times and the period in which Wells wrote his classic book.

1. The Soviet Union, in his view, was a promising beginning distorted when the "infinitely practical" Lenin died and was succeeded by the dogmatic Stalin, and with him, the primacy of politics over technical rationality, and nationalism over internationalism, so that it failed to rise to the role it should have played.
2. Interestingly Wells made no direct reference to the influenza epidemic which followed World War I and took a far heavier toll of human lives than the actual fighting.

Wednesday, June 19, 2013

These Cyberpunk Times: The Travails of Detroit

Watching the city of Detroit subjected to a wave of privatization by an unelected emergency manager, I cannot but think of Robocop 2--not as good a film as the original in most respects, perhaps, but certainly the more audacious social satire. And in its scenario, far more prophetic than anyone expected at the time, for all the goofiness of its presentation.

As it turns out, a good many others have had thoughts along similar lines, Aaron Foley recently writing of a proposal to erect a statue to the iconic character in that city.

Learning the Novelist's Craft

When first getting acquainted with the output of the "how-to-be-a-writer" industry, I read on a number of occasions that it commonly takes something like a half dozen manuscripts and ten years to get a book published--or at least, to write a "publishable" book.1 Such data as I have seen since then has convinced me that they were not all that far off the mark.

Of course, the reason why this is the case is that it takes such time to hone the relevant skills to the necessary degree. After all, novel-writing not only demands the skills associated with fiction writing in general (like the ability to tell a story, describe a scene or develop a character on the page), or writing in a specific genre (like the ability to create suspense that a thriller writer needs), but their exercise on the larger scale required for a book-length story, which means a writer's managing a number of interconnected subplots, coordinating a larger cast of characters, and overall presenting the broader field of view and deeper detailing that go with the form. The differences are not just quantitative, but qualitative, and entirely new and different skills are involved.

Still, those would-be purveyors of advice never said why it should take so long to pick these up. Rather, as I slowly learned, the answer lay in what they didn't say. Search as I might in the how-to books for some tidy explanation of what exactly one has to do to produce a hundred-thousand-word narrative, I never found it--because writing is more art than science, with little lending itself to formulas, and even the apparent formulas of little use in themselves.

Compare, for instance, the position of an English student learning how to construct a functional sentence, and another English student learning how to write a thesis statement. In the former case, the issue is reducible to the expression of a complete thought with a main clause containing a subject and predicate, accompanied by the appropriate punctuation and connections (e.g. capitalize the first letter of the first word, use commas to connect your clauses, finish with an end mark)--something one can execute and judge with almost mathematical precision.

Alas, no explanation of how to write a thesis statement is as compact and tidy as that. In the best of cases the student is given not a clear-cut prescription, but a theory of what constitutes a thesis, likely to be a much more complex thing (e.g. "A thesis is a claim about a topic with room for debate, based on a rational analysis of the facts"). One can, of course, judge whether or not someone has written a thesis in a reasonably objective way, but actually assimilating what it takes to produce one is a much more demanding thing, reliant on a good deal of intuition, personal observation, and learning by doing.

This is all the more the case with the writing of fiction. A writer typically learns to write novels by groping their way toward an understanding of the form, which comes less from the digestion of pithy advice than from reading a good many such works, and then by actually writing novels. The sheer length of a novel means that this is a slow process, a single first draft easily the work of months or years, without counting the irregular and nonlinear pre-writing process, or the editing and revision that inevitably caps a serious effort. And of course, just as in the mastery of any other skill, several repetitions of the task are likely to be necessary before the result becomes tolerable.

Moreover, the process tends to be extended even beyond the amount of time it takes to write the requisite hundreds of thousands of words for several reasons.

One is that the aspiring writer is unlikely to have their training process financially supported. This means that they are likely working at something else for income, which reduces the number of hours--and especially fresh, energetic, clear-headed hours--they can allot to their writing. (I know of one novelist--with a day job--who writes two hundred words a day. He will necessarily be less prolific than someone who, able to commit themselves to this full-time, can put in two thousand; at the slower pace a hundred-thousand-word draft represents some five hundred writing days, at the faster one about fifty.)

Another reason is that one does not necessarily jump from one novel to the next. There are inevitable gaps, just as in any other course of work--because life gets in the way, and because, contrary to the celebration of stupid persistence so congenial to the self-help culture, a writer does sometimes need to go and do something else. People can get burned out writing, just as they can get burned out doing anything else, and the problem is exacerbated by the setbacks and disappointments with which the process is fraught--especially when one is not just writing, but pursuing publication. Submitting one's own work is not just time-consuming (and remember, any time in which one is submitting queries is time in which one is not actually writing), but typically a dispiriting activity that will frequently have the aspirant licking their wounds--again and again and again, for many years.

It doesn't help that the learning process tends to be lonely. Yes, we all hear tales of people gifted with wonderfully supportive friends and family, but the reality is that people who are not writers generally do not understand what writers go through, and rarely have much sympathy for their tribulations (as the writer tends to find when they need to vent). Authors do command a measure of respect if they get rich and famous (despite which even the rich and famous have their insecurities and frustrations), and loved ones will likely put up with one who is at least paying their bills with their writing--but an author who is as yet unpublished tends to be treated with suspicion and contempt, as a malingering pseudointellectual who ought to be doing something more "useful" with their time. This doesn't help one's efficiency.

Unpublished authors also suffer in another way from that isolated position. Unlike an established writer with access to agent and editor and a slew of friends in the business, they do not get the benefit of personalized, professional advice. They may not even have anyone who can be a willing and able sounding board for their ideas, let alone offer meaningful feedback on their manuscripts. This leaves them having to figure out much more for themselves, doing much more of their own editing (another dispiriting activity), and makes them that much more likely to squander a lot of time and effort on artistically or commercially dubious ideas.

And so a decade goes by, just like that.

Of course, some writers do not need a half dozen manuscripts or a decade to reach the point at which they write "publishable" novels. Why the difference? Exceptional talent is always one possibility, though I am doubtful that talent spares anyone who actually pursues this course from having to suffer through a great deal of drudgery and frustration. A more likely possibility is that the author came to their first manuscript, or their third, with their skills relatively well-honed by other activity. They have already read more and written more than their counterparts by that point, and so are better-equipped to make the attempt. (Certainly a writer who pens their first novel-length manuscript at fifteen is likely to produce something quite different from what they would have written at twenty-five or thirty-five, especially if they have been doing other kinds of writing in the meantime.)

Some writers also get lucky, sidestepping a good many pitfalls without even knowing it, or learning more from their mistakes, or bouncing back more quickly from their failures, or having good fortune in their advisers, their personal support system, their access to privacy and quiet, or even the quantity and quality of leisure time they are able to devote to the effort. Some also get lucky in having a subject that makes the work easier, one that brings them closer than others to the ideal of "the story that writes itself." And some are fortunate in not having to go all this way on their own, but being able to get an agent or a publisher to take on a relatively raw manuscript, which they get to fix up under the close guidance of the pros, who are there for them again on the next attempt.2

Unfortunately, good luck is not a career plan.

1. Of course, publishable books often do go unpublished--while books that some might deem "unpublishable" do make it into the mill, and even reach the bestseller lists. The concern in this post is solely with the problem of writing a passable manuscript, getting published being an altogether different matter.
2. One other possibility is that the difference is not in the amount of work put in, but in the ways in which authors number their manuscripts. One writer may count the fifth massive rewrite of his first manuscript as still that first manuscript, while another might regard it as his sixth effort.

Monday, June 17, 2013

The Politics of Continuum, Part II

While the television show Continuum has drawn attention for its depiction of North America as a corporatocratic police state in the late twenty-first century, its attitude toward that milieu is what seems to have drawn most of the comment. A primary reason for this is that its protagonist (Rachel Nichols' Kiera Cameron) is a policewoman who came back in time to 2012 in pursuit of Liber8, a rebel group fighting the corporate order (led by Tony Amendola's Edouard Kagame), who are depicted as criminals and terrorists. One might consequently expect that Cameron's character is being presented as the hero, Kagame's group the villains--a role-reversal atypical for a genre which ordinarily has the good guys fighting dictatorships and police states rather than defending them.

The resulting ambiguity has been compounded by what the show's creators and cast have had to say about its politics. Their remarks suggest a relativistic neutrality in the conflict between Kiera Cameron and Liber8, with the two sides' attitudes treated as equally valid. This is no doubt meant to seem intriguing and daring, but in practice such "equal time" treatment of conflicting understandings and experiences has often distorted rather than clarified (as journalistic coverage of issues like climate change constantly reminds us).

Indeed, there are grounds for taking issue with the intent itself. Such a stance, after all, can be taken as a very convenient way of evading the writer's responsibility to seek the truth (again, as the lamentable state of contemporary journalism regularly reminds us). One can even contend that the willingness to give "equal time" to a dictatorial, human rights-abusing corporatocracy and its opponents is problematic from an ethical perspective, while also reflecting biases so deep-rooted as to scarcely be noticed. (Consider, for instance, how the show's relativistic stance would have been received were the secret police and the dissidents butting heads in a Communist society rather than a corporatocratic one.1)

What the show itself says about the world of 2077 at the outset is also worth noting, and possibly quite telling. Where the origins of the corporatocracy are concerned we are told that business took over an insolvent government in the course of bailing it out financially, a narrative which evokes the idea of efficient, market-disciplined private firms on the one side and government made profligate by its pandering to the voting rabble on the other.2 This premise at the least suggests that the show's economic thinking is not merely of the right (whose sympathy for business and wealth does not necessarily equal approval of corporatocracy), but of those explicitly elitist and overtly anti-democratic quarters of the right where one might most expect to find sympathy for a fully privatized society.

Reinforcing this view is the fact that this understanding of economic life is never problematized, let alone challenged, at any point during the show's first season.3 Certainly I cannot think of any aspect of the show's future history which similarly reflects any critique of capitalism, corporate power or the tensions between these and political democracy (the Liber8 rebels instead having little to offer but resentment over inequity).4 This makes it at least plausible to argue that the show falls short of the "equal time" standard proclaimed by its creators, the writers taking the case for a Corporate Congress rather more seriously than they do any alternative for which the rebels might be fighting.

1. Simon Barry remarked in an interview that "I think that if you were to ask Rachel’s character if she lived in an oppressive society, she would say, 'No.' And I think that’s kind of the point. Our Liber8 freedom fighters/terrorists, if you will, have a different opinion."
2. It is also quite far from the reality of recent events. What we saw in 2008 was the opposite--profligate businesses run by corrupt executives bailed out by governments, governments which arguably run in the red because of their pandering to business and the wealthy, exempting them from their fair share of taxes, showering these groups with giveaways of other sorts, and of course, enduring the revenue shortfalls that go along with the mediocre economic growth that has tended to follow in the wake of "pro-business" policy.
3. Indeed, it is worth noting that the scenario dramatizes at least one fear strongly associated with the populist right, namely the absorption of the United States into a North American Union.
4. Such a reading also seems to me reinforced by the presentation of farm boy and technical genius Alec Sadler as an Edisonade-hero-in-the-making (rather than a youth born to privilege, whose ascent to the uppermost strata of the North American Union is a function of his having picked his parents well), implying the meritocracy that is the justification generally given today for such extreme differences in wealth and status.

Thursday, June 13, 2013

Watching Doctor Who at Fifty

I will start off by saying that this post is not a retrospective about the series. I simply lack the context to provide that, having as I do virtually no familiarity with Doctor Who during its first twenty-six years. Instead it is a consideration of the show in its current form, which is frankly all I really know firsthand.

My first encounter with Doctor Who, after all, was in the American-made TV movie that aired on FOX in 1996. I remember little of it except what seemed to be the titular character's wandering a hospital without his memory for most of its running time. Needless to say, it left little impression on me, and, I'd thought, on anyone else, and I was later surprised to find that it was regarded as canon, with Paul McGann officially counted as the Eighth Doctor. Recently reading a summary of that movie, it struck me that this may have been because I came to the film totally unfamiliar with the character's history (I'd never so much as heard of The Master), and that I might have had a more favorable impression had I come to it as a longtime fan--but in any event, I did not encounter the franchise again until the Syfy Channel aired series one of the revival in early 2006.1

Once again, I didn't know what to make of the franchise.2 For one thing it seemed rather light on concept, lacking the density of world-building or intellectual play I found in shows like Babylon 5 or Lexx. Much of what the show did present seemed to me very old-fashioned (the Daleks something out of a '50s B movie), while other ideas appeared just plain silly (like the Autons the Doctor confronted in the first episode of the revival, "Rose"). For another, the Doctor's four-dimensional tourism seemed aimless and casual next to the more purposeful sagas I was used to--and surprisingly Earthbound, the hero of this space opera doing much of anything off-world in just four of season one's thirteen episodes. Instead he spent a lot of time in contemporary Britain, and when traveling to other times, tended to stick with British drama's most conventional choices (e.g. the Victorian era, World War II), which seemed rather limiting.3 Even on a visual level the production left me underwhelmed, neither lavish enough nor exotic enough to stand out from the then rather thick crowd of TV space operas.4

Still, I was a fan by the end of the first season. I suppose it just took a while for me to catch its wavelength: its penchant for Douglas Adams-like zaniness and whimsy, its faithfulness to its inheritance, its Britain-centricness.5 (I'd watched a lot of British television before, but little of it genre television, and while I suppose I'm rather more up on British history than most on this side of the Atlantic, once in a while there was a reference to something I didn't fully get, or wasn't familiar with.6) It took a while, too, to get to know the protagonist (who would be intolerably Mary Sue-like if he were not a nine-hundred-year-old alien of exceptional charm and generosity), the dynamic between Doctor and Companion, and the rhythms of the story arcs. The stories got bigger, the visuals better. And while I still regard it as intellectually and dramatically lighter stuff than many past favorites, packed with implausibilities and dissonances best not examined too closely, every so often it offers a clever idea (as in "Blink") or a genuinely moving moment (as in "Vincent and the Doctor"), while presenting its humanism with rather more conviction than the later Trek series managed, and maintaining a cheerfulness that is downright refreshing in this age of dark-and-gritty-everything-with-a-side-of-still-more-dark-and-gritty. It has all made for an appealing enough combination that I find myself looking forward to season eight in a way I would not look forward to the announcement of an anemic new small-screen Trek, or yet another clone of the Galactica reboot.

1. As indicated by the lengthy article devoted to it on Wikipedia, and its 5.9 score over at the Internet Movie Database, others responded rather more favorably.
2. It is worth remembering that I had long since read a large portion of the genre's print classics (a fair amount of newer stuff included), and taken in such shows as the famously idiosyncratic Lexx, so that this was not simply a case of a viewer being confused about what to make of a show because it "was not Star Trek."
3. Of course, series four of Lexx was set on contemporary Earth - but that followed three seasons of outer-space adventure.
4. For nearly two decades, from the late 1980s to the late 2000s, there was always at least one first-run North American space opera in production, and around the turn of the century, often a half dozen. However, no such show has been on the air since the cancellation of Stargate: Universe.
5. Interestingly, Adams was to be a direct influence on the show's course in a number of ways, not least through his writing of several episodes.
6. Case in point: in the second series episode "The Idiot's Lantern," the plot hinges on the coronation of Queen Elizabeth II being a milestone in the history of television, something of which I had not previously been aware. Prior to that, in "The Doctor Dances," when the Doctor followed up "beat the Germans, save the world" with "don't forget the welfare state!" I had been less than certain of the connection implied between the former and the latter, not having yet read Angus Calder's outstanding history of the era, The People's War.

Sunday, June 9, 2013

Of Right and Left Transhumanism

The way we think about the possibility of transcending "the human condition" strongly reflects the way in which we think about humanity and society in general. Transhumanist thought has been no exception: it is evident across the political spectrum, while everything but its fundamental premise varies in line with the thinker's other political attitudes. In fact, it seems possible to speak of a transhumanism of the political right, and a transhumanism of the political left, the key difference between which is the context in which we would see technology utilized to effect fundamental changes in human beings. Rightist transhumanism seeks to achieve that transformation within the existing social order (e.g. capitalism), and, where the alleviation of problems like poverty or pollution is concerned, anticipates that the new technologies associated with it will obviate the need for social and political change. Leftist transhumanism regards the transformation as part of a broader project of human liberation, likely to follow the achievement of a more equitable society.

Ray Kurzweil's Singularitarianism is a clear case of the former, positing as it does our experiencing twenty thousand years' worth of change within the space of a century--and still finding ourselves living with a capitalist economy, while the solution of our environmental problems depends on the improvement of technology in response to market imperatives, rather than social and political innovation (like changes in values or the emergence of new institutions). By contrast, Olaf Stapledon's Last and First Men and Star Maker, or W. Warren Wagar's A Short History of the Future, present the leftist version, in which the modification of the species follows the achievement of a "good society" which has moved past the inequities of our own.

Just as elsewhere in our intellectual life these past several decades, the ideas of the right have prevailed in this discussion, reducing the left to offering little but criticism of the visions of Kurzweil and company. Nonetheless, Ken MacLeod's recent piece in Aeon Magazine, which contends that socialism, or something like it, is all the more necessary in an age in which our already contentious identity politics are made still more contentious by the addition of substantial biological diversity, hints at the possibility of movement in this direction.

Saturday, June 8, 2013

Sex on Screen: "The Joylessness of Sex" on Television

I previously noted the decline of sex as a theme of blockbusters at the American box office, and the disappearance of gratuitous nudity as an element in major hits.

Of course, this has partly to do with the unprecedented abundance of sex on television. Still, as Gina Bellafante notes over at T Magazine,
what's striking about the current depiction is how much of it just isn't sexy — how much of it is divorced from any real sense of eroticism or desire. The audience, at home in bed in need of diversion, is betrayed. What they get instead is sex that is transactional, utilitarian — the end product of a kind of twisted careerism.
This is less surprising than it may seem. Even as some applaud a new age of openness, and others decry the sexualization of culture, society remains at bottom deeply sex-negative. In those quarters where religious-traditionalist hostility to sexual imagery has weakened (or at least, sunk beneath the surface of everyday life), postmodernism and identity politics have filled its niche. Terms like "objectification," when tortured far past any useful meaning in the way that has become routine, can virtually outlaw such concepts as physical attraction, sexual fantasy and even sexuality itself.

Whether intentionally or unintentionally, proponents of such ideas have had their effect, one which has naturally been manifested in our cultural production. Sex as a set-up for degradation or self-destruction, sex as a temptation to be resisted or a display of weakness, even sex as banality and disappointment, whether treated comically or tragically or tragicomically, is broadly accepted. Sex as an occasion for embarrassment or frustration or gross-out gags (as in "raunchy" comedies), and sex scenes which make the viewer's skin crawl (Ms. Bellafante describes several), are what we can expect. The idea of sex and sexuality as a source of pleasure (for the characters, or the viewer) is regarded with much more apprehension, and within the mainstream, is rather more elusive, even as depictions of sex have become more frequent and graphic and accessible.

Once again I remember Roger Vadim's remark that "Hollywood is sometimes licentious, but always puritanical."1 And that hasn't changed. If anything, that characterization seems truer of the place than ever.

1. Also consistent with this sensibility: sex as a prelude to the monster or killer striking like some cosmic punisher (as in horror movies), or sex as indiscretion or transgression for which someone will soon pay dearly (the basis of the erotic thriller and much other drama).

Thursday, May 30, 2013

The Politics of Continuum, Part I

In the science fiction series Continuum, North America in 2077 is under the direct rule of business corporations, which operate a police state to protect their domination of society.

In discussing the show's premise, many casually label its "North American Union" fascist. It is indisputably anti-democratic and right-wing, but fascism is a specific form of such rule. Defining fascism in such a way that it encompasses anything wider than the Fascist party of Mussolini has always been problematic, but some analysts have made useful attempts to at least identify characteristics distinguishing it from other political forms.

To take one example, Chip Berlet and Matthew Nemiroff Lyons characterize it as a political ideology which "glorifies national, racial or cultural unity and collective rebirth while seeking to purge imagined enemies, and attacks both revolutionary socialism and liberal pluralism in favor of militarized, totalitarian mass politics."1 Such characteristics seem pointedly absent from the world of Continuum. The political culture of the North American Union appears cosmopolitan, and neither celebrates a golden past nor promises a golden future. Nor does it seem to display much concern with the mobilization of masses, or militarism as such. Additionally, fascism tends to be at least formally critical of capitalism, and to offer a vision of class reconciliation through a corporatist economics in which business, labor and a strong state ostensibly cooperate at an institutional level to achieve national economic goals. By contrast, the dominance of corporate power is naked and matter-of-fact in the show's milieu.

This makes the label "fascist" a misnomer. The NAU is, rather, a "corporatocracy," a polity directly ruled by corporations (not unlike India under the East India Company, prior to the Raj). This regime may serve ends similar to those of fascism (typically viewed from the left as capitalism's defensive reaction against socialism), but as what is absent from the NAU's order shows, there are significant differences in the rhetorical and practical means the two use to achieve those ends.

1. This description, which appears in their book Right-Wing Populism in America: Too Close for Comfort (New York: Guilford Press, 2000), seems consistent with compelling but less clinical analyses offered by other observers. Wilhelm Reich's description of fascism as a "mixture of rebellious emotions and reactionary social ideas," and Walter Benjamin's characterization of fascism as a politics which organizes "the masses" around their self-expression rather than the self-interest it seeks to deny them, and as "the introduction of aesthetics into political life," both fit quite well with Berlet and Lyons' usage of the term.

Saturday, May 25, 2013

Game of Thrones: Toward the End of the Game

Over at Tor.com Chris Lough considers the possibility that HBO's Game of Thrones may come to an end before the series of novels on which it is based gets its concluding volume into print. Might, then, the TV series (projected to run about seven seasons, and thus wind things up circa 2017) offer the ending of the saga before the books do?

It certainly does seem plausible that we will not get both of the last two volumes of George R.R. Martin's series within the next four years, given how slowly volumes four and five came out (A Feast for Crows and A Dance with Dragons taking some eleven years to appear after A Storm of Swords). Still, as Lough points out, the series' creators have many alternatives to actually presenting that conclusion, including giving only some of the ending; leaving the real conclusion to a film to come out later; or putting off the conclusion for a bit longer by going on hiatus for a season.

All of these strategies have been used before (though given the norms of American TV, that first option, giving only some of the ending, seems most likely). Yet the challenge for the series' writers begins well before the conclusion. A Storm of Swords offered material sufficient for two seasons--but the fourth and fifth books were rather less satisfactory for many readers, because so much of what they contain seems to be of marginal importance to the whole.

Arya and Sansa do continue their journeys--but these are much further removed from the core of the conflict in Westeros than in the first three volumes. The same goes for Tyrion's adventure after his escape from Westeros, while this seems even more the case with Daenerys' time attempting to establish a new order in Meereen in the fifth book. The attention these books devote to established characters now being used as viewpoint figures (Brienne, Samwell), and to new characters in places which had received little direct attention prior to these volumes (the events in the Iron Islands and Dorne), only deepens the impression of this part of the story as looser and lacking in significant events.

Only Cersei's misrule in King's Landing, Stannis' struggle against her and Jon Snow's tenure as Lord Commander of the Night's Watch remain at the heart of the drama, and even these portions of the books lacked the tightness of their earlier treatments. The fact that Feast and Dance mostly depict events that happened simultaneously (with Feast dealing with only some plot threads, and Dance picking up others where Storm left off) adds yet another complication.

Next to what came before, this part of the story can seem diffuse and anemic, and it ought not to be assumed that viewers of the show will be more forgiving than readers of the books. This confronts the writers with a significant challenge if they mean to hold the audience's interest for another three seasons, and makes some alterations seem all but inevitable. One is that they will synchronize the events of Feast and Dance (so that, for instance, we will see Cersei's and Tyrion's plots unfolding in the same episodes). Another is that they will compress these events, perhaps turning them into a single season by dropping anything not absolutely essential to the story's trajectory. (I certainly expect that we will see much less of the Iron Islands and Dorne.) I also think we are likely to see rather more revision of the retained material than we have seen in the series to date (throwing in as many surprises as the story can stand, to spice things up).

Of course, even after all that, I doubt the results will match the vigor and pace of the first four seasons, but they might be sufficiently strong to hold on to the viewers' loyalty until the revelations of the (hopefully) more eventful The Winds of Winter.
