Berkeley, CA: The University of California Press, 1957. 319 pp.
It is a commonplace that the novel as a literary genre is distinguished by its telling of the lives of everyday individuals in everyday circumstances, in a manner distinct in two ways--its plainness of style, and its attentiveness to the "inner lives" of its characters.
In his classic discussion of the matter in The Rise of the Novel Ian Watt does far more than present that view, instead getting to the root of it. As he makes clear, the ancient and Medieval tendency had been to think of the "universal" and "timeless" as real, and particularities as ephemeral and meaningless. That earlier logic had authors striving to portray the universal above all, serving up characters who are "characteristic" types down to their names, with time and place often conceived imprecisely, and with the authors' ornamentation of the portrayal the principal object and measure of their skill.
By contrast, the eighteenth century saw writers aspiring to tell the stories of particular individuals, in particular places and times--and indeed, the unfolding of the events of their lives through cause-and-effect sequences over time. That shift, in turn, was the reason for the plainness of style--the prioritization of a denotative, descriptive use of language, with a lavishness of detail taking the place of the old lavishness of ornament. He stresses, too, how this came together in a genre of private accounts, read privately, permitting an "intimacy" unattainable in other, publicly performed and publicly enjoyed kinds of work (plays, poems)--an intimacy that both extended the bounds of what seemed permissible in art and allowed a new intensity of audience identification with protagonists.
All of these features--the strong sense of particularity, time and cause and effect; the use of language as description rather than ornament; the private, "intimate" novel-reading experience--seem implausible outside the rise of a rationalistic, individualistic, science-touched Modernity that afforded individuals a meaningful range of economic choices (a capitalism, increasingly thoroughgoing and increasingly industrialized), and sanctioned them ideologically (Puritan ideas about salvation, secular Enlightenment thought). And indeed, Watt's classic is best remembered for its stress on the historical context in which the novel emerged, which reflects a good deal of what C. Wright Mills was to call "the sociological imagination."
Appropriately, the book begins with a good deal of historical background, specifically two chapters regarding the philosophical developments underlying the rise of what we think of as "realism," and a reconstruction of the eighteenth century reading public from the concrete facts of the era. In these chapters the territory ranges from the epistemology of John Locke to the nitty-gritty of how many and who would have had the time and money to buy this kind of reading material and a place in which to read it in that private way he describes (delving into prices and incomes and the rest). Afterward, when turning to the founding authors and works themselves--Daniel Defoe's Robinson Crusoe and Moll Flanders, Samuel Richardson's Pamela and Clarissa, Henry Fielding's Tom Jones--he continues to draw on historians' understanding of aspects of the period ranging from the rise of Grub Street to the rise of suburbia, and how these factored into the authors' creative work by way of their personal backgrounds, and quite profitably. (Wary as I tend to be of the biographical approach, I found it impossible to come away from this study thinking it accidental that Richardson, a retiring, low-born, Low Church suburbanite earnest about "middle class morality," wrote epistolary novels about domestic themes, while Fielding, a robust Tory squire who attended Eton with William Pitt, served up Jones' picaresque adventure.)
Watt is all the more effective and interesting for his attentiveness to the finer points of form, so much more difficult to discuss than content, and to the ways in which form, too, was remade by the practical imperatives of the writers' business. (That exhaustive descriptiveness, and the need of writers to make themselves understood to readers of a less certain educational level--both connected with their selling to a wide public rather than catering to an elite patron--combined with per-word pay rates to put a premium on prolixity.)
All this is considerably enriched by a good many observations of wider literary significance. Not the least of these concerns the tendency of writers to essentially write themselves when presenting a protagonist, especially in a first-person narrative. ("Defoe's identification with Moll Flanders was so complete that, despite a few feminine traits, he created a personality that was in essence his own" (115).) Interesting, too, is his quip about an aspect of the style of fiction Pamela helped pioneer that is far less likely to be acknowledged by any critic in our day who wishes to retain mainstream respectability. ("[T]he direction of the plot . . . outrageously flatters the imagination of the readers of one sex and severely disciplines that of the other" (153-154)--the female and male sexes, respectively.)
Indeed, the breadth of his vision and the scope of his interest are such that some of his more interesting remarks have nothing to do with eighteenth century fiction at all. (As Watt observes, Shakespeare, the "inventor of the human," is less a modern than a Medieval--similarly different in his thought about time and causality, so that these come across as loosely handled, and different in his thought about language, too, so that the propensity for the purple prevails over the comprehensible in his poetry--some of the reasons why he is less accessible and compelling to most of us than we think he is "supposed to be.") Watt is attentive, too, to the trade-offs that writers have to make when they actually create something--like that between plot and character (the one as a practical matter attended to in "inverse proportion" to the other (279))--and to many of the pitfalls of literary criticism in his time, and our own. ("Coleridge's enthusiasm" as critic, he remarks in one instance, "may . . . serve to remind us of the danger . . . of seeing too much" in a work (120).)
As the cited passages indicate, Watt, for all his richness in insight, is also extraordinarily accessible--partly because, in comparison with our pretentiously and trivially theoretical, jargon-laden contemporary criticism, Watt is clearly focused on his subject and straightforward in his communication, and partly because he is exceptionally gifted as a wordsmith himself. Altogether this makes the book a key work for anyone trying to understand not just eighteenth century literature, or the novel that remains the central fictional form of today, but literature in general.
Thursday, July 19, 2018
The Best Words Ever Written About "Human Nature"
Back in tenth grade I was first introduced to F. Scott Fitzgerald, by way of The Great Gatsby. I wasn't all that impressed with the book at the time, but I later came to be more appreciative--in part, I think, because I happened to run across Fitzgerald's earlier book This Side of Paradise.
The book has its quirks and limitations--its at times ostentatiously Modernist experimentation, for one. Still, I was impressed with some of its more substantial exchanges, which happen to include, as the title of this blog post implies, the last of them, where the protagonist Amory Blaine debates the Social Question with an acquaintance of his dead college friend's father. Said acquaintance, dismissing socialism, speaks of "'certain things which are human nature' . . . with an owl-like look, 'which always have been and always will be, which can't be changed.'"
Amory's response, after looking in astonishment from one man to the other, is that he "can name offhand over one hundred natural phenomena that have been changed by the will of man--a hundred instincts in man that have been wiped out or are now held in check by civilization," and that to imply that such things cannot be changed is not only false but "a flat impeachment of all that's worth while in human nature" that "negates the efforts of every scientist, statesman, moralist, reformer, doctor, and philosopher that ever gave his life to humanity's service." Indeed, far from being the argument-ender the speaker with the owl-like look thought it was, it was in Amory's view "the last refuge of the associated mutton-heads of the world," such that any individual "over twenty-five years old who makes that statement in cold blood ought to be deprived of the franchise."
Alas, a hundred years later we are still apt to hear that inane use of those two words, "human nature," most dismayingly of all, from ostensible progressives. But at the same time, Fitzgerald's reply to it still stands.
"Geography, Technology and the Flux of Opportunity"
Back when I wrote the first edition of The Many Lives and Deaths of James Bond I included in the appendix a brief essay discussing Britain's extraordinary industrial, imperial and military predominance at its nineteenth century peak, and its subsequent passing over the following hundred years. Basically what it came down to was the spread of industrialization as other nations increasingly consolidated. One result was larger-scale industrialized states, another was states which evolved a more sophisticated form of industrialization, and still another was states which combined both features, epitomized by the United States. The balance of economic power shifted in a hurry, the balance of military power with it. Meanwhile the declining acceptability of colonial rule, and the cost of two world wars, accelerated the unraveling of Britain's empire, but the economic change came first and foremost.
Given my historical and other interests I did much more reading and thinking about the issue, unavoidably rethinking what I wrote earlier (just why did others make a bigger success of the "Second" Industrial Revolution, for example?), and writing more. Initially I intended to produce a longer, better-grounded version of the piece in my appendix, later a few short pieces, but it eventually blew up into the paper I have just published through SSRN, "Geography, Technology and Opportunity: The Rise and Decline of British Economic Power" (the second revision of which is now up). About 74,000 words long, it is less an appendix than a book in itself--which, I suppose, offers the same explanation in the end. With (I hope) more rigor, in detail and depth (unexpectedly I found myself coping with matters ranging from the comparative phosphorus content of different iron ores to waterway-territory ratios in countries around the world to the finer points of the 1965 National Plan), but nonetheless, the same essential explanation.
If nothing else, I've done a fair job of convincing myself.
Wednesday, July 18, 2018
Twentieth Century Britain
Originally published as part of THE MANY LIVES AND DEATHS OF JAMES BOND
At the end of the Napoleonic Wars in 1815, Britain was the only industrialized country in the world, and the only naval and colonial power of any weight. London was also the undisputed financial capital of the world. All of this meant not just British dominance in the world, but easy dominance by a wide margin, enabling Britain to run its empire "on a shoestring" as many a historian has put it.
However, the situation was clearly beginning to change by the 1860s, due to three interconnected developments. The first was a profound change in the terms of economic competition. Industrialization was spreading and deepening elsewhere. Indeed, a "Second" Industrial Revolution based on the new technologies of oil, electricity and chemicals was getting underway, while the characteristic sectors of the "First" Industrial Revolution, like coal, steam-powered machinery and textiles, were going into decline. This tendency not only diminished the value of Britain's early lead, but threatened to make that early lead a liability as other countries began from scratch with the newest plant and newest practices.
The second was a series of geopolitical changes driven by nationalism and a revived imperialism that made some comparatively minor states consolidate themselves into larger and more formidable powers, and made other, established powers appear more threatening. The unification of Germany disrupted the old balance of power on the European continent. At the same time, the consolidation of the United States in the Civil War and its expansion to the Pacific and then overseas in the following decades, and Japan's emergence from its isolation to become a naval and colonial power, changed the balance of power in the Western hemisphere and the Far East. Meanwhile, France and Russia enlarged their empires, acquiring territories abutting British possessions in eastern Africa and southern Asia.
The third trend was likewise driven by nationalism, but within Britain's empire. At its height that empire ruled territories on which "the sun never set," encompassing a quarter of the world's land area – but the very diversity and dispersal of its subjects also exposed it to the centrifugal forces that wrested much of North America away from it in the American Revolution. Now nationalism was spreading in other, settler-dominated states like Canada and Australia, as well as nations like India where small numbers of British officials and soldiers ruled over vast populations of native subjects treated as a racial and religious "Other."
Together these three trends raised the prospect of Britain being just one industrialized country among many instead of the unique "workshop of the world" it had earlier been; the metropole increasingly being outweighed by other powers which were both bigger and more dynamic; and the colonies ceasing to be a support to Britain's position. Essentially, it meant the reduction of Britain to being merely one of Europe's larger countries, and the first steps in this transition were already apparent at this time. First the United States, and then Germany, overtook Britain as manufacturers, while the country's commodity trade surpluses turned into deficits, covered by Britain's income from financial services and overseas investments. As this was happening, the growth of several foreign navies, especially those of Germany, Japan and the U.S., eroded Britain's confidence in its ability to project power globally. In the case of Germany, it even raised concerns about Britain's ability to protect itself against invasion (worries reflected in the flourishing invasion story genre). Britain consequently looked to alliances to protect interests it had once been able to provide for by itself, coming to terms with France and Russia (and concluding an alliance with Japan to secure its Asian interests) to better concentrate its military resources against the perceived German threat.
When war did come in 1914, the "Triple Entente" did of course prove victorious. Britain's primary challenger, Germany, was defeated and disarmed. Additionally, Britain gained new imperial acquisitions in the Middle East and central and eastern Africa, while South Africa, Australia and New Zealand also received former German colonies as mandates. Yet the war was costly in ways beyond the bill for the fighting. The mobilization of the economy for total war meant that all production went to the war effort – exports collapsing, while imports skyrocketed (the more so because of what Britain's declining industrial economy was failing to produce at home, from steel to electrical goods). At the same time, world finance suffered amid the fighting, while much of the revenue-earning merchant fleet was sunk by German U-boats, forcing Britain to import shipping services too.
Covering the difference meant selling off foreign assets and raising loans abroad, while the overall national debt grew to twelve times what it had been at the war's start. Meanwhile, many of the country's old markets had been snapped up during the conflict by its own, less-taxed Allies, like the United States and Japan. And matters were made worse by the worldwide slump that followed the war, as wartime demand vanished. Indeed, the gold standard that had been a pillar of the pre-war world economy was suspended in 1919, a major step toward the City of London's ceding its financial predominance to Wall Street, while stagnation prevailed through the 1920s.
Economic weakness, naturally, manifested itself in military weakness. America's size and wealth meant that Britain's days as the leading naval power were fast running out, while the country could not fight a major power without at least the "benevolent neutrality" of the U.S. – a fact which quickly had practical consequences, starting with Britain's not renewing its alliance with Japan.
This reinforced, and was reinforced by, Britain's weakening grip on its colonies. Independence movements grew more formidable from India to Ireland, where a successful revolt forced Britain to accept the country's independence in 1922. Even the Dominions increasingly asserted their independence, Canada refusing to back Britain in the Chanak Crisis that same year, when Allied troops at the Dardanelles confronted Mustapha Kemal's forces. The Allies ended up backing down in that crisis, with the result that Germany's wartime ally Turkey overturned the terms imposed on it at the end of the conflict – a humiliation that ended the career of Prime Minister Lloyd George.
The onset of the Great Depression in 1929 worsened matters. The older sectors on which British industry remained dependent (steel, textiles, shipbuilding) were particularly hard-hit, and the crisis put an end to the gold standard for good in 1931. That same year Britain recognized the independence of the Dominions of Canada, South Africa, Australia and New Zealand under the Statute of Westminster (confirming Britain's decreasing ability to rely on their significant resources). Germany, Italy and the Soviet Union appeared in the ascendant in these years as Britain remained mired in outdated economic policies at home (responding to depression with austerity). Abroad, Britain backed down in confrontations with Italy over its invasion of Ethiopia, and with Germany over its remilitarization and expansionism (the enlargement of the German navy in 1935, the remilitarization of the Rhineland in 1936, the country's unification with Austria in 1938 and dismemberment of Czechoslovakia in 1938 and 1939). World War II came despite this policy of "appeasement," and after the Battle of France, left the British Empire alone for nearly a year against Germany and its allies, until German actions – its invasion of the Soviet Union, and declaration of war on the U.S. after Pearl Harbor – brought both these powers into the war against it as full-scale belligerents.
The End of Empire
At the end of the war in 1945 Britain looked stronger than ever. Germany was defeated again, far more decisively than it had been in World War I, so that British troops were in Berlin, while Italy and Japan were similarly defeated and under Allied occupation – three of the great threats of the interwar period to Britain and its empire neutralized. Britain also possessed military forces larger than ever before in its history, nearly five million of its people in uniform, distributed around the world to an extent that had never been seen as they occupied an assortment of French, Italian and Dutch colonies from Libya to Vietnam. Reflecting this position, Britain appeared at the wartime conferences of the Allies in Tehran, Yalta and Potsdam as one of the "Big Three."
Nonetheless, unlike the U.S., which realized its enormous latent power in the conflict, Britain exhausted its own power, the demands of the conflict exacerbating an already weak international position. Again British trade was dislocated. Again the shipping fleet was savaged by U-boats. Again foreign investments had to be sold off to cover expenses, this time half of them (compared with just a thirteenth in World War I). And as if this were not all enough, German bombing inflicted heavy physical damage on the country.
In all, one-quarter of the country's remaining wealth was gone by war's end, while the debt load had increased, markets had again been lost, and the country was left insolvent, even before it could demobilize its forces and its economy, rebuild an industrial plant run down by the sacrifice of all investment and maintenance to all-out production, and try to start paying its own way in the world again. Indeed, coming on top of the massive net wartime borrowing from the United States (and Canada), in 1946 the government had to take out yet another multi-billion dollar American loan to keep its head above water.
Besides damaging British resources, the war also damaged British prestige, particularly in Asia, where many of its colonies were conquered and occupied by Japanese troops, including Hong Kong, Malaya, Singapore and Burma. These losses destroyed Britain's credibility as a military power in the area, while strengthening local independence movements (the movement in Burma was in control of the country when British troops returned), further undermining what remained of Britain's ability to control them. India (which at the time included present-day Pakistan and Bangladesh) was independent by 1947, Burma by 1948, Ceylon by 1949.
Nonetheless, withdrawal from these territories was initially conceived as part of a restructuring and rationalization of the empire, rather than its winding up. There were, for instance, hopes that Africa might constitute a "third" British Empire (replacing the "second" empire that was India, which in its turn had replaced the "first" American empire Britain lost in the Revolutionary War). Local opposition, and British weakness, made this impracticable, and by 1960 Harold Macmillan was speaking of the "Wind of Change." Nigeria, the Southern Cameroons, British Somaliland and Somalia gained their independence that year, Sierra Leone and Tanganyika the next. Kenya, which some had hoped would become a second South Africa, but which instead saw the Mau Mau revolt, was independent by 1963, as was Zanzibar, and all of Britain's remaining African possessions in the next few years.
With these colonies went both a crucial claim to Britain's great power status, and a crucial foundation of it, as they had continued to provide important props to the metropole's economy (like captive markets) and significant additions to it (like supplementary military manpower). The Commonwealth of Nations, despite all the hopes held out for it, proved no substitute for empire of any kind, and did little if anything to slow the drift of even Canada and Australia away from Britain toward the U.S. in their security and trading arrangements.
Naturally, Britain could not compete with the far vaster, and more fully mobilized, economic and military resources of the United States and the Soviet Union, the new "superpowers." Additionally, the reach of both the U.S. and the Soviet Union penetrated into Europe far more deeply and extensively by 1945 than had been the case in 1939, epitomized by the boots they had on the ground down through the middle of the continent. Even at sea, where Britain had still commanded the world's largest fleet at the war's start, it was eclipsed by the United States Navy during the conflict, without ever coming close to catching up. Not long after the war, it was to become increasingly clear that Britain could not even compete qualitatively with the superpowers in techno-military areas like nuclear weaponry, aerospace and blue-water naval capabilities. (The country's nuclear deterrent was to be founded on Polaris ballistic missiles imported from the U.S., its newer fighters like the Tornado built through international agreements rather than an independent British industry, its last fleet carrier retired in 1979.)
The result was that while Britain had historically been the "offshore balancer" in European politics, preventing any one country from growing powerful enough to dominate the continent by throwing its weight to this side or that in a conflict (as when it joined the coalitions against France in the Napoleonic era), it had largely been supplanted in that role by the United States. The situation was much the same elsewhere in the world. Britain continued to play something like its old role in the Middle East and Southeast Asia through the 1950s and 1960s through client regimes, basing arrangements and the like. However, it quit Palestine in 1948, and lost its Egyptian client King Farouk just four years later – and then, with Gamal Abdel Nasser's rise to power, its biggest overseas military base, the facility at Suez. Britain's attempt to reestablish its earlier position in the country by joining with France and Israel to overthrow Nasser, quashed by Soviet and American pressure, only underlined Britain's inability to act independently. Just two years later another client regime, the Hashemite monarchy, fell in Iraq, ending Britain's anemic attempt to organize collective security in the region under the "Baghdad Pact." In 1963 British troops faced an insurgency in the strategic colony of Aden, which led to British withdrawal from the territory four years later.
Britain did not suffer such dramatic setbacks in Southeast Asia, the Malayan Emergency and the "Konfrontasi" with Indonesia that followed being generally chalked up as successes. However, by the late 1960s the maintenance of a military presence there proved more financially costly than the country could afford, leading to the 1968 announcement that, except for the retention of Hong Kong, Britain would withdraw "east of Suez" by 1971, an intention on which it soon made good.
There is no question about what these adjustments meant in practical terms, but there is considerable debate over what they meant in cultural and political terms at home. Some historians view them as having been relatively painless, and compare the process favorably with France's convulsions over Algeria at the same time. At the other end of the spectrum are those who believe British culture has never quite come to terms with the fact, even now, three generations after World War II's end – that the country never quite moved on, the old pretensions turning up in the most inappropriate places. As Jeremy Paxman put it last year,
When a British prime minister puffs out his chest and declares he "will not tolerate" some African or Middle Eastern despot, he speaks not as a creature of a 21st-century political party in a dilapidated democracy but as the latest reincarnation of Castlereagh or Palmerston – somehow, British foreign policy has never shaken off a certain 19th-century swagger, and the implied suggestion that, if anything happens to a British citizen, a Royal Navy gunboat will be dispatched to menace the impertinent perpetrators.
Nonetheless, wherever the truth lies, there is no doubt that at the time there was a significant current, especially in elite opinion, which was unhappy with the change.
Change at Home
While Britain experienced these changes in its international standing, British society was in flux domestically. In the years prior to World War I, the classical liberal consensus which had prevailed in Britain was challenged by growing pressure from increasingly strong labor and feminist movements, and by the dispute over the status of Ireland. The outbreak of the war only deferred these issues, however, the conflicts over them continuing in such events as the successful breakaway of Ireland; the decline of the Liberal Party to third-party status, while the Labour Party formed its first government in 1924; and the General Strike of 1926.
During and after World War II the discrediting of the Conservative establishment by its mismanagement of the early part of the war (epitomized by Dunkirk, viewed at the time not as some glorious "miracle," but as a catastrophic defeat); the failures of conservative, free-market economic orthodoxy in the 1930s, and the successes of government planning during the war, on the home front as well as on the battlefield; and the war's effect in creating a sense of national togetherness, idealism, and promise of "broad sunlit uplands" ahead – all contributed to a vision of a more egalitarian post-war Britain as a goal of the conflict, epitomized by the wide support for the wartime Beveridge Report, which called for "a comprehensive policy of social progress."
In the July 1945 general election, the Conservative government of Winston Churchill was swept out of power, and replaced with a Labour government under Clement Attlee with a broad reform mandate. This extended beyond the treatment of full employment as a national economic priority, to labor-friendly legislation (like the Trade Disputes and Trade Unions Act of 1946, which widened the powers of organized labor), the establishment of a "cradle-to-grave" welfare state (with such measures as the National Insurance Act of 1946, and the launch of the National Health Service in 1948), and even the nationalization of key sectors of the economy (including the Bank of England, the coal, gas and steel industries, and the railways). These policies ultimately became the foundation of the "social market" or "welfare" capitalism that characterized the "postwar consensus."
As economic policy the consensus proved a significant success. Despite all the difficulties previously described, the post-World War II era was a period of rapid economic growth and rising living standards for Britons as a whole – of prosperity unexampled in the country's history – such that many historians have dubbed it a "golden age" for the nation. Nonetheless, like all compromises, it left many dissatisfied at both ends of the political spectrum. On the left were those who had hoped that the policies of the Labour government of 1945-1951 would prove just the beginning of a movement toward deeper change. On the right were those who felt these changes had already gone too far, or even that any movement in that direction was too far, opposed as they were to such "leveling" tendencies on principle. (Author Evelyn Waugh famously remarked that during the years of the Attlee government the country felt to him like it was "under enemy occupation.")
These economic and political changes were accompanied by socio-cultural changes, evident in everything from personal relations (from a new working-class assertiveness to the Sexual Revolution) to the arts (shaken up by everything from the arrival of the "Angry Young Men" in British letters to rock music). And all of these interacted with those currents in opinion that lamented Britain's lost imperial status, with many of those who resented that loss seeing the domestic changes as a cause, pointing to such things as generation gaps and rising labor militancy as culprits. All this lent a new edge to the age-old lament of the privileged that the young and the lower social orders did not "know their place," and to that other age-old lament of anxious conservatives that "things are going wrong," but it seemed to many that all they could do was wax nostalgic for a romanticized version of the past.
After 1973
Britain's shift from world-spanning empire and international hegemon to "normal" country had largely run its course by the 1970s. Nonetheless, the decade saw other problems intensify as the post-war boom came to an end in Britain, just as it was doing in the rest of the world. The downturn saw increases in inflation, unemployment and fiscal strain, such that the country took a loan from the International Monetary Fund in 1976. The times took their toll on labor relations as unions clamored for pay increases in line with rising prices. The response of some observers to these events was hysteria, exemplified by the rise of organizations like GB 75, intended to combat trade unionists if they got too far out of line, and by the rumored coup plot against Prime Minister Harold Wilson from within his own government. The decade's troubles culminated in the "winter of discontent" of 1978-1979.
The period's troubles also created a significant opportunity for right-wing opponents of the post-war consensus, with Prime Minister Margaret Thatcher making history with the implementation of a neoliberal economic program. This aimed at reviving the economy by making the country more business-friendly: privatizing state enterprises (like those nationalized under Attlee), deregulating business activity (as in the famous "Big Bang" deregulation of British financial markets), chipping away at the welfare state (as with the 1986 Social Security Act) and suppressing organized labor (as in the Employment Act of 1980, which targeted closed shops and picketing), alongside monetarist policies aimed at controlling inflation.
The actual accomplishments of these policies were meager, their effectiveness at taming inflation and unemployment spotty at best. United Nations statistics on British GDP, when adjusted for population growth and inflation, show that even with the help of North Sea oil, growth in the eighteen years of Conservative rule that began with Thatcher (1979-1997) was only slightly better than what the country enjoyed in the much-maligned 1973-1979 period (1.9 percent, compared with the earlier 1.4 percent a year), and a far cry from its pace in the '50s and '60s. These years also saw the continued decay of British manufacturing, and along with it, a worsening balance of payments situation as sizable trade deficits in manufactured goods became a way of life.
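A rough sense of how modest that gap is in cumulative terms can be had by simply compounding the two quoted annual rates; the short Python sketch below is purely illustrative, using nothing but the 1.9 and 1.4 percent figures cited above (applied, for comparison's sake, over the same eighteen-year span) rather than the underlying UN GDP series.

# Illustrative arithmetic only: compound the two annual growth rates cited
# above (1.9% for 1979-1997, 1.4% for 1973-1979) over an 18-year span to
# show the size of the cumulative difference. Not a reproduction of the
# underlying UN GDP data.
years = 18
for label, rate in [("1.9% a year (1979-1997)", 0.019), ("1.4% a year (1973-1979)", 0.014)]:
    total = (1 + rate) ** years - 1
    print(f"{label}: about {total:.0%} total growth over {years} years")
# Prints roughly 40% versus 28% cumulative growth.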
The result was that such growth in Gross Domestic Product as did occur was hollow. Moreover, the harm such policies did to the less affluent made Thatcherism deeply divisive in Britain, in a way that contemporary "Reaganomics" has not been for the United States. However, this "post-postwar consensus" continued to prevail through the "New" Labour premierships of Tony Blair (1997-2007) and Gordon Brown (2007-2010). The 2008 economic downturn, which internationally has contributed to some of the deepest questioning of neoliberal economics seen in a long time, has instead pushed the country farther along this path, as the British government responded to recession in the same way that it did in the 1930s, with austerity measures.
The Thatcher era is also associated with a more assertive Britain on the world stage. Nonetheless, while British Prime Ministers sometimes spoke as if their country were still a superpower, the country's economic weight only continued to decline. Quickly as Britain grew in the post-war years, Japan, West Germany and France grew even faster, and all eventually overtook it in the area of economic output. Unsurprisingly, Britain was most prominent when acting in concert with the U.S., as when it allowed the U.S. to station intermediate-range nuclear forces in the country in the 1980s, cooperated in the American bombing of Libya in 1986, and committed more troops than any other of the U.S.'s NATO allies to Desert Shield and Desert Storm in 1990-1991. Even when Britain did pursue its own objectives, as in the 1982 Falklands War, it proved heavily reliant on the assistance of the United States (like its provision of the latest Sidewinder missiles for its Harriers, reportedly crucial to winning the air war over the islands).
The end of the Cold War, and the years that followed, saw the continuation of these trends, as the reunification of Germany and the closer integration of the European Union – a process from which Britain kept aloof – changed the distribution of economic power in its region. Globally, there was also a shift in world output and the balance of trade to East Asia, with the rise of Japan followed by China's emergence as an economic colossus, while India, Brazil and Russia have also, according to certain measures, overtaken Britain in the area of GDP. In the military sphere, defense cutbacks following the dissolution of the Soviet Union, and the 2008 economic crisis, have only further diminished Britain's capacity for long-range military action, as demonstrated in the Anglo-French intervention in Libya in 2011 – ultimately dependent on the backing of the U.S. for its success.
Today Britain still enjoys a number of great power trappings, like its permanent seat on the United Nations Security Council. The City of London remains an international financial center, in the view of some, of comparable importance to Wall Street. Yet Britain's strongest claim to global influence in the post-war era may actually be British culture and media, reflected in the status of the BBC and Reuters, the Beatles and Mr. Bean, and of course, James Bond himself.
At the end of the Napoleonic Wars in 1815, Britain was the only industrialized country in the world, and the only naval and colonial power of any weight. London was also the undisputed financial capital of the world. All of this meant not just British dominance in the world, but easy dominance by a wide margin, enabling Britain to run its empire "on a shoestring" as many a historian has put it.
However, the situation was clearly beginning to change by the 1860s, due to three interconnected developments. The first was a profound change the terms of economic competition. Industrialization was spreading and deepening elsewhere. Indeed, a "Second" Industrial Revolution based on the new technologies of oil, electricity and chemicals was getting underway, while the characteristic sectors of the "First" Industrial Revolution, like coal, steam-powered machinery and textiles, were going into decline. The tendency not only diminished the value of Britain's early lead, but threatened to make that early lead a liability as other countries began from scratch with the newest plant and newest practices.
The second was a series of geopolitical changes driven by nationalism and a revived imperialism that made some comparatively minor states consolidate themselves into larger and more formidable powers, and made other, established powers appear more threatening. The unification of Germany disrupted the old balance of power on the European continent. At the same time the consolidation of the United States in the Civil War, and its expansion to the Pacific and then overseas in the following decades; and Japan's emergence from its isolation to become a naval and colonial power; changed the balance of power in the Western hemisphere and the Far East. Meanwhile, France and Russia enlarged their empires, acquiring territories abutting British possessions in eastern Africa and southern Asia.
The third trend was likewise driven by nationalism, but within Britain's empire. At its height that empire ruled territories on which "the sun never set," encompassing a quarter of the world's land area – but the very diversity and dispersal of its subjects also made it subject to the centrifugal forces that wrested much of North America away from it in the American Revolution. Now nationalism was spreading in other, settler-dominated states like Canada and Australia, as well as nations like India where small numbers of British officials and soldiers ruled over vast populations of native subjects treated as a racial and religious "Other." Together these three trends raised the prospect of Britain being just one industrialized country among many instead of the unique "workshop of the world" it had earlier been; the metropole increasingly being outweighed by other powers which were both bigger and more dynamic; and the colonies ceasing to be a support to Britain's position. Essentially, it meant the reduction of Britain to being merely one of Europe's larger countries, and the first steps in this transition were already apparent at this time. First the United States, and then Germany, overtook Britain as manufacturers, while the country's commodity trade surpluses turned into deficits, covered by Britain's income from financial services and overseas investments. As this was happening, the growth of several foreign navies, especially those of Germany, Japan and the U.S., eroded Britain's confidence in its ability to project power globally. In the case of Germany, it even raised concerns about Britain's ability to protect itself against invasion (worries reflected in the flourishing invasion story genre). Britain consequently looked to alliances to protect interests it had once been able to provide for by itself, coming to terms with France and Russia (and concluding an alliance with Japan to secure its Asian interests) to better concentrate its military resources against the perceived German threat.
When war did come in 1914, the "Triple Entente" did of course prove victorious. Britain's primary challenger, Germany, was defeated and disarmed. Additionally, Britain gained new imperial acquisitions in the Middle East and central and eastern Africa, while South Africa, Australia and New Zealand also received former German colonies as mandates. Yet, the war was costly in ways beyond the bill for the fighting. The mobilization of the economy for total war meant that all production went to the war effort – exports collapsing, while imports skyrocketed (the more so because of what Britain's declining industrial economy was failing to produce at home, from steel to electrical goods). At the same time, world finance suffered amid the fighting, while much of the revenue-earning merchant fleet was sunk by German U-boats, forcing Britain to import even in this area too.
Covering the difference meant selling off foreign assets and raising loans abroad, while the overall national debt was twelve times bigger than it had been at the war's start. Meanwhile, many of the country's old markets had been snapped up during the conflict by its own, less-taxed Allies, like the United States and Japan. And matters were the worse because of the worldwide slump that followed the war, with wartime demand vanishing. Indeed, the gold standard that had been a pillar of the pre-war world economy was suspended in 1919, a major step toward Britain's ceding the financial predominance of the City of London to Wall Street, while stagnation prevailed through the 1920s.
Economic weakness, naturally, manifested itself in military weakness. America's size and wealth meant that Britain's days as the leading naval power were fast running out, while the country could not fight a major power without at least the "benevolent neutrality" of the U.S. – a fact which quickly had practical consequences, starting with Britain's not renewing its alliance with Japan.
This reinforced, and was reinforced by, Britain's weakening grip on its colonies. Independence movements grew more formidable from India to Ireland, where a successful revolt forced Britain to accept the country's independence in 1923. Even the Dominions increasingly asserted their independence, Canada refusing to back Britain in the Chanak Crisis that same year, when Allied troops at the Dardanelles confronted Mustapha Kemal's troops. The Allies ended up backing down in that crisis, with the result that Germany's wartime ally Turkey overturned the terms imposed on it at the end of the conflict – a humiliation that ended the career of Prime Minister Lloyd George.
The onset of the Great Depression in 1929 worsened matters. The older sectors on which British industry (steel, textiles, shipbuilding) remained dependent were particularly hard-hit, and put an end to the gold standard for good in 1931. That same year Britain recognized the independence of the Dominions of Canada, South Africa, Australia and New Zealand under the Statute of Westminster (confirming Britain's decreasing ability to rely on their significant resources). Germany, Italy and the Soviet Union appeared in the ascendant in these years as Britain remained mired in outdated economic policies at home (responding to depression with austerity). Abroad, Britain backed down in confrontations with Italy over its invasion of Ethiopia, and German over its remilitarization and expansionism (the enlargement of the German navy in 1935, the militarization of the Rhineland in 1936, the country's unification with Austria in 1938 and dismemberment of Czechoslovakia in 1938 and 1939). World War II came despite this policy of "appeasement," and after the Battle of France, left the British Empire alone for nearly a year against Germany and its allies, until German actions – its invasion of the Soviet Union, and declaration of war on the U.S. after Pearl Harbor – brought both these powers into the war against it as full-scale belligerents.
The End of Empire
At the end of the war in 1945 Britain looked stronger than ever. Germany was defeated again, far more decisively than it had been in World War I, so that British troops were in Berlin, while Italy and Japan were similarly defeated and under Allied occupation – three of the great threats of the interwar period to Britain and its empire neutralized. Britain also possessed military forces larger than ever before in its history, nearly five million of its people in uniform, distributed around the world to an extent that had never been seen as they occupied an assortment of French, Italian and Dutch colonies from Libya to Vietnam. Reflecting this position, Britain appeared at the wartime conferences of the Allies in Tehran, Yalta and Potsdam as one of the "Big Three."
Nonetheless, unlike the U.S., which realized its enormous latent power in the conflict, Britain exhausted its own power, the demands of the conflict exacerbating an already weak international position. Again British trade was dislocated. Again the shipping fleet was savaged by U-boats. Again foreign investments had to be sold off to cover expenses, this time half of them (compared with just a thirteenth in World War I). And as if this were not all enough, German bombing inflicted heavy physical damage on the country.
In all, one-quarter of the country's remaining wealth was gone by war's end, while the debt load had increased, markets had again been lost, and the country left insolvent, even before it could demobilize its forces and its economy, rebuild an industrial plant run-down by the sacrifice of all investment and maintenance to all-out production, and try to start paying its own way in the world again. Indeed, coming on top of the massive net wartime borrowing from the United States (and Canada), in 1946 the government had to take out yet a new multi-billion dollar American loan to keep its head above water.
Besides damaging British resources, the war also damaged British prestige, particularly in Asia, where many of its colonies were conquered and occupied by Japanese troops, including Hong Kong, Malaysia, Singapore and Burma. These losses destroyed Britain's credibility as a military power in the area, while strengthening local independence movements (in Burma, in control of the country when British troops returned), further undermining what remained of Britain's ability to control them. India (which at the time included present-day Pakistan and Bangladesh) was independent by 1947, Burma by 1948, Ceylon by 1949.
Nonetheless, withdrawal from these territories was initially conceived as part of a restructuring and rationalization of the empire, rather than its winding up. There were, for instance, hopes that Africa might constitute a "third" British Empire (replacing the "second" empire that was India, which in its turn had replaced the "first" American empire Britain lost in the Revolutionary War). Local opposition, and British weakness, made this impracticable, and by 1960 Harold Macmillan was speaking of the "Wind of Change." Nigeria, the Southern Cameroons, British Somaliland and Somalia gained their independence that year, Sierra Leone and Tanganyika the next. Kenya, which some had hoped would become a second South Africa, but which instead saw the Mau Mau revolt, was independent by 1963, as was Zanzibar, and all of Britain's remaining African possessions in the next few years.
With these colonies went both a crucial claim to Britain's great power status, and a crucial foundation of it, as they had continued to provide important props to the metropole's economy (like captive markets) and significant additions to it (like supplementary military manpower). The Commonwealth of Nations, despite all the hopes held out for it, proved no substitute for empire of any kind, and did little if anything to slow the drift of even Canada and Australia away from Britain toward the U.S. in their security and trading arrangements.
Naturally, Britain could not compete with the far vaster, and more fully mobilized, economic and military resources of the United States and the Soviet Union, the new "superpowers." Additionally, the reach of both the U.S. and the Soviet Union penetrated into Europe far more deeply and extensively by 1945 than had been the case in 1939, epitomized by the boots they had on the ground down through the middle of the continent. Even at sea, where Britain had still commanded the world's largest fleet at the war's start, it was eclipsed by the United States Navy during the conflict, without ever coming close to catching up. Not long after the war, it was to become increasingly clear that Britain could not even compete qualitatively with the superpowers in techno-military areas like nuclear weaponry, aerospace and blue-water naval capabilities. (The country's nuclear deterrent was to be founded on Polaris ballistic missiles imported from the U.S., its newer fighters like the Tornado built through international agreements rather than an independent British industry, its last fleet carrier retired in 1979.)
The result was that while Britain had historically been the "offshore balancer" in European politics, preventing any one country from growing powerful enough to dominate the continent by throwing its weight to this side or that one in a conflict (as when it joined the coalitions against France in the Napoleonic era), it had largely been supplanted in that role by the United States. The situation was much the same elsewhere in the world. Britain continued to play something like its old role in the Middle East and Southeast Asia through the 1950s and 1960s through client regimes, basing arrangements and the like. However, it quit Palestine in 1948, and lost its Egyptian client King Farouk just four years later – and when he was succeeded by Gamal Nasser, Britain's biggest overseas military base, its facility at Suez. Britain's attempt to reestablish its earlier position in the country by joining with France and Israel to overthrow Nasser, quashed by Soviet and American pressure, only underlined Britain's inability to act independently. Just two years later another client regime, the Hashemite monarchy, fell in Iraq, ending Britain's anemic attempt to organize collective security in the region under the "Baghdad Pact." In 1963 British troops faced an insurgency in the strategic colony of Aden, which led to British withdrawal from the country four years later.
Britain did not suffer such dramatic setbacks in Southeast Asia, the Malayan Emergency and the "Konfrontasi" with Indonesia that followed generally chalked up as successes. However, by the late 1960s the maintenance of a military presence there proved more financially costly than the country could afford, leading to the 1968 announcement that except for the retention of Hong Kong, Britain would withdraw "east of Suez" by 1971, an intention on which it soon made good.
There is no question about what these adjustments meant in practical terms, but there is considerable debate over what they meant in cultural and political terms at home. Some historians view them as having been relatively painless, and compare the process favorably with France's convulsions over Algeria at the same time. At the other end of the spectrum are those who believe British culture has never quite come to terms with the fact, even now three generations after World War II's end; that the country never quite moved on, the old pretensions turning up in the most inappropriate places. As Jeremy Paxman put it last year,
When a British prime minister puffs out his chest and declares he "will not tolerate" some African or Middle Eastern despot, he speaks not as a creature of a 21st-century political party in a dilapidated democracy but as the latest reincarnation of Castlereagh or Palmerston – somehow, British foreign policy has never shaken off a certain 19th-century swagger, and the implied suggestion that, if anything happens to a British citizen, a Royal Navy gunboat will be dispatched to menace the impertinent perpetrators.
Nonetheless, wherever the truth lies, there is no doubt that at the time there was a significant current, especially in elite opinion, which was unhappy with the change.
Change at Home
While Britain experienced these changes in its international standing, British society was in flux domestically. In the years prior to World War I, the classical liberal consensus which had prevailed in Britain was challenged by increasing pressure from increasingly strong labor and feminist movements, and the dispute over the status of Ireland. The outbreak of the war only deferred these issues, however, the conflicts over them continuing in such events as the successful breakaway of Ireland; the decline of the Liberal Party to third-party status, while the Labour Party formed its first government in 1924; and the General Strike of 1926.
During and after World War II the discrediting of the Conservative establishment by its mismanagement of the early part of the war (epitomized by Dunkirk, viewed at the time not as some glorious "miracle," but as a catastrophic defeat); the failures of conservative, free-market economic orthodoxy in the 1930s, and the successes of government planning during the war, on the home front as well as on the battlefield; the war's effect in creating a sense of national togetherness, idealism, and promise of "broad sunlit uplands" ahead; contributed to a vision of a more egalitarian post-war Britain as a goal of the conflict, epitomized by the wide support for the wartime Beveridge Report, which called for "a comprehensive policy of social progress."
In the July 1945 general election, the Conservative government of Winston Churchill was swept out of power, and replaced with a Labour government under Clement Attlee with a broad reform mandate. This extended beyond the treatment of full employment as a national economic priority, to labor-friendly legislation (like the Trade Disputes and Trade Unions Act of 1946, which widened the powers of organized labor), the establishment of a "cradle-to-grave" welfare state (with such measures as the National Insurance Act of 1946, and the launch of the National Health Service in 1948), and even the nationalization of key sectors of the economy (including the Bank of England, the coal, gas and steel industries, and the railways). These policies ultimately became the foundation of the "social market" or "welfare" capitalism that characterized the "postwar consensus."
As economic policy the consensus proved a significant success. Despite all the difficulties previously described, the post-World War II era was a period of rapid economic growth and rising living standards for Britons as a whole – of prosperity unexampled in the country's history – such that many historians have dubbed it a "golden age" for the nation. Nonetheless, like all compromises, it left many dissatisfied at both ends of the political spectrum. On the left were those who had hoped that the policies of the Labour government of 1945-1951 would prove just the beginning of a movement toward deeper change. On the right were those who felt these changes had already gone too far, or even that any movement in that direction was too much, opposed as they were to such "leveling" tendencies on principle. (Author Evelyn Waugh famously remarked that during the years of the Attlee government the country felt to him like it was "under enemy occupation.")
These economic and political changes were accompanied by socio-cultural changes, evident in everything from personal relations (from a new working-class assertiveness to the Sexual Revolution) to the arts (shaken up by everything from the arrival of the "Angry Young Men" in British letters to rock music). All of these interacted with the currents of opinion that lamented Britain's lost imperial status, many of those resentful about that loss seeing the domestic changes as a cause, pointing to such things as generation gaps and rising labor militancy as culprits. All this lent a new edge to the age-old lament of the privileged that the young and the lower social orders did not "know their place," and to that other age-old lament of anxious conservatives that "things are going wrong," though it seemed to many of them that all they could do was wax nostalgic for a romanticized version of the past.
After 1973
Britain's shift from world-spanning empire and international hegemon to "normal" country had largely run its course by the 1970s. Nonetheless, the decade saw other problems intensify as the post-war boom came to an end in Britain, just as it was doing in the rest of the world. The downturn saw increases in inflation, unemployment and fiscal strain, such that the country got a loan from the International Monetary Fund in 1976. The times took their toll on labor relations as unions clamored for pay increases in line with rising prices. Among some observers the response to these events verged on hysteria, exemplified by the rise of organizations like GB 75, intended to combat trade unionists if they got too far out of line, and by the rumored coup plot against Prime Minister Harold Wilson from within his own government. The decade's troubles culminated in the "winter of discontent" of 1978-1979.
The period's troubles also created a significant opportunity for right-wing opponents of the post-war consensus, with Prime Minister Margaret Thatcher making history with the implementation of a neoliberal economic program. This aimed at reviving the economy by making the country more business-friendly: privatizing state enterprises (like those nationalized under Attlee), deregulating business activity (as in the famous "Big Bang" deregulation of British financial markets), chipping away at the welfare state (as with the 1986 Social Security Act) and suppressing organized labor (as in the Employment Act of 1980, which targeted closed shops and picketing), along with monetarist policies aimed at controlling inflation.
The actual accomplishments of these policies were meager, their effectiveness at taming inflation and unemployment spotty at best. United Nations statistics on British GDP, when adjusted for population growth and inflation, show that even with the help of North Sea oil, growth in the eighteen years of Conservative rule that began with Thatcher (1979-1997) was only slightly better than what the country enjoyed in the much-maligned 1973-1979 period (1.9 percent, compared with the earlier 1.4 percent a year), and a far cry from its pace in the '50s and '60s. These years also saw the continued decay of British manufacturing, and along with it, a worsening balance of payments situation as sizable trade deficits in manufactured goods became a way of life.
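(For readers who want a sense of what those annual rates amount to cumulatively, the minimal Python sketch below simply compounds each figure over its own period. The 1.9 and 1.4 percent rates are the ones quoted above; the underlying UN series is not reproduced here, so this is only an illustration of the arithmetic, not a recalculation from the data.)

# Compound the annual growth rates cited above over their own periods.
# The rates come from the text; the underlying UN data are not reproduced here.

def cumulative_growth(annual_rate, years):
    """Total percentage growth from compounding an annual rate over a period."""
    return ((1.0 + annual_rate) ** years - 1.0) * 100.0

periods = {
    "1979-1997 (Conservative years)": (0.019, 1997 - 1979),
    "1973-1979 (pre-Thatcher years)": (0.014, 1979 - 1973),
}
for label, (rate, years) in periods.items():
    print(f"{label}: {cumulative_growth(rate, years):.1f}% cumulative over {years} years")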
The result is that such growth in Gross Domestic Product as did occur was hollow. Moreover, the harm such policies did the less affluent made Thatcherism deeply divisive in Britain, in a way that contemporary "Reaganomics" never quite was in the United States. However, this "post-postwar consensus" continued to prevail through the "New" Labour premierships of Tony Blair (1997-2007) and Gordon Brown (2007-2010). The 2008 economic downturn, which internationally contributed to some of the deepest questioning of neoliberal economics seen in a long time, instead pushed the country farther along this path, the British government responding to recession in the same way it did in the 1930s, with austerity measures.
The Thatcher era is also associated with a more assertive Britain on the world stage. Nonetheless, while British Prime Ministers sometimes spoke as if their country were still a superpower, the country's economic weight only continued to decline. Quickly as Britain grew in the post-war years, Japan, West Germany and France grew even faster, and all eventually overtook it in the area of economic output. Unsurprisingly, Britain was most prominent when acting in concert with the U.S., as when it allowed the U.S. to station intermediate-range nuclear forces in the country in the 1980s, cooperated in the American bombing of Libya in 1986, and committed more troops than any other of the U.S.'s NATO allies to Desert Shield and Desert Storm in 1990-1991. Even when Britain did pursue its own objectives, as in the 1982 Falklands War, it proved heavily reliant on the assistance of the United States (like its provision of the latest Sidewinder missiles for its Harriers, reportedly crucial to winning the air war over the islands).
The end of the Cold War, and the years that followed, saw the continuation of these trends, as the reunification of Germany and the closer integration of the European Union – a process from which Britain kept aloof – changed the distribution of economic power in its region. Globally, there was also a shift in world output and the balance of trade to East Asia, with the rise of Japan followed by China's emergence as an economic colossus, while India, Brazil and Russia have also, according to certain measures, overtaken Britain in the area of GDP. In the military sphere, defense cutbacks following the dissolution of the Soviet Union, and the 2008 economic crisis, have only further diminished Britain's capacity for long-range military action, as demonstrated in the Anglo-French intervention in Libya in 2011 – ultimately dependent on the backing of the U.S. for its success.
Today Britain still enjoys a number of great power trappings, like its permanent seat on the United Nations Security Council. The City of London remains an international financial center, in the view of some of comparable importance to Wall Street. Yet, Britain's strongest claim to global influence in the post-war era may actually be British culture and media, reflected in the status of the BBC and Reuters, the Beatles and Mr. Bean, and of course, James Bond himself.
Monday, July 16, 2018
Admiral Kirk and Captain Spock Ride The Bus
Recently I ran across Star Trek IV: The Voyage Home again--the bit where Kirk and Spock walk around twentieth century San Francisco and then ride the bus and have their run-in with the punk with the boom box. I was impressed by how much satire (remarking on how we treat each other, on what passes for literature in our era) was packed into this one little scene, and by how gracefully, how entertainingly, how charmingly it plays (for a scene with an obnoxious punk annoying a whole bus and giving our heroes the finger, anyway).
Star Trek IV was, of course, a summer blockbuster in its day. And recently it occurred to me--when was the last time that our bombastic summer blockbusters gave us a scene that worked in such a way? Simple, straightforward filmmaking (two people walking around, getting on a bus, talking), which said so much and amused so much at the same time, and proved so memorable? Certainly I can't remember any such thing in the recent Trek films. Anyone else able to think of one?
Empire, Spies and the Twentieth Century
Studying international relations, one finds certain subjects coming up again and again and again. Even when one approaches such things as the rise and fall of great powers, hegemony and hegemonic cycles, systemic wars, industrialization and deindustrialization on a principally theoretical level, Britain, and its rise and subsequent decline as an economic, military and imperial power, comes up again and again and again--and does so that much more when one takes an interest in concrete history, or considers the situation of the recent, present or future United States, in which so many have seen so many parallels.
Dealing with the spy fiction genre and its history, the same theme is almost inescapable. Much as we may think of spy fiction as a Cold War genre, the truth is that it was already well into middle age when the Cold War took center stage in international life, the thread that has really run through it from its beginning to the present being the passing of the British Empire, which already by 1900 had its peak behind it. Anxiety that that empire would pass, bitterness that it seemed to be passing or actually was passing, denial about what was happening, and criticism of all these attitudes, are already there at the very beginning, in such founding works as Erskine Childers' celebrated The Riddle of the Sands and William Le Queux's contemporaneous but much less celebrated Secrets of the Foreign Office. ("The name is Drew, Duckworth Drew.")
As both of these have been major interests of mine for decades, thinking about one matter naturally leads back to my thinking about the other, which is one reason why, when writing about the James Bond films and books, I spent a certain amount of time discussing the actual history of the period--more than most such writers, I think. When I published The Many Lives and Deaths of James Bond I included a thirteen-page essay devoted to just such historical background in the appendix (and included it again when the second edition of the book came out back in 2015). You can check it out here, reposted on this blog.
Of Starhunter
Starhunter appeared at the tail end of the '90s-era science fiction television boom, and came into that long-crowded field quietly, at least in the United States. Rather than a prime time slot on even a cable network, it went to syndication--and in my area, late night syndication. It ran for only one year before getting a major overhaul (it became Starhunter 2300, with the principal cast mostly changed), then lasted for only one year after that. Even in that period, when I followed the scene very closely and caught something of just about every show (in fact, watched just about all of just about every other small screen space opera), I not only didn't see it but was scarcely aware of its existence until it was already off the air, past which it didn't have much of a life in reruns (not in my area, anyway), and I only happened to see it years later on DVD. (Remember those?)
The slightness of the show's wiki (which, despite dating back to June 2009, has a mere 31 articles, most of them quite short) does not suggest much has changed since then.
All the same, a continuation of the show, Starhunter: Transformation, is reportedly in the works, and perhaps that is why the show has recently been remastered and reissued as Starhunter Redux, and why since May it has been airing in prime time reruns on Robert Rodriguez's TV network, El Rey.
Catching one of those reruns recently reminded me of another space opera that flew under most people's radars, Lexx. Like Starhunter, Lexx was an internationally financed Canadian project, and relatively low-budget, but there they parted ways. Despite the slenderness of the resources put up for it, Lexx, packed with colorful sets and bright CGI and surprisingly globe-trotting location shooting (from Namibia to Thailand to Iceland the crew went, and it's all up there on the screen), inclined toward the exotic, the weird, the extravagant, with the end of the universe in a gray goo-induced Big Crunch (and no, it wasn't just a dream, an alternate timeline or any other such lame cop-out, it really was the end of the universe) serving as a mere season finale.
By contrast Starhunter, while having some hints of something bigger going on in the background (and not always just the background, as the season two cliffhanger shows), tended toward the low-key and small-scale in its plots and its look from episode to episode, the space ships and stations and colonies utilitarian to the point of being bare bones, and most of the episodes taking place in their dark, dusty interiors, which matched the tendency toward the noirish and gritty in tone. Looking back on it I suspect this probably did not help it win over a broad audience, but it did set it apart from the generally flashier, splashier, zanier fare that characterized the genre then and now.
Reconsidering Philo-Fiction
Some years ago the philosopher Terence Blake raised the question of "philo-fiction," fiction which uses philosophy the way science fiction uses science, and whether we might see it come into its own as a genre. Of course, that raises the question of what we mean by "philosophy." My initial thought was that philo-fiction as he used the term (fiction where the fundamental rules of the universe differ deeply from our own) could be thought of as a subset of science fiction, and so, perhaps, we might.
My answer has changed since then. Fiction dependent on such a radical difference, it seems to me, is so demanding for the writer, and the reader, that it could probably never be very prolific--so that while we will probably keep seeing people try their hand at it every now and then, I don't see it becoming a full-blown genre, certainly not on the scale that science fiction achieved at its peak.
However, I have also found myself thinking about the matter of philo-fiction in another way, because I find myself ever less satisfied with the way we delineate "philosophy." After all, all intellectual investigation was known by that name, once. What happened was that a particular philosophical approach--old-fashioned induction and deduction, applied in a materialist, empirical way--was formalized by figures like Francis Bacon into the scientific method, after which it was known not as "philosophy" anymore, but as a separate enterprise. This has in fact gone so far that many, maybe even most, of today's scientists have little intellectual grasp of the premises of their life's work.
So has it also gone with investigation of the social world. Studying International Relations in college we were exposed to a considerable amount of Plato, Aristotle, Augustine, Aquinas, Hobbes, Spinoza, Locke, Kant, Hegel, St.-Simon and Marx. In time the IR student (especially if they go past the B.A. level) is expected to read at least some of the actual texts deemed most important, which amount to a healthy chunk of the philosophical canon. Yet one is unlikely to read anyone labeled a "philosopher" from after the mid-nineteenth century, not because people stopped doing philosophy relevant to the subject, but because the label attached to those philosophers of obvious and direct relevance to the field changed to "economist" or "sociologist," "political scientist" or "social scientist," on account of their use of a particular epistemological approach now called not "philosophy" but "science," held to be a very different thing.
Now we use "philosophy" to denote inquiry into epistemology and ethics and little else, just those things that "we haven't learned to treat scientifically yet," with the invidious comparison between the rigorous applier of the scientific method and the fuzzy, verbal, non-quantitative folks with the ever-smaller turf not at all subtle.1 I remember, for instance, a scene in Robert Sawyer's book Flashforward where his protagonist Lloyd Simcoe reacts contemptuously to a philosophy professor's remark on the titular event, which seems worth citing here.
Lloyd . . . found himself crumpling up the newspaper page and throwing it across his office. A philosophy professor! . . . Lloyd sighed. Couldn't they have gotten a scientist to address this issue? Someone who understands what really constitutes evidence? A philosophy professor. Give me a fucking break.
While I won't claim infallibility on this point, to my knowledge no one has remarked on the scene in any significant and public way--and it does not strain credulity that no one has, because this attitude seems so commonplace. But perhaps that is one reason why it seems to me all the more important to argue that, despite the relabeling, science did not stop being philosophy--perhaps the more so because of how invidious the comparisons between philosophy and science can get. Accordingly we may regard speculative fiction depicting, extrapolating from or simply playing with the theories, practice and knowledge gained by the sciences as "philo-fiction." And even if, in deference to the irrationality of prevailing usages, we regard only fiction more narrowly interested in epistemology or ethics as philo-fiction, science fiction has been doing that too: concerned as it has been with what we can know and how, and with what we ought to do about it, it can go by that name as well, at times more recognizably than at others, but by any plausible measure never rare or marginal, and by this point quite prolific. (I certainly would not deny that Isaac Asimov, in those early Robot stories, was dealing with philosophy, in "philo-fiction," even as he was producing some of what we think of as the hardest of hard Campbellian science fiction.) In fact, I will go further and say that, in this sense at least, science fiction has simply been philo-fiction all along.
1. I find myself thinking of other eighteenth century terms similarly narrowed--and impoverished. Take, for example, "manners," which has been reduced from culture to etiquette, or "education," which rather than a whole upbringing seems to mean formal academic training and that alone.
Friday, June 1, 2018
Reflections of a NewSpace Skeptic
NOTE: I penned this piece last summer but have only got around to publishing it now. While some of the details have since dated, it still seems to me worth sharing because its purpose, after all, is reflection on the trend of a decade, unaltered by subsequent events.
I recall many years ago running across a headline about Virgin Galactic's unveiling SpaceShip Two. Thinking that progress on the vehicle had gone a lot faster than I had heard, and expecting to see it rolling out of a hangar before the world press, I clicked on it—and saw instead a picture of a broadly grinning Richard Branson holding a model of what SpaceShip Two would eventually look like when it was built.
Of course, headlines commonly promise much more than they deliver. Still, the particular way in which they do so where technology stories are concerned is especially problematic. A company announces an R & D project, or even some scientist mentions a technical possibility—and the journalist in question presents the technology as if it were already developed, put on the market, and in satisfactory, practical, everyday use. This kind of writing is, unfortunately, evident all across the spectrum of cutting-edge technologies today, sometimes in publications whose editors ought to know better. (I recall one respectable popular science magazine that, six years ago, made it sound as if submarine communications systems were already encrypted with quantum keys.) To be sure, this kind of distorting coverage does not by itself prove that a particular line of research and development is all smoke and no fire.
All the same, following the news regarding space technology, and the space business generally, it seemed that from year to year we were hearing much more about boldly announced initiatives, or even the discussion of mere possibilities, than about evidence of concrete progress—and that expectations generally were overblown, especially given the combination of financial and technical hurdles to be overcome, especially in regard to the more genuinely exciting opportunities. (Space tourism was one thing, space-based solar power of significance on a macroeconomic scale a much more interesting but much more demanding thing.)
It seems to me that much of what has happened since has validated my skepticism. Virgin Galactic was, after all, supposed to take its first suborbital tourists up in 2007. Ten years on, neither Virgin nor any other private company has flown a single customer. Quite the contrary, the greatest success story for NewSpace in recent years has not been a matter of tourism, but the more established (and less glamorous) business of launching satellites, and that not by means of newfangled spaceplanes, but multi-stage rockets of the sort we have been using since the beginning of the space age.
The protagonist of this story, SpaceX's Falcon 9, has yet to attain even the reliability of existing boosters in completing its missions. Its success rate, recently appraised at 94 percent, compares unfavorably with the 99 percent success rate of the Atlas V, the 97 percent success rate of the much more heavily used Soyuz-U, or even that of the reusable Space Shuttle. The same goes for the success rate of its recovery for reuse, its achievement of aircraft-like turnaround times, and its holding up under repeated missions—thus far, the Falcon 9 has not bettered the track record of the much-derided space shuttle. Only the first stage of the rocket has been recoverable so far, and that just 80 percent of the time; its refurbishment for another flight recently took six months (again, no better than the shuttle, even post-Challenger); and even its most heavily used rocket remains a long way from matching Discovery's 39 safely completed flights.
Is this an entirely fair comparison? I freely admit that it is not. The Falcon 9 program, after all, remains not just a work in progress, but at an early point in its history compared with these other programs. Despite that, the price charged by the company makes it plausible to claim it is improving on the dollars-per-pound-to-orbit calculus (for the first time, a Western satellite-launcher might be approaching the $2,500 a pound-to-orbit mark, however one regards the terms), while its technique of rocket recovery, even if less complete or reliable than might be hoped, is clearly deserving of the history books. Still, it is worth remembering that the concrete progress we have seen did not happen anywhere near so quickly as some hoped. Moreover, the advances the company has made so far are more a matter of incremental improvement of familiar equipment where space vehicles are concerned, and of a turn to developments from outside spacecraft-building, narrowly conceived, like 3-D printing (now being used to produce rocket parts).
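(The dollars-per-pound figure is, of course, simple arithmetic once one picks a launch price and a payload mass, as in the little Python sketch below. The $62 million price and 11,000 kilogram payload in it are placeholder assumptions chosen only to land near the $2,500 mark discussed above, not official figures, so substitute whatever numbers you consider current.)

KG_TO_LB = 2.20462  # kilograms to pounds

def cost_per_pound(launch_price_usd, payload_kg):
    """Dollars per pound of payload delivered to orbit."""
    return launch_price_usd / (payload_kg * KG_TO_LB)

# Hypothetical example: a $62 million launch carrying 11,000 kg to low Earth
# orbit works out to a figure in the neighborhood of the $2,500-a-pound mark.
print(f"${cost_per_pound(62_000_000, 11_000):,.0f} per pound to orbit")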
I think it plausible and even likely that the company will continue to make headway in improving the rocket's performance according to all the relevant metrics—higher reliability in its launchings, more frequent and complete recovery, and more rapid refurbishment, which in turn may bring prices down appreciably. However, it seems likely that this will go on being a matter of incremental progress with familiar technology, and the broader improvement of production methods, and that the journey to $100 a pound-to-orbit will be measured in decades rather than years. Additionally, even the most guarded projection is a very different thing from a done deal—all this not having happened yet. However, I am much more optimistic about the efforts of SpaceX and the whole sector than I have been in a long time.
Wednesday, May 30, 2018
Is the Sci-Fi Blockbuster Frozen in the Early '80s?
Those who have paid any attention to film history know how high concept and the action/science fiction blockbuster arrived in Hollywood in the mid-'70s, and in the subsequent three decades virtually swallowed up the market. (Because so many of the suck-up poptimist critics immediately "Nuh-uh!" any such claim, I offer some hard numbers about this here that should turn that pathetic "nuh-uh" into a spluttering "But, but, but . . .")
Looking back, I'm struck by the extent to which not just this broad trend, but the specific franchises, date back to that time--and along with them, the common touchstones in discussion of science fiction film. Star Wars, Star Trek, Alien, Blade Runner, The Terminator. (Indeed, it often seems that our hack journalists can't have a single discussion of artificial intelligence or robotics without bringing up an Arnold Schwarzenegger B-movie from 1984. Am I really the only one who finds this pathetic and tiresome?)
This past weekend we got our tenth live-action Star Wars feature film, while the last three years saw brand-new, high-profile, big-budget installments in each and every one of these franchises: our thirteenth Star Trek feature film in 2016 (while the franchise has also returned to TV--I hear, in debased form as Discovery), our sixth (or, if you count the two Alien vs. Predator movies, eighth) Alien film last year, a Blade Runner sequel a few months after that, and a fifth Terminator movie back in 2015, with the promise that the franchise had finally come to an end quickly broken when Kathryn Bigelow's ex-husband put off the sequels to the much more profitable Avatar yet again to reteam with his other ex-wife and the septuagenarian former governor of California for what will apparently be an all-senior-citizen reboot of the movie they made back when the author of this post was too little even to know about R-rated movies.
Basically, it seems, Hollywood in that early, post-Star Wars boom period when high-concept and sci-fi action blockbusters were new amassed a certain number of properties and concepts that we can call "creative capital." To a great extent this side of its production has been living off that capital ever since, very little not connected with it in some way, coming from the same people, using its images, following in its footsteps. (Avatar, for instance, was a James Cameron production.) The principal exception would seem to be the comic book superhero-based blockbusters, which also had a crucial precedent in this period--the original Superman, somewhat ahead of its time, but with the slowness to tread the same path more than made up for in the enthusiasm that has left us up to our ears in movies based on comic books that are a half century old or older, with one Marvel movie barely leaving theaters before the next has arrived in them (indeed, the #1 position passed directly from Avengers 3 to Deadpool 2 this month, before passing again to Star Wars this weekend), and DC Comics failing to match Marvel but still taking a big bite out of the market in the process. (That disappointing Justice League movie was still the #10 hit of last year, while Wonder Woman, the champion of the previous summer, was #3.)
This is partly a testimony to how salable all this has been to a public extremely susceptible to brand names and nine-figure marketing budgets, and very tolerant of repetition of the same material, even the same CGI imagery, far, far past the point of diminishing returns--but also a testimony to the sheer determination to keep milking an old IP, as the flops show. According to the figures over at Box Office Mojo, the last really impressive commercial performance by an Alien movie was in 1986, when Cameron's Aliens was #7 in its year at the American box office, and a very big hit internationally as well. Alien 3 was only #28 in 1992, Alien: Resurrection #43 in 1997, Prometheus a better but still less than stellar #24 in 2012, and last year's Alien: Covenant just #42, with room for doubt about whether there was any real profit in it. Compared with the colossal success of Terminator 2: Judgment Day, Terminator 3: Rise of the Machines was also a less than stellar performer, while Terminators 4 and especially 5 were real disappointments. (Hence the reboot.) But to high concept-minded executives, hey, following up a string of underperformers or even outright flops with more of the same beats actually giving a new idea a chance. And government tax breaks, product placement, merchandising and foreign moviegoers to whom Hollywood's offerings are still more novel help them get away with this approach, by reducing their out-of-pocket expenses and their reliance on the readiness of those moviegoers who have seen it all before to fork over twenty bucks to sit in front of a big screen in 3-D glasses.
For now.
Thoughts on Alien: Covenant
WARNING: SPOILERS
I recently saw Alien: Covenant.
I didn't expect much. I got even less.
To be fair, the Alien series has not been a favorite of mine. I appreciate the place the first two films have in science fiction film history (and I did enjoy the second movie in particular, as a shoot 'em up from a period when that kind of thing was still fresh), but my interest in it was limited all the same, and it soon came to seem yet another instance of a series dragging on way, way too long mostly because Hollywood studios are so adamant about keeping every single established IP going for as long as possible.
Besides, I was dubious about the film's predecessor, Prometheus, for reasons Jonathan McCalmont described fairly well at the time of its release. It raised fairly commonplace, trite (to me, silly) questions, and then didn't try to answer them, instead giving us a plot that "is really nothing more than a series of doors slammed in characters' faces by a cruelly indifferent universe."1
Rather than "playful," it was hateful.
I grant that this sort of thing might--might--have meant something, once. But after three generations of smug postmodernist "subversion" (itself, really just a recycling of a tradition of misanthropy elites have promulgated for self-serving reasons going back to the ancients), do we really need more of this?
I say that we don't.
But I got a vague idea from some of the discussion of the movie that it had something to say about the mysteries--what the deal was with those alien "Engineers." So I gave it a chance.
Instead we got a typically pretentious opening in a huge white room with a grand piano in it (does no one else notice this cliche?), Billy Crudup's character whining about people of faith being discriminated against in an atheist world (that must be that "liberal Hollywood" at it again), and more Frankenstein complex inanity as yet another robot created in our image decides to turn on us, while once again actors looked terrified as pieces of rubber (or were they CGI?) jumped on them and members of the crew splashed blood all over the set--because instead of dropping the xenomorphs from the film, as was apparently discussed at one point (and as I would have preferred), the movie was mostly xenomorphs attacking people, all on the way to a final "twist" that even the dumbest viewer of such movies must have seen coming, prior to a pretentious close where David has to declare to us his own choice of soundtrack as he heads off to wreak interplanetary havoc.
After seeing this film I wasn't terribly surprised to see that Transcendence's Jack Paglen had been involved (Frankenstein, Frankenstein, Frankenstein, groan), and that John Logan had his hand in this too (groan again).
Allegedly the moviemakers involved with the next installment (this film wasn't such a moneymaker that such an installment could be taken for granted, but, hey, sequel) are again thinking about not spending so much time on the xenomorphs, but I wouldn't hold my breath for that. Simply recycling the stuff of a forty year old movie is a lot easier than actually doing anything seriously interesting. And I expect that when that movie inevitably comes out, having learned my lesson, I will take a pass on it.
1. I'd link back to McCalmont's original piece on the film, which is well worth a read, but apparently the trolls have driven him to turn Ruthless Culture "private." Yes, what you asswipes do does have consequences.
Remake, Remake and Remake Again
Hollywood has always been quick to remake movies. Astonishingly it made three versions of Dashiell Hammett's The Maltese Falcon between 1931 and 1941. (It's actually the third, John Huston-Humphrey Bogart movie we generally remember.)
Still, Hollywood was different then. The remakes were part of a far higher output of feature films, with a major studio like MGM putting out one movie a week. And film was seen as nearly disposable back then, a bit more like how we view TV than movies, the more so because rapid changes were seen as making older material unsalable. With the talkies, "no one" wanted silent movies, while color and widescreen changed the terms yet again. At the same time the more straitlaced Hays Code meant that a lot of older material made in a freer period was no longer screenable--while if you were going to screen something to which you would have to sell tickets in competition with brand new movies, why not have new stars in it when they were what people wanted to see? All this was reflected in, and itself reflected, the fact that the studios didn't work very hard to hold onto older material, much of it literally lost over the tumult of these decades, while the relaxation of censorship later meant that old stories which had been presented only in bowdlerized fashion could get more faithful adaptation. (This was, in fact, a justification for the flurry of remakes of noir classics in the '70s and early '80s--The Big Sleep, The Postman Always Rings Twice.)
None of that applies now. The output of feature films is limited in quantity, each film representing a bigger proportion of the whole. The medium is no longer going through such flux as it did earlier, and the same applies to the bounds of censorship. We no longer think of films as disposable--each and every one is treated as representing precious Intellectual Property, to be clasped tightly until the end of time. In fact, far from theaters competing for ticket sales because they are the only way one can see movies, TV, the Internet and the rest mean that audiences have never had cheaper, easier access to older movies. In the process, much of the justification for remakes has disappeared. Accordingly, it was possible to justify three Maltese Falcons across the '30s and early '40s in a way that it does not seem possible to justify not just three Spider-Man movies but three Spider-Man franchises in a single decade of the twenty-first century (2007-2017). That Hollywood insists on doing it anyway is solely a matter of a critical (or is it uncritical?) minimum of people being willing to come in and see the resulting product, as has undeniably been the case. In commercial terms, high concept remains a success. And so long as that remains the case, it too will remain with us, no matter how much film critics and cultural commentators complain.
E.H. Carr and William Haggard
One of the classics one becomes familiar with studying International Relations is E.H. Carr's The Twenty Years' Crisis, widely considered the starting point for modern "realist" thought.
Appropriately the book is no narrow discussion of billiard-ball-type politics among nations, considering a good deal else besides, one issue I find myself returning to every now and then being what Carr referred to as the problem of the "Intellectual and the Bureaucrat." The intellectual inclines to theory, reasoning, principles, and what is good and right, and it is from this that their tendency to be involved with radical movements derives.
The bureaucrat--the civil servant--by contrast, "recoils from written constitutions and solemn covenants, and lets himself be guided by precedent, by instinct, by feel for the right thing," a feel guided by experience that leads them to claim "an esoteric understanding of appropriate procedures . . . not accessible even to the most intelligent outsider," and the superiority of bureaucratic experience and training to the most brilliant intellect or refined theoretical understanding in these matters. And whether one sees this as self-serving, obscurantist nonsense or not, it carries serious political implications. That "practical practice" by which they set such store "easily degenerates into the rigid and empty formalism of the mandarin," with "politics an end in themselves," adding to the implications inherent in their position. More than just about "any other class of the community," the bureaucrat is "bound up with the existing order, the maintenance of tradition, and . . . precedent as the 'safe' criterion of action."
Recently recalling Carr's comments on this "antithesis" I found myself thinking of William Haggard's Colonel Russell novels. I can think of no other works of spy fiction that treat the opposition between the two types so directly or extensively, nor of any that, in treating that opposition, are so vehement in taking a side.
Where political life is concerned, Haggard is as hostile to intellectuals as anyone you might care to name. In Slow Burn he had the following to say of scientists:
Take a clever boy . . . and put him into a laboratory for the next seven or eight years. What emerged inevitably was a materialist . . . a man who would assume without question that the methods of science could be applied to human societies.

In short, in the eyes of this particular right-winger, they were a bunch of damned crypto-Communists, all too likely to turn traitor. And indeed, one of them (not just a leftist, but "no gentleman" either) did prove to be the traitor Russell spent the novel ferreting out.
In The Power House, Haggard bashes a type of intellectual to which one would expect him to be even more hostile--not the physicist who dares have an opinion about politics, but those whose principal concern is the social, economic and political order--in his depiction of the hapless Labour MP Victor Demuth. Russell holds the attitudes of this "fossil," espousing "doctrines as archaic for a modern left-leaning party as the Divine Right of Kings was now archaic to the Right," to be the mark of deep personal failure, resulting from deep defects of what a certain sort of pompous person would call "character." Despite a background of great privilege, which combined with a genuine intelligence and determination "equipped [him] to compete at any level he'd cared to aim at," the Prime Ministership included, a lack of confidence and an inclination to "flinch from conflict" made Demuth "slip . . . into the security of protest" instead.1 Ineffectual protest, because, as another character in the novel reflects, "The Time of the Left would come perhaps, but it wouldn't be . . . the intellectuals, the professional washed-out rebels, but ruthless and determined men" who made it happen--ruthless and determined men who, whatever else they happened to be, would not be mere intellectuals.
By contrast Haggard's hero, Colonel Russell, is the consummate civil servant, and not merely by virtue of his title or pay grade, but by his being an administrator who, unlike most spy chiefs in spy novels, actually administers, and plays his main part in the story by administering. However many times Fleming calls Bond a civil servant, what we usually see is Bond playing commando. And even as he ascended to a fairly senior level in Central Intelligence, Jack Ryan's adventures tended to have him caught up in heroics of some kind or another--in Clear and Present Danger this Acting Deputy Director of the CIA personally flies down to Colombia in a Pave Low and mans a minigun with which he mows down drug cartel soldiers in the course of rescuing an American special forces team inserted into the country. (I repeat: a Deputy Director shooting lots and lots of people with a Gatling-type machine gun as if he were Arnold Schwarzenegger.) Such black bag work and gunplay as the Haggard stories offer, however, Russell leaves to others, while he navigates the system, and in doing so "saves the day" in dramas that, like no others, celebrate the Bureaucrat as Hero.
Of course, in continuing his series over the next few decades Haggard did eventually start playing up the action, with Yesterday's Enemy an example of this. Still, this is how they started, and tended to run. In Yesterday's, certainly, what drew Russell into his more conventional spy adventure was in fact his old, legendary reputation as Super-Bureaucrat Extraordinaire.
In hindsight, it is an additional way in which these largely forgotten books were and remain unique within this genre.
1. To round out the right-wing cliche, we also see the upper-class leftist portrayed as a snob and a bigot, disliking his niece's suitor, allegedly, for his being in the casino industry, unintellectual, Catholic and therefore "a reactionary fascist beast."
Monday, May 28, 2018
Notes on High Concept: Movies and Marketing in Hollywood, by Justin Wyatt
Picking up Justin Wyatt's High Concept I (like most people who pay much attention to this sort of thing these days, I suppose) already had a fairly good idea of what the term denotes. As Wyatt explains it, in great depth and yet concisely, it is a "straightforward, easily communicated and easily comprehended" narrative whose themes and "appeal" to a broad audience are "immediately obvious" (8). It is the project that sounds good to an executive in a 25-word "elevator pitch," the one they can be persuaded at least has the potential to look good in a 30-second TV spot or even a poster.
Communication and comprehension come easiest in the case of a "pre-sold property" with a "built-in audience," like a sequel to a prior hit--not only because there is a proven past success, but because the job of selling the product has already been done, and all one has to do is remind the audience of it. It helps, too, for the film to have other pre-sold features--"bankable" stars (however shaky this concept is), a soundtrack capitalizing on already popular hits (sales of which are, in turn, helped by the movie), a compelling look that is itself the subject of the sale, integrated with and even overwhelming the narrative. (Fast! Flashy! Sleek! Ultra-modern! Maybe there's nothing much to "see" here, but you can't stop "looking" at it, can you?)
A high-concept movie is a movie that looks good in a commercial, or a promotional music video, because in contrast with a classically made movie it is essentially a very long commercial or music video, in part because it was probably made by a director whose background, at any rate, is in directing commercials or music videos. (Wyatt cites Adrian Lyne, and the Scott brothers Ridley and Tony, and one can spend a long time listing those who have entered filmmaking in similar fashion since--David Fincher and Michael Bay and Simon West and Alex Proyas and Spike Jonze and Dominic Sena and Antoine Fuqua and McG and Gore Verbinski and Zack Snyder and Tarsem Singh and and and . . . while by this point such directors have so long dominated the medium that even a director who learned their craft actually studying movies probably can't help being influenced by their practice.)
Still, Wyatt's book did have some surprises for me, the biggest of which was the range of cinematic concepts to which he saw this as being applicable. Looking at the cover's array of images from major films of the '70s and '80s I am unsurprised to recognize a shot from Jaws. But I am surprised to see shots from Flashdance and Saturday Night Fever above Jaws. Where the actual text of the book is concerned, Wyatt begins not with Steven Spielberg and George Lucas, but Grease (and specifically, a comparison of that critical flop and commercial success with its diametrical opposite, All That Jazz).
And this is, I think, a reflection of profound change in the business over the years. We think of the late '70s as the era which saw the rise of the blockbuster, the '80s as the one that saw the rise of the Hollywood action movie as a staple of the market. Still, the action movie was a comparatively small part of the market back then. Where in the '80s the top ten films of the year at the North American box office were apt to include a couple of action films, in the twenty-first century the figure was more likely to be five movies--half the list--and often more. At the same time the rest of the top ten list was apt to be dominated by a kind of movie which did not make Wyatt's cover at all--the big-budget, Disney or Shrek-style animated movie, which in the 2010s has averaged three of the top ten annually. Which means that between one and the other, they have accounted for eight of the top ten hits year in, year out, with the other movies on the list likely to be of closely related types (the live-action version of the animated Cinderella, for instance). And since action movies and cartoons are what you make if you want a blockbuster, they comprise a much larger share of the market overall than any other one or two such distinctive styles of film ever have before.
As a result, one would not think of many other kinds of movie as high concept (even if musicals, for example, are occasionally popular and profitable). However, as Wyatt shows through a much deeper development of the concept of a film as an ad than I anticipated, these movies were not just ad-like in their aesthetic or feel, but were ads for a "lifestyle." (Beverly Hills Cop was about the fantasy of what it is to be rich in "Beverly Hills" as much as it was about the adventure of the "Cop.") I do not think that this is quite as prominent in film today, the use of "lifestyle" in it having changed. Certainly luxury is common currency in today's commercial filmmaking, anything remotely resembling actual middle-class life or working-class life or poverty generally banished from the screen, but a movie, while expected to depict an "attractive lifestyle," gets a lot less mileage out of doing so, enough so that a movie principally selling lifestyle seems unlikely to earn $1 billion in ticket sales and surcharges. (Tony Stark's high-end consumption in the Marvel movies is part of the package, but it is backdrop, relatively less important to the movie than it would have been in a comparable film from the '80s, which would have luxuriated in it all a great deal more, the more noticeably amid the slighter special effects and slower cutting.)
Still, if Wyatt's profuse discussion of blockbuster as lifestyle ad seems a bit dated now, his discussion retains a relevance for other aspects of pop culture, like music--exemplified by Ted Gioia's observation that, in the course of its swallowing up everything else, "Music Criticism Has Degenerated into Lifestyle Reporting."
As Gioia remarks (finally, other people noticing this!), "During the entire year 1967, The Chicago Tribune only employed the word 'lifestyle' seven times," but today "newspapers have full-time lifestyle editors," while coverage of all the rest of life from weather to business is construed in lifestyle terms ("your commute," "your money"), and certainly, pop culture, with music "treated as one more lifestyle accessory, no different from a stylish smartphone" and "music journalism . . . retreated into a permanent TMZ-zone." Indeed,
if you force pop culture insiders to be as precise as possible in articulating the reasons why they favor a band or a singer, it almost always boils down to: "I like [fill in the name] because they make me feel good about my lifestyle."

A still bigger irony would be if most consumers of music actually thought in the same way. After all, they don't have lifestyles to feel good or bad about. They simply can't afford lifestyles.
But that reality doesn't save them from the delusion that they do.
Groan, groan and groan again.
Finally, Some Grounds for Optimism?
NOTE: I penned this piece last summer but have only got around to publishing it now. While some of the details have since dated, it still seems to me worth sharing because its purpose, after all, is reflection on the trend of a decade--the decade between the 2006-2008 period (when I was researching and writing a great deal about energy issues) and the present--and that reflection still seems relevant.
A decade ago when considering the problem of fossil fuel scarcity and the prospects for alternatives as a way to fill the gap, I was consistently struck by the fact that fossil fuels were only cheap because of the externalization of so many of their environmental and other costs, and on top of this, consistent, massive state support. I was struck, too, by how despite these ways of lowering their apparent cost, the trend was in the direction of their deceptively low market price rising anyway. At the same time it was impossible not to notice how much less support renewable energy had had by comparison, the steady progress many forms of renewable energy were making in terms of price and energy return on investment in spite of this lack of support, and the sheer range of plausible concepts that held out the hope of far better results (at least some of which might amount to something)—all of these indicating enormous untapped potential.
Between the rise in the cost of fossil fuels on the one hand (especially when one looked past the obfuscations of market prices), and the increasing productivity and cheapness of at least some forms of renewable energy on the other, it seemed likely that the latter would become competitive with the former. It also seemed that this could be greatly accelerated if states were to shift their support from fossil fuels to renewables and strive for energy efficiency in a focused, massive program, the reduction of energy consumption involved bringing the target of 100 percent renewable energy production within easier reach.
I did not assume that the 2003-2008 oil price shock represented a new plateau, but I also did not expect that prices would stay as low as they have since then. (That the average annual price of a barrel of oil has, for almost a decade, not gone above $100 again, and that the price touched $27 in 2016 in 2008 dollars, is a surprise to me.) At the same time, the switchover of state support from fossil fuels to renewables has simply not happened; by many measures the opposite has happened in the United States and elsewhere, with the subsidies for fossil fuels actually growing more lavish. Meanwhile progress on energy efficiency has been underwhelming, especially when one looks beyond the more simplistic and misleading measures.
Still, the installation of renewable energy-based electrical production capacity proceeded much more rapidly these past several years than it had before the crisis, in the process lowering costs. In 2016 the headline was that electricity produced by photovoltaic solar cells had become cheaper than electricity generated by the coal-fired plants that were until recently the cheapest of sources. To be sure, this is not yet the case everywhere in the world, under all business conditions, according to every method of crunching the numbers—as career detractors of renewable energy never miss a chance to remind everyone. Nonetheless, many a reading of the market indicates that in much of the world, and certainly much of the Western world (the central and southern U.S., for example), a new solar energy-based power plant can viably produce cheaper electricity over its lifetime than a new coal-fired one. And the math is robust enough that the change is touted not merely by environmentalists given to advocating renewables because of their ecological benefits, but by thoroughly mainstream business news outlets that, if anything, are prone to the opposite bias, like Bloomberg and Fortune.
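For readers who like to see the arithmetic behind such claims, the little sketch below works through a levelized cost of electricity comparison of the sort those headlines rest on. To be clear, every input in it (capital cost, operating and fuel costs, capacity factors, discount rate, plant lifetimes) is a placeholder of my own choosing, meant only to show how the calculation runs, not a figure from Bloomberg, Fortune or anyone else.

    # Illustrative only: a back-of-the-envelope levelized cost of electricity (LCOE)
    # comparison. All inputs below are assumed placeholders, not cited data.

    def lcoe(capex_per_kw, fixed_om_per_kw_yr, fuel_var_om_per_mwh,
             capacity_factor, discount_rate, lifetime_yrs):
        """Discounted lifetime costs divided by discounted lifetime output, in $/MWh."""
        mwh_per_kw_yr = 8760 * capacity_factor / 1000.0  # annual output per kW of capacity
        costs, output = float(capex_per_kw), 0.0
        for year in range(1, lifetime_yrs + 1):
            d = (1 + discount_rate) ** year
            costs += (fixed_om_per_kw_yr + fuel_var_om_per_mwh * mwh_per_kw_yr) / d
            output += mwh_per_kw_yr / d
        return costs / output

    # Hypothetical new-build plants, with made-up but plausibly scaled inputs:
    solar = lcoe(1100, 15, 0, capacity_factor=0.25, discount_rate=0.07, lifetime_yrs=30)
    coal = lcoe(3500, 40, 25, capacity_factor=0.60, discount_rate=0.07, lifetime_yrs=40)
    print("Solar ~$%.0f/MWh vs. coal ~$%.0f/MWh" % (solar, coal))

Under these particular assumptions the new solar plant comes out cheaper per megawatt-hour than the new coal plant; change the assumptions and the answer changes, which is exactly why the detractors and the boosters can both point to "the numbers."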
Especially given the unpromising market conditions and policy circumstances, the world attained this milestone rather faster than I expected. And unlike the great majority of what is reported about energy and the environment, this was welcome news—and a significant boost to optimism about rebuilding the world's energy base along more sustainable lines. Certainly it has demonstrated the enormous, untapped potential of a technology derided and dismissed by so much mainstream opinion. (Simply put, Goldman Sachs was wrong and Greenpeace was right.)
Additionally, it seems very likely that this trend will continue through subsequent years, to the advantage of photovoltaics over fossil fuels. Not only is it expected that within the next decade or so its price advantage will become a global norm; it is widely expected that, even without rises in the price of fossil fuels, that advantage will go on deepening. Bloomberg New Energy Finance (BNEF), which has tended to be conservative in its past estimates, forecasts that the cost of solar-generated electricity will fall to a third of its present level by 2040. According to one projection, well before that date, perhaps in the early 2030s, energy companies will find it cost-effective not just to opt for solar over coal when building new plants from scratch, but to actually junk an old coal plant on which they are only paying operating costs in favor of building a new solar plant. Another, still more aggressive projection, by Ray Kurzweil, has solar energy providing 100 percent of the world's electricity by then, simply by sustaining its recently observed rate of growth in its share of the energy mix for a mere decade.
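The arithmetic behind that last projection is nothing more exotic than compound growth, as the sketch below illustrates; the starting share and the two-year doubling time in it are illustrative assumptions of mine rather than Kurzweil's published figures.

    # A share that doubles on a fixed cycle reaches 100 percent surprisingly fast.
    # The starting share (1.5 percent) and two-year doubling time are assumptions.

    def years_to_full_share(start_share, doubling_time_yrs):
        share, years = start_share, 0
        while share < 1.0:
            share *= 2
            years += doubling_time_yrs
        return years

    print(years_to_full_share(0.015, 2))  # prints 14, i.e. about a decade and a half

Whether the growth actually stays exponential all the way to saturation is, of course, the entire question.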
Given that no one seriously expects solar to carry the burden alone, that long-established hydroelectric power already contributes over a tenth of what the world uses, and that comparably cheap wind installations are also making rapid progress (up 13 percent in 2016 over the prior year according to the Global Wind Energy Council), the date at which all electricity is generated by renewables would come well before the arrival of 100 percent solar. And, in contrast with many of the forecasts for which Kurzweil is so well-known, he is far from alone here, with our producing 80 percent of our electricity from renewables by 2030 now a subject of serious debate.
Of course, in considering such expectations, certain caveats have to be remembered. The most significant is that even if the expansion of renewable energy production has been very rapid, the world is still dependent on fossil fuels for five-sixths of its energy. Additionally, of the one-sixth derived from renewables, long-established hydroelectric energy is still the principal contributor (supplying two-thirds). By this measure, not very much has changed from before. It might be added that the recent woes of the coal industry have been due not to the drop in the price of solar or other renewables, but to the expansion of that last hope of the fossil fuel industry, natural gas. Meanwhile coal production and consumption appear to be bouncing back after the drop of the prior year, not only in the United States, but in China and India as well.
Additionally, some parts of the transition will be more complicated than a straight extrapolation from observed growth rates suggests. It remains far easier to phase out a coal-fired electric plant in favor of photovoltaics or wind turbines than to shift to renewable energy-based vehicle fleets, even looking just at ground transport, to say nothing of the more difficult matters of ships and aircraft. (Sales of electric private cars, though fast-growing, are still a very small share of new vehicle sales, at the same time that the American taste for large cars has gone global.) The efforts to develop alternatives to fuel oil, like algal biofuels, sadly cannot point to equally dramatic progress, while simple math demonstrates that even should we fully electrify transport, it will mean that much more demand for electricity, delaying the point at which we get 100 percent of our electricity from renewables. (Indeed, as the situation stands it is worth remarking that the consumption of oil as well as coal has been rebounding recently in the United States and elsewhere.)
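To make that "simple math" concrete, a toy calculation follows; the energy shares and the efficiency advantage of electric drivetrains in it are round numbers assumed purely for illustration.

    # Electrifying transport raises electricity demand. The shares and the 3-to-1
    # efficiency gain of electric drivetrains are assumed round numbers, not data.

    electricity_share = 20.0    # electricity as a percentage of final energy (assumed)
    transport_share = 30.0      # transport's share of final energy (the essay: under a third)
    ev_efficiency_gain = 3.0    # electric drive assumed to need roughly a third the energy

    added = transport_share / ev_efficiency_gain
    print("Electricity demand rises from about %.0f%% to %.0f%% of today's final energy,"
          % (electricity_share, electricity_share + added))
    print("an increase of roughly %.0f%%." % (100 * added / electricity_share))

Even with the generous efficiency gain assumed here, electricity demand grows by something like half, which is that much more renewable capacity to build before the 100 percent mark is reached.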
Moreover, the possibility that the transition will be resisted politically rather than just in the marketplace has to be taken seriously, given the dismal record of governments during the four decades since the energy crisis of the 1970s forced the question. Certainly the career fossil fuel-boosters and renewables-bashers continue to blanket the media with specious arguments about the unworkability of renewables on a large scale: dubiously calculated figures on Energy Return On Investment, straw man arguments about grid unreliability, sanctimonious ranting and raving against any and all instances of government support for renewable energy (by those who turn a blind eye to Big Oil's vastly larger support), and silly charges that from an ecological standpoint renewables can only ever be a curse "worse than the disease." (And of course, accompanying all this are plenty of Cornucopian promises of fossil fuel superabundance and climate change denial, directed against the two most significant non-price arguments for a shift away from fossil fuels.) Even if it increasingly looks like a rear-guard action, the extent to which such an action can slow down the desperately needed transition ought not to be slighted, given how much longer it has taken to reach this point than the more forward-thinking hoped in the 1970s. Indeed, the recent direction of policy in several of the principal energy-consuming nations has been less than encouraging, and this goes even for the nations which have so often been looked to for leadership in the area of energy and climate, particularly Germany and China.
All the same, the current press regarding Big Oil's hopes for a "golden age" of natural gas carries with it considerable qualifications and doubts, not merely about its grossly exaggerated attractiveness from a greenhouse gas emissions standpoint, but even about its price-competitiveness. At the same time many observers suggest that this year's spike in coal prices and output is a temporary, short-term shift that ought not to distract from the long downward trend (peak coal possibly being behind us already).
Additionally, thorny as the problem of transport may be, transportation accounts for less than a third of energy consumption, such that slower progress here does not eliminate the reality of enormous gains if electricity production is effectively shifted to renewables. And even the hints of backtracking in national policy have been judged by many to be of less enduring significance than alleged—with even the more bullish predictions regarding the coal use of China and India appearing less inconsistent with their longer-run commitment to reduced fossil fuel use and carbon emissions when their policies are examined comprehensively.
Still, even the most optimistic reading of the situation underlines how complacent and wrong-headed it would be to trust merely to the progress of one or two technologies to wholly transform the energy base. It would be complacent also to trust to market forces—or more accurately, the current combination of market forces and policymaking. Instead there is a need for ambitious policies at the local, national, regional and global levels committing governments to locking in and accelerating the deployment of renewable energy production of all types to the greatest extent possible, not simply by encouraging the expanded installation of photovoltaic solar and ground-based and offshore wind, but by investigating and developing the fuller range of options in this area. The next generation of solar cells (thin-film cells, etc.), holding out the prospect of greater efficiencies, also holds out the possibility of lowering costs and enlarging capacity still more rapidly than the BNEF analysis suggests, while new ways of deploying these technologies that may expand capacity and reliability (like airborne solar and wind generation) are also well worth examination. The same goes for other sources scarcely exploited to date (tidal energy, wave energy), while algal and comparable biofuels remain worthy of continued interest.
These policies should facilitate the fullest possible use of the energy produced, with power generation more widely distributed at one end (with net-zero and net-positive buildings as the goal), and grids more integrated at the other (perhaps working toward a resilient global smart grid), while improving storage capability through support for the development of more cost-effective batteries and biofuels. They should devote attention to the special problems of transport based on renewables. (Equipping those most voracious of oil-burners, large commercial ships, with SkySails-style kites could be just the beginning.) They should encourage a more efficient use of energy at the demand end, interest in which seems to have sadly declined after the post-2008 oil price drop, despite unending demonstrations of the capacity of properly designed goods from housing to vehicles to electronics to provide meaningful savings in energy consumption without compromising (and even improving) economic productivity or living standards. (While more novel, more dynamic and therefore less certain, the same can be said for changed approaches to the production and delivery of goods and services, from telecommuting to 3-D printing to cellular agriculture.) And of course, the more affluent and technologically developed nations should do all in their power to promote and facilitate the implementation of these technologies in those poorer and less developed nations still building their infrastructures.
Indeed, robust policies promoting the full gamut of potential contributors to the solution, developing the options with which too little has been done, and searching out the possibilities scarcely thought of now are likely to be essential to turning the target of a 100 percent transition to renewables inside the next generation from a pious wish into a reality—a goal all the more desirable given how even the most rapid progress envisaged by today's optimists is still less than what the climate crisis demands.