Tuesday, March 21, 2017

What Ever Happened to the Japanese Microchip Industry?

David E. Sanger wrote in the New York Times in 1986 that the
American electronics industry . . . has virtually lost the entire memory market to low-cost Japanese competitors . . . Dozens of American manufacturers have fled the commodity memory chip business, unable to match Japan's remarkable manufacturing efficiencies or constant price cutting . . . By most estimates, Japanese manufacturers have seized a remarkable 85 percent of the market for the current generation of chips . . .
Moreover, Japan's lead showed signs of widening. By 1988 the country accounted for over half (51 percent) of the world's microchip production, with the top three makers (NEC, Hitachi, Toshiba) all Japanese. At the end of the decade Japanese chip makers were expected to put 16-megabit chips onto the market while U.S. companies were just getting around to making 4-megabit ones (two doublings behind), and Japan was the first to unveil prototypes of far more advanced chips still--the 64-megabit chip in 1990 and the 256-megabit in 1992 (64 times, or 6 doublings, as powerful as the 4-megabit chip standard at the time)--events which got considerable press coverage. Indeed, Japan's position in the sector was regarded as emblematic of its industrial prowess, and even a basis for claims of superpower status.1

Such talk evaporated during the '90s, and seems virtually forgotten today.

What happened?

The conventional wisdom chalks it up to the idea that the Japanese companies had never been doing so well as they were given credit for, nor the American ones so badly--that Japanese business had gotten about as far as its practices could take it and was too rigid to change, in contrast with a freewheeling, endlessly self-reinventing America, epitomized by that subject of endless libertarian paeans, the Silicon Valley start-up.

However, while American firms rallied (with Intel the outstanding success story) and Japanese firms made mistakes in coping with the resulting, more competitive market (like the flawed restructuring efforts that spun off firms too undercapitalized to compete), the reality is more complex. Such a disproportionate share of any market as Japan enjoyed in the late '80s generally tends to be fleeting--especially when the product is evolving rapidly, and the market rapidly growing. Both of these considerations applied here, as rapidly growing chip capabilities meant rapid changes in the productive plant, and chip consumption rose exponentially--making advantage temporary, and creating opportunity for those looking to grab a piece of the action.

There was, too, the question of how Japanese chipmakers came to enjoy their extraordinary '80s-era position in the first place--not only through the quality of their chips, but through the success of their most important customers in their own lines. These happened to be Japanese makers of consumer electronics (like Sony), who held the dominant global position in their own sectors in those years--which, since they bought their chips from a few Japanese firms, made squarely orienting themselves to those customers' chip needs a plausible strategy for those industrial giants. However, as the share of the world consumer electronics market these companies enjoyed declined, so did the share of the potential customer base for microchips they constituted--just as new, lower-cost chip producers entered the world market (like South Korea's Samsung).

Still, for all the changes, and the missteps, Japan remains a significant producer of chips today--accounting for 14 percent of the world total in 2013, behind only the United States and South Korea.

Additionally, there is the matter of the market for the inputs needed to make the chips: the raw materials from which the chips are made (like silicon wafers), and the manufacturing equipment needed to print circuits on them (like photolithography equipment). Japan's production accounts for an extraordinary 50 percent of the former, and 30 percent of the latter. While less often publicized, such totals in these difficult, exacting areas are arguably even more impressive testaments to Japan's manufacturing prowess than the chip sales of three decades ago.

1. Shintaro Ishihara famously declared in his book The Japan That Can Say No that Japan's lead in the technology put the nuclear balance in its hands, because that nation alone could make the chips needed for accurate nuclear warhead guidance, while the victory of U.S. forces in the 1991 Gulf War was hailed as actually a triumph of Japanese technology, because there were Japanese components guiding its weapons.

James Bond for the YA Crowd?

It has long been impossible to pay much attention to contemporary science fiction and fantasy and not be hugely aware of the presence of young adult fiction within it. The historic commercial success of such franchises as Harry Potter, Twilight, Percy Jackson, The Hunger Games and Divergent (among others) has loomed large within not just the genre landscape, but popular culture as a whole. Indeed, Game of Thrones apart, virtually every really major publishing success of the genre has been a YA phenomenon. And while the esteem for such works among the hard core of science fiction and fantasy fans, and especially the "higherbrow" among them, has not been on a par with their commercial success, YA has attracted the attention of such critical darlings as Cory Doctorow, Paul Di Filippo and Scott Westerfeld, each writing heavily in this area in recent years, and even getting some critical recognition for the results, as with the Hugo nomination Doctorow got for Little Brother.

The same cannot be said of other genres. Indeed, science fiction and fantasy have done as well as they have in YA because of the lingering prejudice that they are kids' stuff anyway, and ironically, because of the way in which the adult stuff has become so adult--so involved, so dense, so literary, often all at the same time--that someone who has not been a longtime reader of contemporary science fiction and fantasy has a hard time getting into it, or even getting it at all. (That a book like The Hunger Games is so apt to seem derivative, undemanding and unimpressive to someone who reads full-blown literary science fiction for grown-ups is an asset, not a liability, in the marketplace.)

By contrast, a YA thriller is necessarily a toned-down thriller--which does not mean that it cannot be entertaining, but it must eschew the easier ways of achieving its effects, exercising restraint in handling violence and other, rougher fare. All the same, writers do write thrillers aimed at the YA market, and these have long included spin-offs of 007--going back at least to R.D. Mascott's The Adventures of James Bond Junior 003 1/2 (1967)--as well as imitations, one of the more successful of which has been the Alex Rider novels of Anthony Horowitz, of which there are presently ten in print, with an eleventh reportedly on the way this year.

Personally I have taken little interest in these efforts, giving them only cursory attention even while tracking down and reading every one of the regular continuation novels. Still, having recently reviewed Horowitz's Bond continuation novel Trigger Mortis (2015), I decided to give the first Alex Rider book, Stormbreaker, a look--not because I thought he had done anything really new with the concept (the tradition was already well-worn when Fleming came up with Bond, himself just an update of a half century of clubland heroes), but because I was curious as to how he would cram Bondian adventure into the life of a young adult, and whether very much of such adventure would remain when he was done. You can read my thoughts on that book here.

Thursday, February 23, 2017

Review: Power-Up: How Japanese Video Games Gave the World an Extra Life, by Chris Kohler

Indianapolis, IN: BradyGames, 2004, pp. 312.

Chris Kohler's book, as the title promises, is concerned with the revolution wrought by Japanese video game makers between the 1970s and the 1990s, in which Nintendo and its most famous titles (from Donkey Kong to Pokemon) loomed so large. The subject is complex enough that, rather than attempting to offer a relatively linear history of the field, the book goes through it subject by subject, generally with good results. Kohler displays a robust interest in the formal aspects of the games--their appearance, structure, modes of play and storytelling--and explains these incisively, both in cases of specific, earlier pioneering games, and the larger history of the form. This extends to minute analysis of classics from Donkey Kong to the original Final Fantasy with the aid of numerous screen captures arranged in flow chart-like fashion, affording not just something of a lesson in the Poetics of Video Gaming, but enabling us to look at these old games with fresh eyes.

In doing justice to his subject matter Kohler's discussion extends well beyond gaming to its interconnections with manga, anime, pop music and other corners of Japanese pop culture--which did so much to make Japanese video games what they were. The creative stars were not techies, but had interests and backgrounds in the visual arts and storytelling media (like manga), as was the case with Pac-Man creator Toru Iwatani; Donkey Kong, Mario and Zelda creator Shigeru Miyamoto; and Dragon Quest creator Yuji Horii. And this was crucial to the contributions they made to their field--the appealing characters, narratives, visuals, music and gameplay experience, and the associated techniques and refinements (like cinematic cut scenes) that lifted the genre above the minimalist sports and shooting games that defined the early, more narrowly programmer-driven history of the field. This extends even to glances at intriguing but oft-overlooked games like Shigesato Itoi's RPG Mother and Fumito Ueda's ICO. There is also a good deal of material on how the product had to be translated and localized for sale in foreign markets.

It might be added that Kohler's book contains enough material on the history of the business to be interesting as a recounting of the development of the information technology sector--the more so for how different this recounting is from the Silicon Valley mythology to which discussion of the American gaming industry (all too predictably) conforms. Start-ups by computer programmers (like INIS) do have their part in the story. However, the businesses that played the central role were established companies in other fields that took a chance on the new sector (Nintendo, for example, was an almost century-old manufacturer of playing cards), and the stars of the story, artists though they were, were Company Men rather than entrepreneurs striking out on their own.

Moreover, while Kohler's book has undoubtedly dated in its coverage of a relatively fast-changing field, much of what it says still seems relevant--in particular, its account of Nintendo's accent on sheer fun (which in the years since stood the Wii in good stead, relative to the high cost and hardcore gaming orientation of other consoles). Additionally, Power-Up's early 2000s release date affords a certain amount of perspective on the major changes in gaming since then, namely the decline of the standing of Japanese firms relative to Western ones in the field; the claims that Nintendo may have gone from cutting-edge to backward-looking; and the possibility that this is just part of a still larger story, specifically the transition of gaming away from consoles to online and mobile devices.

That said, it could be argued that, important as Nintendo is in this story, the discussion may be a bit too Nintendo-centric to be taken as a history of Japanese gaming. (Despite some very real successes, Sega, for example, does not even rate an index listing here.1) It should also be remembered that while Kohler is generally knowledgeable about and respectful of the culture of which he is writing, this is nonetheless an American book aimed at an American reader. Anyone looking for much discussion of the reception for Japanese games in any foreign market but the U.S. would have to look elsewhere. On some occasions, he also approaches his subject through American stereotypes about the two countries (e.g. conformist, stifling Japan vs. everybody-chases-their-dreams America), as when he recounts the career of Pokemon creator Satoshi Tajiri, declaring that
had Japanese societal norms had their way, Pokemon would never have been born, its creator not given the freedom to follow his own path . . . he wouldn't go on to college, but spent two years at a technical school. His father got him a job as an electrical technician. He refused to take it.
The expectation that one go on to college and take the workaday job rather than devote oneself to a creative career is hardly some foreign exoticism, but the norm in America as well (where parents are also apt to be far from pleased to hear their kid tell them they mean to be an artist rather than go for the practical degree and the steady eight-to-six).

There are also instances in which the presentation of his information could have been improved. Despite the distinctly American view, Kohler does not always provide the clarifying notes that an American audience would expect. (The Super Mario Brothers 2 he discusses in his overview of the Super Mario series is not the one we recall in North America, but an earlier, more derivative sequel to the original that was not released in the U.S. at the time--a fact that only gets proper acknowledgment in a much later chapter.) Additionally, there are sometimes listings of information within the main text of a chapter that might have been better reserved for tables or appendices--as with a seven-page listing of releases of soundtracks of the music of the Final Fantasy game franchise.

However, these are comparative quibbles in regard to a book that I, for one, found to be well worth my while all these years later, a continued relevance reflected in Kohler's releasing an updated and expanded edition of the book in 2016 (24 pages longer, according to its page on Amazon). You can read about it here.

1. Sega's Master System, after all, was the closest thing Nintendo had to a rival in the 8-bit era, and its Genesis console virtually on a par with Nintendo's as a competitor in the 16-bit era, while later consoles met with varying degrees of success up to the Dreamcast. Additionally, Sega produced one of the few video game characters that can be compared with Mario as a pop culture icon, Sonic the Hedgehog.

Sunday, February 5, 2017

A Fragment on Fan Writing

In researching Star Wars in Context I was time and again surprised to find that answers to many fairly obvious questions about the Star Wars franchise--even discussion of those questions--were awfully scarce on the web. Some of it was even nonexistent.

For example, "everyone" knows that George Lucas based Star Wars on Akira Kurosawa's The Hidden Fortress to a considerable degree. Indeed, in a typically shallow and pointless display of erudition, Ted Mosby even dropped this little factoid on Stella while sitting her down for her first viewing of the movie on How I Met Your Mother.

Yet just how did one film inspire the other? And no less important, what are the significant differences? Actual discussion of the connections and parallels between the two films, even of the most superficial kind, is quite elusive for the Googler--or indeed for anyone.

This was not a problem for me because I had plenty to say about that myself; and anyway, to the extent that I was saying something others weren't, well, that said to me that writing the book wasn't so pointless as I'd feared.

All the same, why is this kind of thing so often the case?

Simply put, it's a lot easier to serve up generalizations than home in on the finer details, easier to offer impressions than analysis, easier to assert than to really explain--and while generalizations, impressions and assertions are not necessarily uninformative or unhelpful (some of them actually are informative and helpful), we could generally do with a whole lot more detail, analysis and explanation than we are getting in our online chatter.1

It probably doesn't help that the comparative few capable of doing better--who have the grasp of the history and technique of the medium they want to write about, who really know something about film and actually see films like Hidden Fortress (black and white, subtitles, etc.) so that they understand them and can communicate that understanding--rarely (not never, but rarely) write about films like Star Wars. And when they bother, they rarely take the same trouble that they do with more "serious" subjects.

And so a Mosby can get away with such a pointless display of erudition as dropping the name of Kurosawa's film--pointless because I'm not sure what significance it could have had for Stella in the scene (unless she was familiar with the other film and it was some sort of cinematic touchstone for her, and we were never given reason to think that she was), and because making too much of the connection has doubtless confused the issue for many.

Book Review: Britain's War Machine: Weapons, Resources and Experts in the Second World War, by David Edgerton

New York: Oxford University Press, 2011, pp. 445.

In his book Warfare State, David Edgerton made the case that British historiography has tended to overlook the fact of a military-industrial complex as a massive presence in the country's life at mid-century, distorting understandings of matters like government support for industry and science, and of the welfare state that gets far more attention.

Edgerton followed up this study with a book concentrating on the British warfare state in the World War II period, Britain's War Machine.

Rather than a comprehensive history of Britain's war effort, the book focuses on particular aspects of that effort, and makes a number of contentions, among the most important the following:

* Far from being finished as an economic power by Victorian decline, World War I and Depression, Britain remained a considerable economic power in the 1930s, a central element of which was its still being a considerable industrial power. Moreover, that industrial base was not just a matter of strength in old, "declining" sectors like coal, iron, steel and shipbuilding, but also the emergence of major British players in the new high-tech sectors; while in contrast with the derisory view of British productivity, the country was actually quite up-to-date in this respect--often superior to Germany, and in some respects on a level with the U.S.

* British technological prowess extended beyond the civilian sphere to the military. Moreover, military innovation was not, as is often imagined, largely a matter of civilians (e.g. inspired individual outsiders) who had to fight against conservative authorities. Rather there was a vast military establishment actively initiating programs seen through by state scientists. Indeed, even those thought of as outsiders fighting the establishment, like jet engine pioneer Frank Whittle, were often insiders--Whittle was "an air force officer" sent to Cambridge by his service, which then seconded him to a company set up specifically to develop his idea. If anything, British decisionmakers may have been too quick to place their faith in technical fixes for the problems they faced. That we think otherwise is due to the civilian academics having been in a better position to tell their story.

* More generally, Britain translated its considerable economic, industrial and technological strength into commensurate military strength. Far from being disarmed, Britain had the world's largest navy, a first-rank air force with an unmatched bombing arm and defensive radar system, and the world's most thoroughly mechanized and motorized army through the interwar period. Reflecting the strength of its military-industrial base, it was also the world's largest arms exporter of the interwar years--while this base was massively expanded in the years of rearmament, beginning at the relatively early date of 1935 (the same time as Germany, which rearmed from a much weaker position and smaller economic base).

* To the extent that Britain was bent on appeasement during the 1930s, it was not a matter of an anti-military left, but a pro-military right which rearmed while conciliating Hitler, even as the normally more pacifistic left sought a harder line in dealings with him.

* When Britain did finally enter the fight, the initial expectation was not of a hopeless conflict; rather, its superior wealth and techno-industrial capability, in contact with the larger world's resources by way of the sea, gave it confidence of eventual victory. This is not belied but affirmed by the manner in which Britain went to war: partnering with continental allies backed up by a British contingent, while relying on its naval and economic instruments to bring the aggressor to heel.

* This confidence was not extirpated by the fall of France, in part because of Britain's considerable resources; and in part because at the time Britain never considered itself alone, even in the June 1940-June 1941 period--having as it did an empire covering a quarter of the world, the backing of exiled governments which brought over significant assets (like Norway's huge fleet of merchant ships), and access to the production and resources of the Americas. (The most that could be said was that it was the only great power directly engaging the Nazis.)

* Rather than a period of national unity (or leftist triumph) which unprecedentedly brought Labor, and the Left more generally, into government, the war was largely an affair of the conservative political Establishment and its military-industrial complex.

* When the war ended, Britain--validating the optimism about its ability to win its war--was less damaged than is widely appreciated. Certainly it suffered far less loss of blood and treasure than its continental counterparts, even at the height of the war. (In the 1940-1943 period, when the U-boat war was raging, Britain actually managed to get by fairly well by enlarging domestic production and making more effective use of its shipping.) Rather, its position relative to the rest of the world was diminished mainly by the extraordinary rebound of the U.S. from the Depression, combined with the decision of the U.S. to remain engaged in Eurasian and world affairs in the way it had opted not to be after World War I.

I see little room for argument with many of these claims. As Edgerton argues, Britain did remain a substantial economic, industrial and military power that went into the war very well-armed rather than unprepared, thanks in part to an inventive and highly productive military-industrial complex. Appeasement was more a reflection of the will of the right than of the left, and the country then went to war not under the anti-Hitler left but an essentially Establishment regime. Pursuing its traditional military approach, it was widely expected to see the war through to victory with its allies (Britain was never alone, even after the fall of France); it was always a long way from being broken by the U-boat attacks; and its economic-military capacity came out of the war less diminished in absolute terms than in relative ones, thanks to the extraordinary American growth of those years.

Indeed, to the extent that Edgerton sets the record straight on the "While Britain Slept" image of a country that could have avoided the war but for its failure to rearm; on the actuality and weight of a military-industrial complex in British political life; on just who was really promoting appeasement; on the consistency in Britain's pattern of war-making; and on the real limits of the Left's influence and accomplishments in this era; he does the historiography a considerable service. To a lesser extent, one may say the same for his putting the British experience during the war into perspective. (Others had it far, far worse.)

However, his study also has its weaknesses. His characterization of the strategic situation is particularly flawed. While he compares how Britain and Germany stacked up against one another, in the 1930s that was far from the only relevant balance of power. For British planners, the concern was Britain and Germany in Europe, Britain and Italy in the Mediterranean, and Japan in the Far East, with the nightmare scenario Britain having to fight all three at the same time--as was actually the case by December 1941.1

Still more significant is his often superficial treatment of the macroeconomic picture, and the way that side of the situation evolved during the conflict. Edgerton seems to me correct about the country's large, sophisticated military-industrial complex--but he slights the important matter of the rest of the industrial base. As it happens, he actually makes favorable comparison of the arms factories with the coal mines, cotton mills and civilian shipyards that he himself notes had received little investment since the early 1920s--but avoids drawing the obvious conclusion from that lack of investment. Equally, he shows very little nuance in discussing the country's newer, high-tech sectors, taking no interest in whether Britain's firms were world-class companies or mere second-stringers unable to compete outside a protected home and sterling-area market; or in whether the British divisions of foreign firms were low-end assembly units putting together imported parts as a way of circumventing the tariff barrier, or genuine high-end production capacity testifying to and developing a broader and deeper British know-how.

Still less does the book consider what any or all of these facts meant for even the narrower question of the military-industrial complex, let alone the larger matter of financing the war. After all, a robust defense sector still needs metal products, machine tools and other goods not strictly in its line, so that a really first-class military-industrial capacity requires a first-class industrial capacity generally--and Britain's position was problematic there. To a very great degree the steel and the machine tools it needed to make its weapons had to be imported from the United States (and even from the Germans), while many of the components of successes like the Supermarine Spitfire had to be imported also (the plane not just made with American tools, but packing American instruments and machine guns). And as it was ultimately the civilian economy that had to pay for such efforts, all of this meant that, coming on top of an already deteriorating export position, trade balance and balance of payments, the country's economy was under serious strain before the war even began (less severe than Germany's, but an unsustainable strain all the same).

The war, of course, made matters much worse--a fact again given short shrift in the book. The section of Chapter Three considering the matter is headed "SAVED BY THE U.S.A.?" with the question mark conspicuous and significant. He emphasizes that Britain paid its way up to that point in dollars and declares that "it was buying from the USA without heed to its longer-term economic needs . . . because it knew from the end of 1940 that U.S.-financed help was likely to be available for the future." However, there is no way to take this as anything but a slighting of the hard fact of British bankruptcy a year and a half into the war, when the country's hard currency reserve was down to nearly nothing while the conflict was far from won. There is also no conclusion other than that had it not been for America's turning on the money spigot, Britain would have had to make peace with the Axis powers in early 1941, a peace that would have left the Nazis dominant in Western Europe, and free to turn east, after which the Soviet Union really would have been alone. Still less does this refute the equally hard fact of Britain's weakened financial condition after the war, when it was dependent on massive U.S. backing (a billion-pound loan in 1946, support for its currency and outright bail-outs for decades, techno-military transfers like the nuclear submarine and Polaris missile), which at times came at high cost (like the painful post-war devaluation of sterling), despite which it consistently fell short of realizing its governments' schemes for reinvigorating its economy, expanding its welfare state or retaining its world political and military role.

All of this testified to a very real weakness on Britain's part, specifically the deficiency of its manufacturing sector when broadly considered, with all its economic consequences--not least for its ability to bear the stresses of war, which it did neither so well as the larger U.S., nor so well as the country itself had done in the World War I era. The result is that what Edgerton really does in this part of the discussion is remind the reader that Britain had strengths as well as weaknesses, successes as well as failures, in its economic life in the interwar era and its economic effort during the war, rather than integrate the two to create a more satisfying whole. As a result Britain's War Machine works less well as a new history than as a corrective to some of the conventional wisdom--needed as that may be.

1. This is, of course, the more important because of the implications of Japan's military victories for the endurance of Britain's south and east Asian empire--and that, in turn, for its standing as a world power.

Notes on Kirby Buckets Warped

I'm just as surprised as anyone else to be writing about this show.

I was scarcely aware of the existence of Kirby Buckets until just a few weeks ago, and had seen very little of it prior to the recent third season, which caught my attention because of how unusual it has been for broadcast television--the sharp shift of the show in genre and structure (the episodic tween sitcom about an aspiring cartoonist becoming a 13-episode story of interdimensional hopping, heavy on science fiction parody), and the unique airing schedule (the 13 episodes airing on 13 weekday mornings over three weeks).

Alas, the writing rarely rises above the level of the mildly amusing. In fact, the heavy reliance on gross-out humor reflects a certain laziness in its pursuit of its target demographic. All the same, the makers of the show actually do serve up an arc, rather than just tease the audience with the prospect of one--and manage to have some fun with the science fiction clichés they evoke. (Of course there's a post-apocalyptic dimension where the characters meet the Mad Max versions of the people they know; here they come complete with Australian accents.)

Additionally, the cast is a pleasant surprise, accomplishing a lot even when they have just a little to work with. Suzi Barrett shows a good deal of comedic flair in the role of Kirby's mom, getting her fair share of laughs. Olivia Stuck somehow makes Kirby's sister-from-hell Dawn sympathetic (or at least, pitiable). And of course, improv master, veteran voice actor and "Simlish" cocreator Stephen Kearin's Principal Mitchell is a memorable mass of eccentricities (almost) sufficient to singlehandedly make the hackiest of shows watchable.

And so it went down to the finale (aired Thursday), which, to the creative team's credit, actually wrapped up a storyline, and in the process, offered the sense of a bigger tale ending as Kirby, Mitchell, Dawn and the rest closed one chapter in their lives and began another. However, whether all this will be enough to lead to a fourth season is a different matter. The show, poorly rated to begin with (one reason for the change, I suspect), has seen its viewership plunge to abysmal levels--under 200,000 if I read the numbers right, rather worse than the shows Disney XD so recently axed, Lab Rats: Elite Force and Gamer's Guide to Pretty Much Everything. Whether it will survive that will depend, I suppose, on whether viewership picks up during the reruns this weekend (and further airings of the show), whether the executives feel bothered by the way they are running out of live-action shows to put on the air--or the creators can sell them on another sharp shift of course. And maybe all of them together.

SFTV for the Younger Crowd: A Few Thoughts

When writing my article "The Golden Age of Science Fiction Television: Looking Back at SFTV in the Long 1990s" (or revising it for reissue in my book After the New Wave) I specifically concentrated on North American-produced live-action television oriented toward adult audiences, leaving out animation, and children-oriented programming of both the animated and live-action varieties. This was partly because I'd seen very little of it since elementary school (certainly nowhere near enough to even think about writing anything comprehensive), but also because I didn't think that the scope I set for the article was overly narrow, and I didn't think that broadening it would have changed the picture it presented very much.

I'm still not sure that it would have. Given how long and how heavily science fiction's tropes have been mined, and how dependent the genre has become on Baroque handling of old concepts and inside jokes, TV aimed at a young audience seems an unlikely place for fresh ideas.

Still, in hindsight it does seem fair to note that fiction supposedly for children often isn't really that--fiction genuinely about children and their experiences, or at least engaging the way they really see the world. Rather it's just the same stuff written for the regular, "adult" audience, sanitized sufficiently to pass a more stringent censor. This isn't altogether new--the early versions of the fairy tales children grow up with were not the stuff of Disney films. Still, this seems to have become more conspicuous in recent years, partly as pop culture has gone increasingly metafictional. I remember, for example, an episode of Animaniacs that was an extended parody of Apocalypse Now (and, more obscure still, of the making-of documentary Hearts of Darkness). Now the Disney Channel has a sitcom about a summer camp run by a family with the last name Swearengen, where one of the campers quotes Omar from The Wire ("You come at the king . . .") in an episode that is basically Scarface with candy and video games instead of the original product.

And as all this might suggest, I have noticed some relatively sophisticated genre content here and there. An obvious instance is Jimmy Neutron. That show can seem like merely another iteration of the century-and-a-half-old tradition of scrawny boy genius inventors.1 Still, it is notable for its consciously retro handling of that retro idea--Jimmy growing up in a household where everything from mom's hairdo to the design of the family television set looks like it came out of the '50s, in a town called, of all things, Retroville (complete with a Clark Gable lookalike for mayor).2 Steampunk has remained a genre of cult rather than real mass appeal--but Harvey Beaks included an eccentric steampunk enthusiast in its cast of characters, and served up an hour-long musical steampunk special.

Something of this has been evident in live-action programming as well. Television has been awash in superheroes for years--but Disney XD offered a more original take than most in Mighty Med, a show about a hospital where superheroes go when they need medical treatment, and the two kids who stumble into it. More recently the same channel's Kirby Buckets, for its third season, shifted from sitcom wackiness set (mostly) in the mundane, regular world to a season-long, dimension-hopping adventure through alternate timelines.3 A good deal more tightly written than most of the season-length stories we get (arcs on American TV remain mostly about stringing audiences along), it actually is a story with a real beginning, middle and end, and the experiment has been all the more striking for its 13 episodes being aired in a mere three weeks (an exceptionally rare move for broadcast television).

Whatever else one may say about it, very little of the comparable stuff pitched at grown-ups in prime time as of late has been as knowing or risky or innovative or audacious. (And when these shows are at their best, rarely as much fun.)

1. The original "Edisonade" was Edward Sylvester Ellis' 1868 dime novel The Huge Hunter; or The Steam Man of the Prairies.
2. While Ellis celebrated the young, lone amateur inventor, the red-brick universities, land grant colleges and the Massachusetts Institute of Technology--expressions of emerging national science policies--were graduating their first candidates, while scarcely a decade later Thomas Edison established the world's first big corporate lab.

Making Sense of B.O.

We hear all the time about the economics of filmmaking--that this movie cost this much, that it made that much. But a lot of the talk strikes me as far, far removed from even the best known realities of the business.

Movies cost far more than their production budgets--and their backers make less than their grosses at the box office. Marketing and promotion budgets are much less often announced than production budgets, but they are often just as large. The result is that the producers of a $150 million movie may be presumed to have spent twice that much once the distribution costs are taken into account.

Meanwhile, studios get maybe forty to fifty percent of the gross of a film.

This means that instead of a $150 million movie that earned $200 million having made a solid $50 million profit, it may actually have left its backers $200 million in the red--because they spent $300 million, and got back only $100 million, or even less.
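To make the arithmetic explicit, here is a minimal back-of-the-envelope sketch in Python--the one-to-one marketing ratio and fifty percent studio share are the illustrative assumptions from above, not actual studio accounting:

def estimate_backers_net(production_budget, gross, marketing_ratio=1.0, studio_share=0.5):
    # Total outlay: production plus marketing/promotion, the latter
    # assumed (as above) to roughly equal the production budget.
    total_cost = production_budget * (1 + marketing_ratio)
    # The studio keeps only a share of the box office gross,
    # assumed here to be about half.
    studio_revenue = gross * studio_share
    return studio_revenue - total_cost

# The example from above: a $150 million movie grossing $200 million
# is not $50 million in the black but roughly $200 million in the red.
print(estimate_backers_net(150e6, 200e6))  # -200000000.0

(Taking the lower, forty percent share instead would leave the backers deeper in the red still.)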

Yet it is worth remembering that film production and other costs are often offset by revenue from other sources--like product placement, and even government subsidy. (The makers of Thor 2 got a tax rebate worth some $30 million from the British government.) It is worth remembering, too, that domestic and foreign box office are not always clearly distinguished by commentators, and foreign markets are ever more important, particularly as key large markets (e.g. China) have become more affluent. (Warcraft made $47 million in the U.S., but almost ten times as much globally--with about half the total gross earned in China alone.) At the same time, the box office is not the only source of revenue--there are also video and TV rights, and merchandising, which can add mightily to the total, and so turn a flop into a moneymaker much more often than might be imagined, especially if one thinks in longer than "this-quarter, next-quarter" terms. (Hudson Hawk, the most notorious flop of the summer of '91, managed a profit five years after its release--but the point is that it did manage it.)

And all that is without getting into the creative bookkeeping in which studios have been known to engage regarding the costs--a reminder to all involved to take their cut of the gross, not the net, if they can. (Paramount's claims about the profitability of Forrest Gump were quite the story at the time.)

All this is a reminder that, for all we hear about it, filmmaking is, like many another industry, less transparent than it appears. It is also a reminder not to expect a streak of underperforming films--of sequels nobody asked for and which it turned out the public really didn't want, like 2016's rather long streak of them after the hit-packed spring (X-Men, Alice, the Turtles, Independence Day, etc.)--to have too much effect on how Hollywood does things. The sheer variety of ways of defraying costs, and of revenue streams to tap, as well as the lack of any other business model Hollywood might find remotely attractive given its bottom-line priorities and the reality of a film business that makes the high-concept blockbuster more than just an exercise in business cynicism, are likely to keep it at the present game for a long time to come.

Monday, January 9, 2017

On The Word "Deserve"

Like "lifestyle" the word "deserve" has come to be grossly misused, overused and abused, replacing many another word and concept in the process – and consequently, diminishing the average person's already vanishingly small ability to think.

The word "deserve" properly refers to those things that have actually been earned (like an award recognizing particular accomplishments). To say one deserves something is to make an indisputable moral claim on their behalf. However, the term is being used with mind-numbing regularity in place of words like "want," and "need," and "right" (as in "have a right to"). Certainly we all have wants. (You might want a private jet.) We all have needs. (You need food and oxygen to live.) We all have rights of varying kinds. (Free speech is an inalienable human right, while someone might have a right to the inheritance of a particular property or the award of damages following some injury, given the laws prevailing at a particular time and place.)

The upshot of this is that one may want, need or have a right to the things they deserve--but they do not necessarily deserve the things they want, need or claim as a right. Yet, there seems an increasing insistence on dressing up want, need and right in the moralistic language of deserts. Take, for instance, the immediate cause of this post, which was my hearing an anchor on The Weather Channel say last winter that ski resorts in a particular region were finally getting the snow they "deserve." The owners of ski resorts want and need snow because it enables them to operate their establishments, and a particular resort owner may have done the things ordinarily seen as meriting business success, but it is nothing short of bizarre to say instead that the resorts "deserve" snow.

I suppose this particular butchery of the English language contains something of the tendency to view every outcome in a person's life as a matter of their own, personal morality--an idea very much in line with the self-help/religious ideas that have long enjoyed wide currency in the United States. There is, too, what Thorstein Veblen in his classic The Theory of the Leisure Class called the "habit of invidious distinction," because where there are the deserving there are also the undeserving.

It would seem that at the bottom of such things is a deeply conservative impulse to justify--and sanctify--everything as it is, not least the inequalities of wealth and the callousness toward the poor and disenfranchised that increasingly characterize American life. The CEOs of Fortune 500 companies are assumed to "deserve" their seven and eight figure compensation packages (even as they make disastrous decisions), while many vehemently deny that the people who perform the work without which their companies could not possibly remain going concerns "deserve" a living wage. Some people "deserve" megayachts, while others do not "deserve" health care--or even food and shelter.

In other words, the mind-boggling greed of some is not merely excused but justified, celebrated, exalted on the grounds of what they "deserve," while the claims of others to having their most basic physical needs met (and many would say, their most basic rights as human beings recognized) are dismissed on the very same grounds--which happen to be not the content of one's character, but the content of one's bank account, personal worth equated with "net worth." One person is "worth" fifty billion dollars and another "worth" nothing--in effect, worthless.

All of this is a revolting tissue of absurdities which only confuses and cheapens the idea of morality itself. But I don't think it's going away any time soon. If anything, the way the political winds are blowing, it seems likely the tendency will only get stronger.

Wednesday, December 21, 2016

Slow Burner and '50s Britain

The plot of William Haggard's Slow Burner (reviewed here) centers on a British program to develop civil nuclear power in the hopes of scoring a major economic coup.

While nuclear energy has never stopped being topical, this idea was especially so at the time, because in the '50s the British government really did bet heavily on its scientists achieving a breakthrough in civil nuclear power, both for the sake of cheap domestic power, and as a source of export income with which to achieve a healthy balance of payments (as a country dependent on massive food and energy imports, while its manufacturing and financial position slipped).1 The object of those hopes was the Magnox reactor, which never justified such confidence (the world generally preferred the American pressurized water reactor, today still the mainstay of civil nuclear power), but the expectations do come to pass in Slow Burner, in the form of the titular, very different technology. A nuclear power source compact enough to be installed in a suburban attic and packed up and driven about in the boot of a car, it put Britain twenty years ahead of the rest of the world in that field--the only way, the book says, that the British economy was not twenty years behind it.

Moreover, the implications of this for Britain's economic life are repeatedly underlined within the story, so much so that the characters worry that the thief might be running West as much as East, and seem more concerned about the implications of losing perhaps their only prospect for a healthy balance of payments than they are about the Soviets upsetting the Cold War balance of power. Indeed, it is Britain's economic predicament that Sir Jeremy Bates has in mind ("Fifty or sixty million . . . and food, at the level of subsistence, for perhaps forty"; manufacturing plant "a generation out of date") when he thinks to himself that a "man who could consider going abroad, selling his knowledge, was worse than a danger, worse than an apostate" (114).

The point comes up in smaller ways, too--a burglar enlisted by Colonel Russell's people for an illegal black bag job is told that, if things go badly, he could be resettled wherever he likes in the Sterling Area.

Sterling Area? he wonders, surprised by the qualification.

Yes, he's told, because just now dollars are hard to come by.

Next to this any menace from the Soviets in the book appears vague, shadowy.

Unsurprisingly, a good part of the book's interest for me was in its quality of being a time capsule from '50s Britain, capable of surprising in such ways.

1. While it doesn't say much about the government's specifically nuclear ambitions, David Edgerton's Warfare State nonetheless has a good deal to say about British policy regarding R & D in these years, and the view of some of its critics that it was too devoted to big-ticket prestige projects (of which the Concorde supersonic transport was another example).

Tuesday, December 20, 2016

An Un-Bond: William Haggard's Colonel Charles Russell

I remember that when reading Ian Fleming's James Bond novels I was struck by how much the character and his outlook, his associated image of glamour, the basis of his existence as a globe-trotting British agent (policing the Empire in its last days, playing junior but more skillful partner to the Americans in the Cold War contest), were very much products of a particular period that has since passed. This has been so much the case that novelists working in the series, less able to rely on brand name and flashy filmmaking than their cinematic counterparts, have in recent years often seen no option but to go back to those days--Sebastian Faulks and William Boyd taking Bond back to the '60s, Anthony Horowitz taking Bond all the way back to the '50s (or, in the case of Jeffrey Deaver, starting with the character again from scratch in the present day).

Still, Fleming's creation has held up considerably better than Haggard's longtime protagonist, Colonel Charles Russell, head of the imaginary Special Executive.

Russell, like Bond, is a formidable ex-military counterintelligence operative; like Bond, handsome, suave, urbane; like Bond a man who enjoys good food, drink and the other luxuries. Still, his version of luxury is different. Instead of the dated image of jet air travel and casinos, there is the even more dated image of the club, the country house, enjoyed by this old officer with his regimental moustache and pipe. Bond might brood about the taxi driver's manner, but Russell never has occasion to, appearing to exist untouched by the changes of the world surrounding him.

Indeed, at times Haggard's characters can almost seem caricatures (at least, to an American rather sensitized to "stage Englishmen" by terrible Hollywood writing). This is particularly the case when Haggard writes figures like Sir Jeremy Bates in Slow Burner, with the following line exemplary: "it was barely four. It would be unheard of for Sir Jeremy Bates to leave his office at tea-time" (116); or better still, when Bates comes home after a drunk to a valet who offers no question or comment:
Confound and damn the fellow! His lack of interest was an insult. A gentleman's gentleman--the convention had survived, the convention of the English manservant, secret, uninquiring, impersonal (111-112).
It is much the same with William Nichol, who after nearly getting run over by a would-be assassin in a big truck, and losing his hat, takes a cab because he "was not a man to walk hatless in London" (127).

The differences are even more evident in the characters' social attitudes, the bigotry and snobbery in Haggard fiercer and more bluntly expressed. While Fleming wrote many a foreign villain, he generally eschewed depicting treachery by Britons. The French unions may have been a Soviet fifth column in Casino Royale--but the British unions (even the Jamaican unions), however much Fleming disliked organized labor, were in the end no traitors to the nation. Many a scientist can be counted among Fleming's villains--but scientists as such were not presented as a perverse lot.

Not so with Haggard, who condemns scientists as a class as untrustworthy, their affinity for reason making them a bunch of damned crypto-Communists; and when the traitor is exposed, the scientist is indeed a man of "the far, far Left," while also being "no gentleman"--not solely as a matter of his less than upper-crust social origin but of his "character" in his dealings with his wife.1 (And that is not only all we know about him but, it seems in Haggard's rather old-fashioned view, all that we need to know.)

All that makes it seem less surprising that there was never a Colonel Russell film franchise--and why, even by the standards of series that did not land successful movie franchises to keep their heroes on people's minds, the books have slipped into comparative obscurity.

1. "Take a clever boy . . . and put him into a laboratory for the next seven or eight years. What emerged inevitably was a materialist . . . a man who would assume without question that the methods of science could be applied to human societies" (58)--a prospect the narrator clearly regarded with horror.

Sunday, December 18, 2016

Review: Slow Burner, by William Haggard

Boston: Little, Brown, 1965, pp. 192.
WARNING: MILD SPOILERS

Until recently I had read only one novel by William Haggard, mainly because it was conveniently available at the time--Yesterday's Enemy (discussed here).

In hindsight, that book--a relatively late entry into his long-running Colonel Russell series--seems less characteristic than I took it for. The book had Russell out in the world on his own, away from the milieu of British officialdom that looms so large in his other books.

Recently, however, I picked up the first book in the series, Slow Burner, particularly with an eye--again--to the status he once enjoyed as a master of the form, on a level with writers like le Carre or Fleming. Reading it I found Haggard's writing on the whole plain and straightforward (much more so than le Carre's, or even Fleming's), but not artless. He has an admirable economy with words, and his writing an appealing crispness and polish, especially evident in his use of evocative sentence fragments. ("Afternoon tea on magnificent silver and not enough to eat" (72), he wrote, in a suggestion of the extremely shabby gentility of a family of Irish aristocrats.)

I would add, too, that his interest in character is considerable. Like le Carre, Haggard has a deep interest in this particular milieu--the class of grown-up public schoolboys turned senior civil servants of which Haggard was himself a part; indeed, as le Carre has said was the case with himself, this can seem his primary interest, his writing about espionage just his way of approaching it. The ambitions and anxieties of the people living in this world, their affections and enmities, their jealousies and prejudices, are the stuff of the book to a very great degree--though unlike le Carre, Haggard is not terribly critical of his subject. Indeed, while his characters were clearly not of our time, they did not seem nearly so much out of theirs as le Carre's do. There is insecurity about the country's position, and about their class's place within the country--the novel actually opens with senior government scientist William Nichol feeling inhibited about being seen smoking a cigar in the back of the government car in which he is being driven, in the current age of austerity and egalitarianism--but one does not get the sense that the country's imperial stature and its traditional ruling class are finished, the way one does reading about Smiley and his people.

The book reflects this in its handling of its two principal plot threads--one, the playing out of the complex of personal antagonisms within the Establishment that eventually leads to attempted murder (particularly that centering on Sir Jeremy Bates and Nichol); and the other, a more conventional spy tale of identifying and stopping a traitorous scientist selling out Queen and Country. The former gets the greater attention, and frankly holds more interest, while the latter is rather slow-moving, and slight on thriller mechanics, the more so because the central character, Colonel Charles Russell, is chief of a counterintelligence service (the imaginary Special Executive) rather than an agent, and one who for the most part acts like a service chief, giving orders and dealing with other officials, rather than playing spy games out in the field in spite of his position. Surveillance is much more often talked about than portrayed; while the one piece of black bag work shown rather than just discussed is essentially played for laughs--and like many an old comedy, concludes with wedding bells.

On the whole the result is uneven, but engaging--if more for its idiosyncrasies, the stronger of its characterizations, and its quality of being a time capsule from '50s Britain (a matter meriting its own post, here) than for the actual spy story within it, whether taken from the standpoint of le Carre-like drama or Fleming-like thrills.

Review: Warfare State: Britain, 1920-1970, by David Edgerton

New York: Cambridge University Press, 2006, pp. 382.

Before delving into a discussion of Edgerton's book, it is important to remember the context in which he wrote--and especially the "conventional wisdom" about British history to which he opposes his own, different analysis. It goes something like this:
Britain has historically been a pacific, free-trading nation, with a strong anti-statist, anti-militarist tradition--and, perhaps not so science-minded as some of its people might have wished--so that one can think of it as "the un-Germany." Accordingly Britain's creation of massive armies to help resist Germany's attempts to dominate the European continent in the world wars (1914-1918, 1939-1945) was something exceptional. Indeed, the epoch-making character of the change was reflected in the way this mobilization contributed to the transformation of the order at home--most notably in the rise of the Labor Party and the establishment of the welfare state after World War I and especially World War II, which proved the dominant theme of the nation's history for the generation that followed.
Liberals tend to view this reading of the past very favorably, to glorify Britain's being a liberal un-Germany. By contrast, others are more critical. Conservatives, epitomized perhaps by Correlli Barnett, view Britain's anti-statism and anti-militarism, and its lack of enthusiasm for science and especially applied science, as weaknesses which disadvantaged it in industrial and economic competition. They hold, moreover, that this economic weakness, in combination with anti-militarism, left the country ill-prepared for the wars it had to fight, and ultimately cost it a great deal of blood and treasure, prosperity and power. And many on the left, if of a technocratic mold (a C.P. Snow, for example), while not buying all of this critique (they generally do not wish for a more militaristic Britain), do nonetheless espouse significant parts of it--quick enough to attribute Britain's industrial, economic and social failings to the grip of an outmoded ruling class and its prejudices (its disregard for science, etc.) on its political life.

In Edgerton's view all this overlooks the reality of a massive British "warfare state"--a large "military-industrial complex," comparable to that of the United States, which was not an exceptional product of wartime, but massive through the interwar years and after; which for significant portions of even relatively recent peacetime British history consumed more resources than social services; and which in practice represented massive state intervention in industry, the economy, science and British life generally. Indeed, Edgerton argues that acknowledging the warfare state's place in British life--the size and cost of the armed forces, their connections with the political, industrial and scientific establishments--automatically complicates, subverts and even debunks the conventional ideas about British history (the works promulgating them being, in his phrasing, "anti-histories," for giving us claims so much at odds with the facts).

In arguing this view Edgerton offers both a history of the warfare state for the period identified in the subtitle (1920-1970), and an examination of the ways in which its reality clashes with the received historiography--two objects to which almost equal space is devoted (five chapters principally given over to the former, three to the latter). Given the difference between them, and the scale of his efforts on both accounts, it may be best to discuss them separately.

In presenting the history of the warfare state, Edgerton does not attempt a complete or comprehensive picture--which would have to begin long before 1920 (the 1880s at the latest), and continue up to the present. Moreover, rather than rendering full coverage even of the 1920-1970 period on which he focuses, the five chapters that deal with its actual history look at particular aspects over particular periods (the World War II-era research effort, the Harold Wilson-era rhetoric and policymaking about the "white heat" of the scientific-technological revolution). Unsurprisingly, there is much that might be discussed which gets little attention (like how the warfare state interacted with the broader line of British economic history). Still, the various pieces of analysis are underpinned by a copious, often systematic use of evidence, ranging from statistics on expenditures and the shares of national income devoted to defense, Britain's ranking as an arms exporter (at times slighted, he reports, because ships and planes have not been counted as weapons!) and its spending on post-war R & D (a higher share of national income than any other nation in Europe, actually); to the extensive description of the alphabet soup of military and military-affiliated research agencies and establishments (the book actually has a guide to the acronyms up front); to the profiling of the people in these establishments, and the government generally (which makes clear that the image of a government of classicists had long since lost its salience); to his examination of the context and nature of major initiatives (like Harold Wilson's Ministry of Technology, intended not to redress a lack of investment in technology but to rationalize the prestige-oriented mentality that gave the world the Concorde). Together they add up to a satisfactorily broad and deep image of the existence and weight of the warfare state of which he writes.

In pursuing his second object, the historical debate, Edgerton is similarly robust in his coverage. He also makes his fair share of valuable observations here--not least the point that in a country really as unconcerned with the uses of science as some have charged, the charges of writers like Barnett and Snow would not have had any traction. His handling of the material is also nuanced enough that he acknowledges the grain of truth in some of the misperceptions of which he is so critical. (Harsh as he is on C.P. Snow, Edgerton does note that public school-educated, arts degree-holding Oxbridge types really were more likely to be administrators than science degree-holders--and concedes that Snow could have made his correct argument more persuasive by making reference to such facts.)

On the whole, it is a solid defense of his position--though in fairness, it is not a particularly difficult one to find support for, and perhaps easier than he admits. Many of the essential facts to which he points are well known to anyone who has paid much attention to this history. (The post-World War II era may have seen the rise of the welfare state--but that it also saw the hugely expensive Korean War-era rearmament, the advent of Britain as a nuclear power, and unprecedented peacetime National Service lasting until 1961, is hardly obscure information.)

Indeed, it seemed to me that he exaggerated the extent to which historians have adhered to the conventional view--particularly as it affects not just writing about British history specifically, but the historiography of war and technology generally, which he presents as a scene where "civilian industry, science and technology . . . transformed modern war" rather than the other way around, producing "histories only of civilian science and technology applied to war . . . the civilianisation of war," with "military agencies hardly figur[ing]" in the discussion of "war economies." (It seemed to me that he rather mischaracterized William H. McNeill's classic The Pursuit of Power, in particular.1)

All the same, this is something of a quibble, especially given Edgerton's emphasis on the broad trend of the historiography, and of British historiography in particular. It also detracts little from the great deal that he gets right in a massive and generally solid synthesis of information ranging from economic history to popular culture--one which cuts through a good deal of the nonsense surrounding such matters as Britain's economic decline. That Britain went from punching above its weight to below it in manufactures, at great cost to its prosperity, is not refuted here--but the fact cannot be simplistically blamed on a cultural lack of enthusiasm for science; the refusal of the British government to involve itself with science, technology and industry; an inability to produce adequate numbers of scientifically trained personnel; or an unwillingness to fund high levels of research and development. The way in which the British state went about using those resources, and the ends to which it used them, are instead the issue. However, that is a subject for another book altogether.

NOTES
1. Contrary to the impression Edgerton gives of him, McNeill elaborates on the military-industrial complex phenomenon in late nineteenth-century Britain, and draws comparisons with the United States in this respect, even using that exact term, so much less associated with Britain. Indeed, a section of Chapter Eight of McNeill's book ("Military-Industrial Interaction, 1884-1914") is actually headed "Emergence of the Military-Industrial Complex in Britain," and makes it very clear that the armed forces were not merely passive consumers of civilian technology. (The thesis of McNeill's book is actually that in the industrial age the center of gravity moved back from the marketplace to a "command economy," in warfare as elsewhere.)

McNeill's account of the rise of the naval-industrial complex in Britain in the aforementioned section identifies bureaucratic in-fighting as a key factor--as with the role of John Fisher, whose object, he notes, was to use market competition to stimulate the activity of the Navy's own state suppliers. And while this failed, the Navy became an increasing driver of "deliberate" invention, as
Navy technicians set out to specify the desirable performance characteristics for a new gun, or ship, and, in effect, challenged engineers to come up with appropriate designs . . . Within limits, tactical and strategic planning began to shape warships instead of the other way around. Above all, Admiralty officials ceased to set brakes on innovation by sitting in judgment on novelties proposed by the trade (279).

And all this, of course, led to "military technology . . . constitut[ing] the leading edge of British (and world) engineering and technological development" (284).
The page numbers I cite here are from the 1984 University of Chicago Press 1st edition.

Wednesday, December 14, 2016

Star Wars' Place in Film History

Looking back it is clear that Star Wars was not the first science fiction film--nor the first space-themed science fiction megahit. (2001 was the highest-grosser of 1968, a year that also saw Barbarella and Planet of the Apes become pop culture icons.) It was not even the first film to serve up galactic empire-style space opera. (Barbarella did that, certainly, and in a quieter way, so did Forbidden Planet, among others.)

At any rate, the significance of Star Wars' portrayal of the stuff of pulp space opera, unprecedentedly lavish as it was, ought not to be exaggerated. Contrary to popular belief, splashy galactic empires never became all that popular on the big screen. (Get away from Star Wars, and the list of really commercially successful films of the type proves rather short--certainly a lot shorter than the list of, for example, successful superhero films.) Indeed, one can think of the Star Wars films as occupying a fairly narrow niche because of this.

What Star Wars really did was point Hollywood to something more basic than that--the pattern of the contemporary blockbuster. Hollywood had been making sequels since forever--but trilogies telling an extended story, and prequels, were something more novel. So was the intensiveness of the merchandising (which was why the executives at Fox let George Lucas have the merchandising rights, and doubtless kicked themselves afterward for doing so).

One may be ambivalent about that, seeing it as a matter of marketing rather than of the artistic or entertainment value of cinema, but the films' innovation extends, too, to the essential structure of the movies--the structure of the action film specifically. The Bond films of the '60s had established this, organizing an unprecedentedly fast-paced film around giving the viewer a "bump" every three minutes or so, with elaborate set pieces at its center, filmed with the help of a battery of techniques (from short takes to exaggerated sound effects) to maximize their visceral impact.

Of course, Star Wars came along quite a few years after the Bond films--fifteen years or so after Dr. No. And Hollywood did cash in on the Bond craze, imitating the movies, but generally in a superficial way. The studios made spy movies, and other kinds of action movies--but the filmmakers there didn't quite get the way the Bond films were put together (so that cop movies like Bullitt and The French Connection are just crime dramas which happen to have a big car chase in them, and even the Derek Flint movies look pretty flimsy as action films).

That changed when Lucas came along and served up, as he himself put it, a blend of James Bond with Flash Gordon--a blend that went Bond one better, because science fiction, with its exotic aliens, super powers and imaginary vehicles, makes it easy to serve up bigger, flashier action than any spy or cop adventure, and because the production entailed a special effects revolution. The key innovation was the computer-controlled camera, which, because of its precision, made it easier and cheaper to take lots and lots of effects shots and get scenes right. That opening shot of the Star Destroyer's underbelly probably couldn't have been done without it.

And so today we're not awash in galactic empire movies--but action movies and science fiction movies and especially science fiction action movies have been playing at the multiplex all year long for about as long as any young person today can remember. Many are ambivalent about this--or outright critical. I won't deny that we could use more diversity in our films--diversity of subject, theme, tone, idea. But all the same, there is no denying the technical accomplishment of the saga, or its influence in this regard--an influence owed precisely to how it won over millions of fans, and has gone on winning them down to the present day.

Monday, December 12, 2016

Akira Kurosawa's The Hidden Fortress: Thoughts on my First Viewing

I first saw Akira Kurosawa's The Hidden Fortress (1958) on Bravo before its lamentable turn to particularly disgusting versions of the reality show format.

I have to admit that I was a bit thrown by that first viewing--and even a bit let down.

One of the reasons I watched it--the first Kurosawa film I saw--was that I had heard about The Hidden Fortress being a basis (the basis?) for Star Wars.

I suppose I expected Star Wars with horses and katana instead of spaceships and light sabers.

But it wasn't that.

We get C-3PO and R2-D2, Leia and Obi-Wan, and even a Darth Vader--redemption and all. But we don't get a Luke--so, no hero's journey.

There are good guys and bad guys--but no epic, cosmic, "universal" battle of good and evil. We have instead a smaller-scale, local conflict between two clans that is deeply rooted in the feudal culture with which we are presented, with its particular ethos and loyalties.

I might add that while there are some fights, it's not really an action movie. Aside from the fact that people just weren't making action movies as we know them back then (that awaited the Bond films in the '60s), the film skews comedic, at times rather darkly comic--just as you'd expect if Lucas had decided to put the droids and their misadventures at the center of his film, and had also made them a couple of idiot backstabbing freebooters who don't think twice about doing things that would never make it into anything advertised on the Disney Channel.

It is all a reminder not just of how much Lucas owed Kurosawa, but also of how much he owed to his numerous other inspirations--to Joseph Campbell, to a long tradition of pulp space opera, to the adrenaline-oriented structure of the James Bond films (which he seems to have figured out, mastered and improved upon before anyone else in Hollywood managed to wrap their minds around it--an achievement many seem to intuit, but rarely spell out clearly), among much, much else.

After realizing that, I took another look at this movie--taking it on its own terms, rather than as a prototype for a very different film, and enjoyed it for the work of art that it is.
