Monday, January 26, 2015

Reading H.G. Wells' The Outline of History

Readers familiar with H.G. Wells' later writings will likely know what to expect from The Outline of History. Wells' study of world history is, much like what he offers in comparable passages in The Shape of Things to Come, a progressive tale of the triumph of reason over superstition; of the ideas of service and community over self-seeking and privilege; and of human dignity and freedom over ignorance, want and tyranny. This is, Wells informs us, above all a history of the development of human thought, and in particular the development of three, generally rising trends:

1. The rational, systematic pursuit of knowledge about the world, and the application of this knowledge in similarly rational and systematic fashion. (This can be thought of as the story of philosophy and science, of figures like Herodotus and Aristotle, Roger Bacon and Francis Bacon, and the "mechanical revolution" that remade everyday life.)
2. The recognition that all of humanity comprises a single, universal community, the members of which are all equals and for whom the true "good life" lies in service to that larger community. (This is the ethical revolution wrought by religious traditions like Buddhism, Taoism and the Abrahamic tradition, as well as secular philosophies like Stoicism.1)
3. The interaction of the principles of the "community of will" (exemplified by the freedom-loving nomad) with those of the "community of obedience" (exemplified by the settled, hierarchical inhabitants of the early civilizations) to produce a higher sort of community of will. (The reaction of one upon the other is for him the story of democracy, from Greece and Rome to the Magna Charta to the Enlightenment.)

Nonetheless, in line with his science-minded outlook, Wells begins well before one can properly speak of such things, at the birth of the planet Earth, and the earliest development of life on it, a part of the tale that he tells at unusual length for a work of this type. We are already on page 68 before "the first men like ourselves" appear, and on page 127 before he turns to the very first civilizations, roughly an eighth of the main text already behind us.2

After this point the book becomes more conventional in its choice of subject matter, relating the rise of the first recognizable polities in Old World river valleys (the Tigris-Euphrates, the Nile, the Yellow River, the Indus), the intellectual stirrings and empire-building of the ages that followed (the Dharmic and Abrahamic religious traditions, the earliest philosophies of Greece and China; Sargon and Cyrus, Asoka and Alexander, Qin Shi Huang and Caesar), and the long crawl toward modernity (Dark Age and Renaissance, the Tang and Mogul dynasties, the voyages of discovery, the scientific, industrial and mechanical revolutions, the American and French Revolutions). Still, Wells' principal interest is in tracing through these movements and events the trends with which the book is concerned, an interest which determines what he does or does not choose to emphasize.

Those trends--the increasing recognition of the liberating potential of the sciences, the oneness of the human community, and the ideas of equality and freedom--had advanced to such a point by the eighteenth century's end that they thrust to the forefront of political thinking three old problems that might be thought of as issues of their practical implementation: property, currency, and the conduct of international affairs.2

The solution to these problems, it seemed to him, lay in "a world unification based on a fundamental social revolution" (916). However, the ending to that story had yet to be written--and did not seem a thing to be taken for granted. Indeed, it may be the discussion of this possibility that constitutes the most dramatic difference between one edition and another. The original 1920 edition, following its discussion of the First World War, concludes with some optimism about the diminished prospects of another, comparable conflict, and closes with a forward-looking chapter titled "The Next Stage of History," detailing a possible path toward a united world, and what the Modern World State might look like.

In their place in the 1961 edition are two chapters on "Twenty Years of Indecision and its Outcome," and "The Aftermath of the Second World War." "Indecision" offers an appropriately grim assessment of the superficial or illusory reforms, broken promises and missed opportunities of the interwar period, and their implications, not least that the outcome he hoped for seemed increasingly uncertain, and likely to come at the cost of "incalculable further depletion in waste and suffering," such that "our species may stagger half way to its goal and fail" (916).3 The book subsequently moves on to a chronicle of the major events of the Second World War, and the years that followed, the element of advocacy much diminished.

As one might guess given the book's age, it has dated in ways small and large, from its adherence to hyperdiffusionism to its assessment of pre-Meiji Japan. However, there are many respects in which Wells' historiography not only remains compelling (like his account of Rome's rise and fall), but seems surprisingly of our own time, like his critiques of academic specialization, or Eurocentrism. More importantly, nearly a century after the publication of the book's first edition, and three-quarters of a century after the last edition on which Wells personally worked, the book remains very effective in making its key arguments. Additionally, in its sweep and great readability, its breadth of vision and multitude of insights, it remains rewarding as a piece of historiography, all the more valuable for the ways in which its rationalistic, humanistic and progressive vision of history has come to be unfashionable. The result is that while Wells' Outline is relevant to an understanding of the vision that produced his fictional output, its significance is hardly limited to those with such an interest. Indeed, the tendency to overlook this book, just like the tendency to overlook all his writing apart from his handful of classic scientific romances, is a thing to be regretted.

1. One may be surprised to find Wells attributing a positive role to religion. His position is that beneath the encrustations of superstitious doctrine and ritual, these religions offered a recognition of the existence of a single human community, and of "the good life" as one in service to it, rather than the pursuit of the advancement of oneself or some smaller fragment of humanity; in short, early intimations of the World State for which he called.
2. I am citing the 1961 Doubleday edition of the volume, which was revised and brought up to date by Raymond Postgate. (The last edition on which Wells worked was published in 1939.)
3. Wells' efforts on this score are, of course, imperfect. The fact remains that where space and detail are concerned, the study remains overwhelmingly devoted to the accustomed subjects of traditional Western history. However, this appears to reflect the material then available to him, and he is more successful in his effort when he is in his analytical mode, as when writing of the significance of Buddha and Asoka, or putting the rise and fall of the Roman and European colonial empires into a longer-ranged and global perspective.

Sunday, September 28, 2014

Eugie Foster, 1971-2014

Nebula Award-winning author Eugie Foster died on Saturday.

She was forty-two.

I knew her from my time reviewing for Tangent Online and The Fix, where she worked as an editor.

Eugie was one of the very first people I became acquainted with in the world of science fiction fandom, and the first with whom I actually worked in it.

She was always kind, considerate, patient and supportive.

The world is a colder place for her no longer being among us.

However, her work does remain with us. If you are not familiar with it, I urge you to check it out for yourself.

Friday, September 12, 2014

Telling Lies About Tolstoy's War and Peace: The Short Version

My recent post on Tolstoy's War and Peace has been getting a lot of hits lately, and so I thought it worthwhile to write up a quick summary of my thoughts on the book.

In essence, what I had to say about the book is the following.

The length of the book; its crowding with a huge cast of at times confusingly named characters; its loose, even plotless structure; and its frequent digressions into lecturing about ideas which are in some cases abstruse, in others deeply against the grain of twenty-first century attitudes: all of this makes it a challenging read.

It must also be admitted that the book has its limits of perspective. Tolstoy is in the main focused on Russian aristocrats here, and takes little interest in the strata below them, or in the modern, urban side of life. He also subordinates the variety, balance and color of his tale to his expounding of certain ideas about life and history. Unsurprisingly the story often seems sanitized, poorly paced and even dull in places--the contrast with his contemporary Dostoyevsky, for example, being quite marked--while his relation of his ideas gets repetitious.

Still, along with its challenges and its arguable defects, it has very real strengths. Tolstoy's eye for detail, and the vividness this often gives his characterizations and scenes, elevates the interest of his material beyond what might be expected from its limitations. As one might hope from the title, he is particularly good at writing about war (and as it turns out, especially the fog of war), and while his presentation of his ideas is one-sided (and frequently self-contradictory) the ideas themselves have enough interest to engage a reader ordinarily disinclined to consider such views.

The result is a book that may not necessarily live up to the claims made for it as the greatest novel ever written, but which remains deservedly a classic, and which I found worth my while.

Looking Back, Looking Forward: The Box Office in Summer 2014, and The Movies of Summer 2015

A summer exceptionally short on megahits came to an end this Labor Day weekend.1

This season, only one movie crossed the quarter-billion dollar mark. By contrast, three or more have been the norm since about the turn of this century, with a few exceptional summers scoring as many as five (2007, 2009, 2010), and last summer seeing four.

The biggest hit was Guardians of the Galaxy, just shy of $300 million at the time of writing. By contrast, 2013 had Iron Man 3 with $409 million, and Despicable Me 2 with just a little less (some $391 million). The preceding year was even more impressive, with Avengers taking in $623 million, and The Dark Knight Rises pulling in another $448 million.

Overall, the earnings during the summer season were down almost 15 percent from the preceding year.

Of course, that is not really grounds for panic. The slow ticket sales do reflect a number of exceptional factors, like the fact that last summer's earnings had set a new record, and so skewed the perception of the average; and that the bumping of a number of "tentpoles" (like Jupiter Ascending) over to later release dates, and the scarcity of big, family-oriented animated films (no Pixar, etc.), meant fewer of the kinds of movies most likely to be big money-makers.

Still, the long-term picture looks worse when one thinks in terms of tickets sold rather than monetary grosses, as a quick glance at the data on Box Office Mojo shows.2 The fact that the top films in particular were such a letdown can be taken as a reminder of the limits of the blockbuster, "event movie" strategy--its high-risk nature, and the reality that one can only have so many successful releases of the type in any given period. The distribution of success and failure during the season also implies that not only critics, but audiences too, are weary of the kind of fare being served up. The films that did perform relatively well were the fresher and riskier releases (Maleficent, Edge of Tomorrow, Lucy, Guardians of the Galaxy), rather than the remakes and sequels of May and June that seemed such good bets (Spiderman 2, Transformers 4).

Will that fact make Hollywood a bit more adventurous in its future offerings? Perhaps, perhaps not (more likely not, given how few seem to be saying this), but 2015, at least, will come along too early to reflect any such change, and that year at least seems to be much in line with this one, with movies like Avengers 2, Mad Max 4, Jurassic Park 4, Terminator 5. A reboot of the Fantastic Four, remakes of Poltergeist and Point Break. Movie versions of Entourage, The Man From U.N.C.L.E. and Assassin's Creed. The Despicable Me spin-off Minions, and a sequel to Ted. More Insidious and Sinister, and more Melissa McCarthy in Spy. Another Marvel superhero franchise in Ant-Man . . .

Reviewing the list, I wondered whether there might nonetheless be material for another post like the one I wrote in 2013 about this summer's films ("And Now For Something Slightly Different (Maybe)"), but I simply didn't see it, so I'll content myself with making guesses about how this crop will do.

As it seems to me, even with a steep drop from the earnings of the previous film, Avengers alone should be adequate to ensure one mega-hit for the summer of 2015. And there's no reason to think Marvel's luck is running out just yet, so Fantastic Four and Ant-Man have a shot at being decent performers.

Despite the flopping of Seth MacFarlane's A Million Ways to Die in the West (yet another reminder that Westerns don't do well in summer, as if one were needed), I expect that there's enough goodwill left toward Ted to make Ted 2 a success--just as Minions will probably be a success.3 By contrast, the very hard push made for Melissa McCarthy seems to be running out of steam if Tammy's anything to go by, and Spy seems likely to suffer accordingly.

It has been a long time since either Jurassic Park or Terminator was on the big screen, and when they did put in their final appearances, those franchises were well along a path of diminishing returns (though I personally liked the lean Jurassic Park III better than the lumbering The Lost World). That will work against them, and so will the ex-California governor's awful track record at the box office these past couple of years.

The idea of a new Mad Max film at least seems more relevant, but this franchise has been off the screen longer (thirty years!), was closely identified with a star no longer attached to it (and whose star is far from what it once was), and was never a big money-maker in the States. (Mad Max 3 made $38 million in '85, which even after inflation still isn't blockbuster money.) I'm not sure there are grounds for expecting this to change now.

I don't have much idea as to whether Man From U.N.C.L.E. will be another Mission: Impossible-like blockbuster--or an I Spy. Equally, Assassin's Creed may be the breakthrough Hollywood's been waiting for with video game-based movies--or it may be another Prince of Persia-style disappointment (though in fairness, a movie can do a lot worse than Prince did).4 And the decision to remake Point Break seems just plain incomprehensible.

This seems to imply a good many gambles--but not necessarily the kind of gambles likely to pay off. Rather than serving up something a little less conventional, Hollywood looks to be putting a great deal of effort into squeezing exhausted properties (Terminator, etc.). And so I won't expect that 2015 will shift the talk from dismayed reactions to hopes for endless boom times.

Still, even if that summer ends up with another load of underperformers, Hollywood will not necessarily see it as reinforcing the disappointing signal from 2014, especially with foreign markets continuing to blunt its worry over the trend at home (and few in the industry seeming to worry much about intensifying competition for markets like China's and Japan's)--and, in fairness, given the absence of any clear alternatives that the studios might plausibly find attractive. So industry-watchers will once again talk about how something must be done, while business goes on as usual.

1. This was underscored by the following weekend being the weakest the industry has seen in thirteen years--and the last such low point was due to highly exceptional circumstances, with nothing comparable at work over September 4-6, 2014.
2. 2013 may have been a record year in terms of dollars earned, but ticket sales were down 15 percent from 2002, and it was actually the fourth-lowest year for ticket sales this century. This year seems likely to be worse than that. All the same, when the numbers are crunched at year's end, it remains more than likely that theatergoing in 2014 will be found to have been within the normal range for recent decades--4-5 tickets per capita for the twelvemonth.
3. Even before MacFarlane's film, the list of such flops was already quite long, with Jonah Hex and The Lone Ranger recent memories.
4. Despite its reputation as a flop, Prince still made $91 million at the U.S. box office, $336 million worldwide--more than any other video game adaptation to date.

Wednesday, August 20, 2014

Reading Literary Classics, Again

Time and again I have been struck by the difference between how people are supposed to read literary classics and the far less inspiring way they actually experience them--a combination of unthinking respect for their Authority, and equally unthinking boredom with the actual stuff of them.

As it happens, this is not a new situation at all.

Oscar Wilde summed up the situation almost one and a quarter centuries ago in similar terms, when he remarked that:
In Art, the public accept what has been, because they cannot alter it, not because they appreciate it. They swallow their classics whole, and never taste them. They endure them as the inevitable, and as they cannot mar them, they mouth about them.
Certainly where Shakespeare is concerned, it seemed to him "quite obvious that the public really see neither the beauties nor the defects of his plays."

Moreover, he took the view that this tendency "does a great deal of harm," the public "mak[ing] use of the classics of a country as a means of checking the progress of Art. They degrade the classics into authorities."

Some decades after, H.G. Wells observed in The Outline of History that it was "a pity that the ridiculous extravagances of scholastic admirers . . . speak of [Homer's Iliad and Odyssey] as supreme and unapproachable and so forth," for these attitudes had left the general reader's response to these worthwhile works an "awe-stricken neglect."

And so it went with every other writer enjoying a company of "scholastic admirers" prone to "ridiculous extravagances" of this sort, who ensured that the part of the public which did read fiction gave its attention to figures like E. Phillips Oppenheim instead. So it still goes, long after Oppenheim himself became obscure, while his heirs advertise their novels on television.

Incidentally, I expect to have more to say on Wells' Outline in upcoming posts.

Selling Theater Tickets in a Post-Netflix Era

The entertainment press is, like the rest of the press, not famous for encouraging a long-term perspective. Slight ups and downs in the market get trumpeted as indications of endless boom times to come, or the End of Everything, when in North America, at least, the numbers have remained fairly constant: 4-5 tickets sold per capita, per year, for the past several decades.

Still, that does not rule out the possibility of deeper and more worrying changes for the business, like the apparently decreasing tendency of the younger age cohorts to go to the theater. To go by the statistics routinely offered up on BoxOfficeGuru, for the young, moviegoing (rather than mere movie-watching) is less and less a casual activity during a weekend outing, and more and more a deliberate decision--which seems understandable given that they are as a rule more cash-strapped, less mobile and more accustomed to alternatives for accessing content than older cohorts, even as all demographic categories seem to be affected by these trends.1

The result is that while it has always been the case that most movies lose money, leaving the industry dependent on a comparatively small number of big moneymakers, a "regular" movie has even less chance than before. A film simply has to offer something on the big screen that will make buying a ticket rather than waiting and seeing it on the small screen worthwhile. The most obvious way is sheer visual impact--hence, the accent on blockbuster-style spectacle, supplemented with the punch of IMAX and 3-D.

The other is making a film's release feel like an event, something the viewer wants to experience not in two months, but right NOW along with "everyone" else. Of course, there is hype, but it only goes so far when everyone does it, and those charged with generating the hype need something to work with--the very reason for the pressure on would-be filmmakers to produce "high-concept" work. Basing a film on a property which already has a large and interested following is the most obvious strategy--hence the endless sequels, spin-offs and remakes, as well as the tendency for new work to come from adaptations of already popular properties rather than original scripts. (The phenomenon is not limited to tentpoles: Twilight was not an action-packed CGI-fest, but its release was certainly received as an event by fans of the book.2)

Still, there are limits to the "event" strategy. If every movie is an event, then no movie is an event, especially if all the movies out there look alike anyway--a gripe that has become fairly routine, though it may be increasingly plausible. And particular franchises can be worked to diminishing returns in this as in other ways. The Spiderman reboot failed to stir up real excitement back in 2012, and the sequel suffered accordingly this summer, setting the tone for a season full of movies that large parts of the audience felt they could afford to miss.

Nonetheless, for all the disappointments of the past few months, Hollywood, buoyed by all the surcharges and foreign receipts, looks a long way from the kind of crisis that would require it to seriously alter its way of doing business.

1. Their decreased propensity to drive has often been noted; and it should be remembered that this is not a reflection of improved public transport, the reverse likely being the case given post-2008 cuts to public services.
2. It seems, too, that movie running times reflect similar pressures. The 90-minute film seems largely a thing of the past, with just about everything now a two-and-a-half-hour epic intended to make the viewer feel they got something they wouldn't get just watching a TV show at home. And of course, when the movie's main offering is spectacle, the longer running time lets it serve up more of this.

Tuesday, August 19, 2014

The Decline of the R-Rated Action Movie

The dominance of the effects-powered science fiction and fantasy spectacle, and with it, the "summer tentpole," is typically traced back to the mid-1970s, and films like Steven Spielberg's Jaws (1975) and Close Encounters of the Third Kind (1977), George Lucas's Star Wars (1977) and Richard Donner's Superman (1978), since which time films of the type have typically topped the box office.

Yet, their dominance of the marketplace has never been so complete as it is now. Going back over the lists of releases from as recently as the 1980s, one struggles to find more than a handful of such films in any given year. Now in a typical summer, most weeks see a new release of the type (or even two), while a good many other such films are sprinkled around the rest of the calendar, giving the impression that it's summer all year long.

In the process they have crowded out other types of films, among them other styles of action film which had managed to flourish even in the post-Star Wars age, like the paramilitary action movie, and with it, the R-rated action movie in general (in contrast with the typically PG/PG-13-rated Spielberg-Lucas-superhero spectacles).1 During the '80s, and even much of the '90s, various Schwarzenegger and Stallone vehicles (like 1985's Rambo: First Blood Part II, 1993's Cliffhanger, and 1994's True Lies), the first three films of both the Lethal Weapon and Die Hard franchises (1987-1995), and assorted Die Hard imitators (1994's Speed, 1996's The Rock, 1997's Air Force One), as well as a slew of science fiction films made in similar style (like 1986's Aliens, 1990's Total Recall and 1991's Terminator 2: Judgment Day), were all among their year's top ten hits.2

The annual top twenty during these years also included such hits as 1982's First Blood, 1986's Cobra, 1987's Predator and Robocop, 1988's Rambo III, 1989's Tango & Cash, 1992's Under Siege and Patriot Games, 1993's Demolition Man, 1994's The Specialist, 1996's Broken Arrow and Eraser, 1997's Con Air and 1998's Lethal Weapon 4. Two other Schwarzenegger films widely seen as prime examples of the '80s action movie, 1984's Terminator and 1985's Commando, also came very close (making the twenty-first and twenty-fifth spots in their years, respectively).

The preponderance of these kinds of films would seem even more overwhelming if one also counted in related films like Eddie Murphy's '80s-era action-comedies (1982's 48 Hrs., and the first two Beverly Hills Cop movies in 1984 and 1987, all of which were top ten hits, with the original BHC the top earner of '84), or the period cop film The Untouchables (the sixth highest grosser of 1987).2

In fact, it could be argued that where the action genre was concerned, they were predominant, big-screen action almost synonymous with the words "People under 17 years may only be admitted if accompanied by a parent or guardian." Still, even before the end of the '90s the relationship shifted into reverse, as the shoot 'em ups characteristic of the era became creatively exhausted and less and less relevant, while improving CGI and the superhero boom boosted the science fiction and fantasy epics. In succeeding years the blazing machine guns gave way to wire work and computer-based superheroic feats, and the mayhem became at once larger in scale and edited in quicker-cutting fashion, leaving less opportunity or reason to linger on gory details, while the subject matter (so often drawn from the pages of DC and Marvel) made a lighter tone appear more appropriate. The result was action that was at once more spectacular and nearly antiseptic.3 Meanwhile, film in general, the action movie included, became prone to downplay that other major reason to exclude the under-17 crowd, sexuality (such that one writer recently remarked on the sexless lives of superheroes). Under such circumstances, the shift away from the "rougher stuff" was natural enough, and encouraged by the risk attendant on swelling budgets.

In hindsight, The Matrix (1999) seems particularly representative of the transition, in its blend of withering machine gun fire and superheroics (Neo even flies), and its launch of the last R-rated action franchise to meet with really massive commercial success. In the fifteen years since its debut, new R-rated action movies have tended simply to continue older franchises (like Bad Boys and Terminator in 2003, Rambo in 2008), or to be sold on nostalgia even when they were not continuations (like 2010's The Expendables), and have tended to be comparatively marginal within the marketplace.4 The highest-grossing such film last year, Olympus Has Fallen, fell short of the $100 million mark to wind up only the thirty-sixth highest-earning film of its year--a long way down from the prominence of comparable movies a quarter of a century ago. Unsurprisingly, even traditionally R-rated franchises have tended to sanitize their content in pursuit of a more lucrative PG-13 rating (as Alien, Die Hard and Terminator did in the 2000s, as has been the case with the Total Recall and Robocop remakes, and now even the third installment of The Expendables).

In short, like the sex and nudity once a regular feature of such movies, bloody violence, when not presented as part of Serious Drama (like the based-on-a-true-story terrorism-themed films in which Navy SEALs are now apt to make appearances), tends to be left to lower-budgeted fare, and one supposes, to premium cable drama like Game of Thrones, which along with the rest of the post-'90s explosion of means for accessing entertainment, saves fans a trip to the theater.

1. One might include under the heading of Spielberg-space-superheroes such things as the Star Trek franchise (first-class hits through the fourth film) and the succession of Indiana Jones imitations seen during the decade (the biggest success among which was 1984's Romancing the Stone).
2. Indeed, four of the top twenty movies of 1987 were police-centered action movies (The Untouchables, Lethal Weapon) or action-comedies (Beverly Hills Cop II, Stakeout)--even without including the science fiction film Robocop, which also made the year's top twenty.
3. Excepting the Blade franchise (1998, 2002, 2004), launched in a time when comic book movies were often lower-budgeted and less commercially ambitious, R-rated superhero movies have tended to perform modestly at best. The most successful of these, 2008's Wanted, is only #34 on BoxOfficeMojo's list of superhero films.
4. One area that has been something of an exception is the battle-heavy historical epic, the R-rated Gladiator (the #4 hit of 2000), Troy (#13 in 2004) and 300 (#10 in 2007) all doing good business, but this remains a small and fickle part of the market, especially in the United States. Another, more modest area of success is action-horror, the Resident Evil (2002, 2004, 2007, 2010, 2012) and Underworld (2003, 2006, 2009, 2012) films getting by on smaller budgets and lower grosses.

Tuesday, August 12, 2014

Learning History From Anime

Time and again I have been struck by the paucity of English-language writing on a major historical subject. Large patches of German history, for instance, like the Peasants' War, or the 1918 Revolution, or the history of the Federal Republic. And while World War II can seem the most thoroughly exhausted of historical subjects, very little seems to have been written about the Italian armed forces' performance and role in that conflict.

No less surprising is the scarcity of historiography about Japan. The obsessive interest in the country in the United States during the '80s does not seem to have extended to an interest in the country's history--a testament to the superficiality of that interest, and of the "expertise" on the country so highly touted at the time. Unsurprisingly, the American anime fan who finds themselves intrigued by a subject like the Warring States period, or the Bakumatsu era, who would like to know more about the career of Oda Nobunaga, or what the Shinsengumi were really like, has very few sources to which they can turn.

That being the case with Japan, with its prominent place in world affairs during the last century, one can only imagine the scarcity of substantive information about other, less-prominent, less-studied societies--and what weak stuff must pass for expertise on their cultures.

Such is the distance between hype and reality in this "information age."

Monday, August 11, 2014

The East Asian Box Office: The Entertainment Press Gets Its Say

I have often remarked here on the trend in the hugely important markets of East Asia (China's is the world's second-biggest, Japan's the third) toward the consumption of more domestically produced product, and fewer Hollywood movies.

My thoughts were a reaction to the films I saw topping the box office in China, Japan and South Korea in the past decade, but a more systematic examination only supports that conclusion, as an article in Film Business Asia showed last May. Indeed, the article notes that in 2013 "one hundred films from East Asia made more than US$10 million at their local box office," securing "a combined box office of US$3.41 billion."

Mark Schilling in his article "Why Hollywood No Longer Dominates Japan's Box Office" notes that the market share of foreign films in Japan fell from 67 percent a decade earlier to roughly half that (34 percent) by 2013--which also marked the fifth straight year in which they failed to exceed a fifty-percent share of the market.

These are impressive chunks of not just the local market to which Hollywood has paid so much attention in recent years, but the whole world market.

Of course, even if Chinese, Japanese and South Korean moviegoers are less apt to choose Hollywood, some American movies still do massive business in this region. Disney's Frozen was a megahit, pulling in close to a quarter of a billion dollars in Japan alone, and another $125 million in China and South Korea. The preceding year, Monsters University was a colossal success, while the comedy Ted was also a hit.

Still, it is increasingly being recognized that audiences are more selective regarding the American films they do see. Schilling's thought, after noting the big business those American films did, is that Disney-style family films can still do well there, and that even if Ted is not exactly that, "Japanese audiences love cute, even if they get their cute from a vulgar, substance-abusing animated bear."

By contrast, they're not much for "dark and depressing"--the American entertainment industry's adoration of "dark and gritty" perhaps costing it in this part of the world.

On the Success of Game of Thrones

By the end of its fourth season, Game of Thrones had become the most-watched show in HBO's history.

More watched, even, than the storied Sopranos.

That the show would meet with much success at all once seemed a long shot.

And yet, in hindsight, it seems perfectly natural.

Game of Thrones succeeded as a fast, flashy, sex-and-violence-laden soap opera. And where soap opera is concerned, it is tough to beat a feudal setting.

The simple fact of the matter is that in today's world power is vested less in individuals than in large organizations, while office and office-holder are separate, and likely the products of a culture of white-collar organization men taught to always seem agreeable and "speak to the well-blunted point." Different kinds of power--public and private, economic, political and military--are vested in altogether different organizations. It all makes power diffuse, and vague, and impersonal, and the players rather self-important cogs in just one of many wheels, for all the flattery lavished on them by worshipful journalists.

By contrast, in the Game, personal power is often highly concentrated and multidimensional, so that individual players like Tywin Lannister (Lord of Casterly Rock and Warden of the West, King's Hand, creditor to the throne) weigh very heavily in the scales--so much so that they don't have to worry about hiding bodies when violence becomes their preferred recourse. At the same time the drama of politics and war and wealth is tightly bound up with family drama, with people's love lives, in a way they never could be in the modern world (an incestuous relationship, a family vendetta, sufficient spark to set alight the ever-rickety feudal structure). And it is all attended by pre-modern pageantry next to which even the most lavish corporate function must pale (though admittedly, the TV production never quite does Martin's conception of these justice).

To put it another way, a couple of lawyers trying to do each other out of a partnership in their law firm, suburban adulteries, and even the tabloid scandals of the glamorous seem a very small thing next to the drama of the Lannisters and Starks.

Does this mean American television is about to unleash a torrent of feudal-set soap operas on us?

Perhaps. But it seems more likely to remain confined to channels like the CW (already invested in this area with Reign), and historical drama-minded cable networks like the History Channel (offering up Vikings) than the American Big Four, perhaps not totally averse to flirting with the form but much more likely to stick with their lawyers and suburban adulteries and such over the long run.

The Decline of the R-Rated Movie

Through the '80s and '90s, a typical year saw three R-rated films in the year's top ten earners at the American box office, and eight in the top twenty.1

During the 2000s that number dropped sharply, so that in any given year the average was more like one R-rated film in the top ten, and three in the top twenty. Since 2010, R-rated movies have been even rarer than that in the upper ranks.

What happened?

Much of it would seem to be a reflection of the transformation of the action movie during these decades, from R-rated shoot 'em ups to PG-13 rated CGI spectacles about superheroes. Along with the even more complete disappearance of the sex-themed blockbuster, this translated to a major change in the market. Today's R-rated successes, by and large, are comedies like last year's The Heat (#15), We're The Millers (#16) and Identity Thief (#20), horror films like The Conjuring (#19), and Serious Adult Dramas like American Hustle (#17)--lower-budgeted and lower-grossing fare for the most part.

1. The average was actually 3.1 of the top 10 and 7.8 of the top 20 in the years 1980-1989; and 2.9 of the top 10 and 7.6 of the top 20 in the years 1990-1999. By contrast, the number was 1 of the top 10 and 2.9 of the top 20 in 2000-2009; and 0.5 of the top 10, and 3 of the top 20, in 2010-2013. All calculations based on data from Box Office Mojo.
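To make the arithmetic behind these figures explicit, here is a minimal Python sketch of the kind of tabulation involved, counting R-rated films per year and averaging over a decade. The per-year counts in it are invented placeholders (contrived to land on the 3.1 figure quoted above), not the actual Box Office Mojo data, which would have to be entered by hand.

# Minimal sketch of the decade-average calculation described in this note.
# The counts below are placeholders, not actual Box Office Mojo figures;
# each value is the number of R-rated films among that year's top ten.

def decade_average(counts_by_year, start, end):
    # Mean count over the years from start to end, inclusive.
    years = [y for y in counts_by_year if start <= y <= end]
    return sum(counts_by_year[y] for y in years) / len(years)

top10_r_counts = {1980: 3, 1981: 4, 1982: 2, 1983: 3, 1984: 3,
                  1985: 4, 1986: 3, 1987: 3, 1988: 3, 1989: 3}

print(round(decade_average(top10_r_counts, 1980, 1989), 1))  # prints 3.1 for this sample

The same routine, fed the top-twenty counts and the later decades, would yield the other averages cited, given the corresponding data.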

Wednesday, August 6, 2014

On the Differences Between Japanese and Western RPGs

Gamers have often held the view that Japanese and Western--e.g. American--role-playing games tend to differ significantly in style. (Indeed, the Wikipedia article on role-playing video games devotes a significant amount of space to the subject.)

Setting aside the aesthetic differences (character design and the like), the conventional view seems to hold that Japanese games emphasize story and character and accessible gameplay, while American games stress more complex gaming. Accordingly, Japanese games will give us well-defined characters pursuing linear quests, while American games will give us minutely customizable characters free to roam sandbox worlds at will. The Japanese game is more likely to offer turn-based play than its American counterpart. And so on.

One can, of course, push the generalization too far. These are, after all, tendencies. Exceptions have always existed, and the line has long since been blurred by the cross-fertilization and outright imitation of past decades. Still, the generalization has some value, obvious in a comparison of, for instance, the Dragon Quest (aka Dragon Warrior) and Elder Scrolls franchises, and the responses they have elicited in North America.

Interestingly, few seem to have tried to offer an explanation for the difference. But it has occurred to me that one factor may be the personal backgrounds of the pioneers of Japanese and American gaming. Where the latter just about all seem to have been computer scientists, many of the Japanese pioneers, like Hideo Kojima and Shigeru Miyamoto, came to the industry from fields outside computing, and often from the arts. (Kojima was interested in writing and film, while Miyamoto had a degree in industrial design.)

It does not seem improbable that this subjected the cultures of these industries to differing influences, with the artists looking to draw players into a compelling story, while the engineers esteem technical intricacy and "features."

Considering the possibility, I am reminded of the old story about the first commercially sold video game, 1971's Computer Space, the modest sales of which were attributed to its complexity--co-creator Nolan Bushnell saying in hindsight that whereas all his engineer friends "loved it," the game was "a little too complicated for the guy with the beer in the bar." Obviously American video gaming has since gone on to popular success, but the engineers still seem to prevail over the artists to a degree not seen across the Pacific--with implications not just for the RPG, but the visual novel genre (much more prominent in Japan than in the West), and the fact that it was Japan's Nintendo, and not an American company, which offered up the user-friendly Wii.

What do you think?

Saturday, August 2, 2014

What is a Novel?

When I started taking an interest in literature, I naturally found myself wondering just what made a novel, a novel.

Certainly there is a consensus that a novel is a work of prose fiction. It seems fair to argue that the audience is generally expected to read novels privately--rather than hear them publicly recited or performed like epic poems or plays. It also seems plausible to contend that the novel has tended to stress the individual, and interiority, to a greater degree than those other, older forms--evident in such things as the time novels spend inside their protagonists' heads.

Nonetheless, all of this generally describes the fiction we read today, so that one has to look to something else to distinguish the novel from, for instance, a short story. One obvious criterion is length, the short story filling a few pages, the novel a book (with the minimum estimates falling in the 40-50,000 word range).1

However, there are qualitative differences as well as quantitative ones. One expects greater breadth and depth, and more detail, in a novel than in a short story--producing an "epic depiction of life," a world on the page.2

Certainly those novels esteemed as great literature tend to offer this. I find myself thinking, for instance, of writers like Hugo or Dostoyevsky or Tolstoy. In a much more idiosyncratic way, even a postmodernist like Vonnegut or Pynchon goes for the same thing.

In fairness, narrowly "genre" works are less likely to do this. A romance or a thriller, for instance, may offer this--but many of them, judged by this standard, give the impression of a novella, or even a short story, extended to the length of a book. Ironically, at a time when books are running rather longer than they used to, I wonder at times if this is not the direction in which a good deal of popular fiction is tending.

1. The Science Fiction Writers of America, for instance, uses the 40,000 word figure in determining what is eligible for the "novella" category of the Nebula Awards.
2. Where does the novella fit in, one might wonder? Lengthwise it's an intermediate form, which offers the detailing of a novel, but just the scope of a short story.

Thursday, July 24, 2014

Is The Kindle Changing What We Read?

Much has been said of devices like the Kindle changing how we get our books. However, apart from the publicity given to independently published books (about which there has been much hype), little has been said about how they might change the kinds of books we read.

In my experience reading off of a Kindle is far better than reading off a computer screen - but still less comfortable than reading off of paper. The result is that I find myself prone to read on a Kindle for shorter periods than when reading printed books (anything more than half an hour and I start noticing the difference), and to avoid it entirely on days when I have used my computer heavily. Additionally, the small number of words the screen accommodates compared with the printed page means that anyone scanning a long stretch of text has to go through many more screens than they would pages if they were using a printed edition - which, along with the quirkiness of electronic touch screens, makes moving about even an extensively hyperlinked Kindle file rather more awkward than leafing through a printed book. Not unrelated, but arguably more important, is the evidence that what psychologists and neuroscientists have long argued about reading done off screens as compared with paper - that one retains less and digests it less fully - seems to carry over to e-book readers like the Kindle.

All of this suggests that e-books are best suited to easy, comparatively undemanding reading: books which can be finished in small bites, which do not require close reading and the activities associated with it - backtracking, rereading and so forth. Of course, there is nothing wrong with a high level of readability as such - but many books which make greater demands on the reader do so for good reason, like the complexity of their subject matter.

It may be that for those sorts of reads, printed books will remain preferable, at least without advances in the technology that further narrow the gap between the two experiences. However, there is an alternative concern, especially given the ofttimes mindless boosterism for new technologies like e-book readers, and the admitted convenience of the devices - that to the extent that the selection and editing of books for publication becomes dependent on their suitability as e-books, denser, more complex or otherwise more demanding books will have an even harder time making it through the gauntlet that is the publishing process.

On Dubious Writerly Advice: "Watch the Market"

Anyone who's spent much time trying to get published has probably been advised at some point to study the market. However, just like so much of the other advice given authors, even when it has some merit (as a great deal of such advice does not), it's not of much practical use to aspiring, unpublished writers trying to break in.

I've already written quite a bit about the scarcity of useful information on book sales available to the general public (rather than, for instance, industry insiders). But processing even the limited data available to everyone can be tricky. The claim that one should check out what's selling should be qualified.

Consider the typical bestseller list. The authors there, by and large, are longtime Big Names who typically represent not the trends of today, but those of twenty, thirty or even forty years ago.

Take, for instance, the case of Tom Clancy, who had three of the biggest selling books of the early 2000s. One might have concluded from his sales at the time that military techno-thrillers were still kings of their domain – but this would have been wrong. Clancy's sales reflected his earlier success, and his acquisition along the way of enough loyal readers to make him comparatively immune to the ups and downs of the market which make and break smaller careers.

Such observers would have done far better to pay more attention to relative newcomer Dan Brown, who was far more representative of what successful newcomers were doing. In these years he was shifting from high-tech, science-heavy thrillers to religious-Masonic-historical mysteries – think of Angels & Demons as a transitional work in this respect – and ended up with the biggest-selling thriller of the decade, The Da Vinci Code.

One might also do well to remember that a writer's adaptation to a trend, after they've recognized it, is no simple thing. Certainly there are writers of enormous range, like Michael Moorcock, but few of us are so versatile – or likely to ever get the chance to try and become that. (Talented as he undeniably is, he also learned his craft when the business was very different, and afforded him an early start of a kind much less plausible today.) Even hugely successful authors often display a knack for just one kind of story, or two kinds at most, and then go on to crank out variations on it forever, long after it stops being worth their while as artists and entertainers, just to keep the money coming in.

And what is a challenge for professionals is that much more difficult for aspirants. Working without professional experience (or even long writing experience of any kind in many cases), without access to professional advice (no agent, no editor, no circle of friends and colleagues who are professionals in the Business to whom you can show pieces of your manuscript for feedback, such as we see in so many acknowledgments pages), while holding down a day job (perhaps one throwing endless obstacles in the way of any outside efforts), and without much in the way of encouragement (let's face it: non-writers, non-artists, aren't likely to be understanding of your ambitions, and anyone who's unintentionally started a collection of form rejection letters can tell you how inhibiting they can be), the road to finishing a publishable manuscript is likely to be a long one – years, perhaps. If you go the traditional publishing route, it can be years after that before you find an agent willing to take you on, that agent places the manuscript, and the manuscript actually hits the market.

A decade for this whole process isn't at all unusual (and I am, of course, talking about the very, very rare success stories when I write this). In the meantime, established pros able to work more quickly, and enjoying better access to people in the business who can help them in various ways (not least in getting their manuscript into the machinery which transforms it into a published book with tolerable speed), are moving on the same trend – while a good many other aspirants who heard the same piece of advice are trying to do the same thing.

In short, by the time you succeed in finishing a publishable manuscript of the desired type, even if you do succeed at this, the trend's time will likely have passed, the market changed – so that the time frame in which you are working makes the advice to "follow the trends" meaningless. Now that isn't to say that a writer shouldn't stretch themselves, of course. That's key to staying interesting – or interested – for any length of time. And it isn't to say that one should ignore opportunities.

But there are real limits to how far you should go. Certainly an author shouldn't decide to write something they have no interest or feel for; to decide, for instance, that as young adult paranormal romance seems to be doing well, that's what they should be writing (President Obama's Young Adult Novel Plan notwithstanding). The only satisfaction the vast majority of would-be writers are likely to get out of a project is the actual pleasure of the writing; and if they don't like what they're doing, there seems little chance of anyone else liking it much either, let alone liking it enough to make the chore worthwhile. The truth is that it seems much more the case that authors succeed writing things they like which happen to be marketable at a given moment, than that they succeed by forcing themselves to cater to someone else's tastes. If you're not one of the lucky few on the exact wavelength to partake in the Next Big Thing, you might try to find a way to reconcile those imperatives – but you won't get anywhere tossing your own likes and dislikes overboard in the desperate hope of Giving The People What They Want.

