Wednesday, August 19, 2020

2015's Fantastic Four: Another Look

I recall being, if not pleasantly surprised, then at least intrigued by Josh Trank's Fantastic Four when I first saw it. There were aspects I was dubious about--the young adult turn in the characterizations, for one, and the inconsistency of the tone--and yet there were others that did interest me.

On a recent (if casual) rewatch of the film I was more impressed by those aspects, not least the film's handling of the scientific research at the heart of the story. The film does make concessions to silly, outworn Edisonade convention in having a high school student working with scavenged junk happen on a key part of the solution to the problem of interdimensional travel. But afterward the film displays, to a far, far greater degree than most of what comes out of Hollywood, a striking awareness, and acknowledgment, of the realities of Big Science. Its agendas and politics, which have the Suits, not the scientists, calling the shots. And the reality that science is not nerd-magic, but a lot of hard work--collaborative, protracted, at times grueling hard work that builds on the bits of the puzzle others have solved, most of it by people unlikely to get any of the glory.

Such things, small as they seem, are rare enough and significant enough that they appear to me to rate an honorable mention.

A Techno-Thriller Revival?

In writing my history of the military techno-thriller I was concerned principally with the main line of the genre--its early flickerings dating back to the seventeenth century; its coalescence into the "invasion story" of the late nineteenth and early twentieth centuries; after the shock of World War I and its aftermath convinced many that war could only mean universal calamity, its giving way to other genres it helped inspire--spy fiction, military science fiction, post-apocalyptic war fiction, "war scare" fiction; and then by way of those remnants, the revival of the old-style invasion story in the 1970s and 1980s (begun by British writers like Frederick Forsyth, Craig Thomas, John Hackett, but most identified with Tom Clancy); before the end of the Cold War turned that boom to bust.

As a result my book handled the post-Cold War period as an epilogue to the main story. Today, however, the world appears a far more aggressive place than it did a decade ago, let alone two decades ago.

Even the '90s, of course, were not without their share of crises--the Norwegian rocket incident, the Third Taiwan Strait Crisis, the NATO-Russia confrontation at Pristina airport--yet today it seems the blood is actually flowing at a lengthening list of flashpoints where armies that once upon a time would not have fought each other are doing just that. The war in Ukraine is not over. In Syria, in Libya, NATO member Turkey and Russia are engaging in regular shooting incidents, while that same NATO government engages in hostilities with Greece and confronts the French(!) at sea. Scarcely a year after India's exchanging air strikes with Pakistan for the first time in nearly a half century (this, they avoided even in the 1999 war), Chinese and Indian troops have fought in the Himalayas (with clubs and rocks?), while China for its part remains at odds with its neighbors over its claims in the East and South China Sea.

Meanwhile everyone is chasing military capabilities and overseas bases they did not think they needed a short time ago. Not long ago most sat out the fifth generation of fighters, making do with older gear--but now everyone wants in on the sixth, while hypersonic cruise missiles may be the object of a new arms race (one perhaps responsible for a mysterious nuclear explosion in northern Russia). Britain is deploying its first full-deck carriers to be laid down since the '40s, while German opinion-makers talk about having a carrier too (and maybe much else). There are even hints of a revival of conscription, thus far limited to smaller nations like Sweden and Peru--but Sweden's action is strongly linked with the elevated fear of war in northeastern Europe, while the French President, who has continued what is for that country an extraordinarily large-scale and long-duration mission in Mali (compare this to its post-Algeria record of African interventions), gives the impression of inching toward the same in his country . . .

It feels as if we are living in a Tom Clancy universe these days. Or, rather, a Dale Brown universe. (Yes, Turkish super-drones attacking Russian mercenaries in Libya feels more Dale Brown.)

Does this portend a revival of the genre? I, for one, doubt it. If scenarios of high-tech, conventional, military conflict look more plausible now than before, it should be remembered that the readiness of a broad audience to consider discussion of the "next big war" as relevant is one thing, that audience's readiness to accept the depiction of a war as entertainment another. Where the appeal of such entertainment is concerned, an audience seems most susceptible in a moment when it has had a comparative respite from war--some years without what it thinks of as a really major conflict at least, with decades better still. (It is worth remembering that the techno-thriller emerged as a popular genre not in the 1940s or 1950s or 1960s, but in the late 1970s, and only really took off in the 1980s, many, many years after the end of Vietnam.) Of course, this is something audiences in America and the West generally cannot be said to have had, and very clearly would like to have had. Indeed, even the political right gives an impression of weariness of foreign war, especially when one gets away from the professional hawks to whom the media gives so much press. (One might add that the withdrawal from Afghanistan and Iraq has been a very different thing from the withdrawal from Vietnam, with less division in the U.S., less anti-militarism, and at the same time no desire for some kind of "redemption" on the part of the right of the sort we identify with "post-Vietnam" evident or even plausible, diminishing the interest in an imaginary "refighting" of the conflict.)

The public seems more open to the genre, too, when at least part of it is being led to believe that a big war is somehow thinkable, somehow winnable. This attitude never wholly recovered from the world wars, with the result that the two defining big war novels of the techno-thriller revival--Hackett's book, and Clancy's Red Storm Rising--had to come up with a dodge to let the West get the upper hand without the conflict going all-out nuclear. I see no evidence of such a mood these days. Rather than the '80s I think a better analogy would be the '30s in this, as in so many other ways.

So people may be thinking about the danger of war more than before--but not in such a way that I see them making a bestseller out of some latterday Tom Clancy. In fact, I wonder if the H.G. Wells tradition would not find a stronger echo--save for the fact that I imagine the mandarins of Park Avenue to be exceedingly averse to publishing a work of the type.

Tuesday, August 18, 2020

Remembering The Pretender

Watching The Pretender it seemed that pretty much nothing about the show made sense--not that I expected much from a network TV show at that time--while I suspected an essential vacuity behind the opaqueness and the teasing. (Remember, these were the years of the original run of The X-Files.)

What was the Center? A think tank we were told--apparently a think tank that centered on the intellection of one captive savant. (The writers seemed to make up the biogenetic bits of intrigue--clones and the rest--just as they went along, with this suddenly becoming more important as the season finale approached, and then once renewal happened, receding into the background . . .)

And of course, that particular savant never made much sense, least of all in his abilities. "Genius" was an extreme understatement for what he was, even by the risible "dumb person's idea of a genius" standard to which TV adheres. He seemed instead a realization of the Faustian (I mean Goethe here, not Marlowe) will to do everything, know everything, go everywhere--as Gary Stu/Mary Sue as a figure gets. Between one episode and the next he was able to master a skill, a trade, a profession to which people devote entire lifetimes (pilot today, symphony conductor tomorrow, surgeon the next week), while having the time to contrive the fake paper trail and other deceptions necessary to pass himself off as really being what he appeared to be. And all this with his brain not functioning as a palimpsest, one set of skills displacing another, but with one set of abilities added to the next without loss of what he had previously acquired--so that he walked about with the equivalent of the capacities of hundreds of world-class experts of the most diverse types in his head all at once.

Of course, that complicated the writing of the show somewhat. The protagonist was in new surroundings, performing different tasks, surrounded by different people in pretty much every episode, pretending to be a different man with a different history--while he was somebody who didn't have a conventional past to begin with, making him harder to establish even as a Fugitive-type character. (It was, of course, one reason why the episodes spent so much time showing what the folks back at the Center--pretty much his only consistent acquaintances--were doing.)

Still, the writers knew enough not to make him a tedious, egomaniacal mass of quirks like so many of their "geniuses"--while also being astute enough to make his eccentricities elicit amusement and sympathy rather than grate (delight in some pleasure of which he had been deprived by the Center growing up--like ice cream). They did this the more easily because he was essentially written as a sympathetic, empathetic human being.

That little lesson, I think, TV writers have since forgotten.

Reflections on Independence Day: Resurgence

According to the data set over at Box Office Mojo, in 1996 Independence Day opened with a then-record $96 million July 4 extended weekend (five-day) gross. That made it #1 at the box office, of course, a position it held for the next two weekends; it then held on to the #2 spot for the two weekends after that, and stayed in the top five for two more, on the way to banking $300 million--a sum which made it the top-earning movie of a very vigorous summer movie season, and a very good year for Hollywood overall, and which, not incidentally, launched Will Smith's long career as a sci-fi action star.

In 2016 the sequel Independence Day: Resurgence pulled in a mere $41 million, putting it at #2, and was down to the #5 spot a week later, after which it continued a fairly quick plummet through the rankings. At the end of its run it had about $103 million banked, which, in inflation-adjusted terms, is a good deal less than the original made in just its first five days, and under a quarter of that movie's overall gross. The result was that it barely made the #27 spot for the year, just ahead of the far lower-budgeted horror movie The Conjuring 2.

In short, it could not but be considered a disappointment in light of the original's massive success, and for that matter its own massive $165 million budget--without even gaining a measure of cult-y success by way of consolation.

That failure, of course, does not seem to require much explanation. The second Independence Day movie was a sequel, arrived long after the original had passed from the very short pop cultural memory of most--the more so because of the kind of movie it was. In the summer of 1996, when we had about a half dozen really big would-be blockbusters, most of them quite different (the others were Dragonheart, The Rock, Eraser, the first Mission: Impossible and Twister), Independence Day was something we hadn't seen before--the subject matter at least as old as H.G. Wells, but the handling of the spectacle still something that stood out with its grandiosely CGI-powered depiction of a massive alien attack on Earth's major cities.

Now it's entirely standard.

Due to the genre's scaling up, and going science fiction-al in the process, every action movie is now likewise a disaster movie--and a monster movie, and an alien invasion movie in some way or other--any "respectable" CGI-packed tentpole apt to have otherworldly creatures wrecking the core of at least one major city before the end credits (as was the case in the same year's X-Men: Apocalypse, Suicide Squad, Batman v Superman and Ghostbusters, to say nothing of the city-busting we saw in Rogue One). As with so many prior action movie phenomena--James Bond, Star Wars--what was once special about the movie became routine, except that Independence Day never really acquired the kind of brand name that a film franchise could go on trading on after it has ceased to be fresh and new (the dashing of which hope is all too evident when one looks at the closing scenes, and their desperate laying of groundwork for a third movie that is almost certain never to happen).

It did not help that ID: Resurgence was, in spite of some dazzling effects, a fairly mediocre example of the type, in a far more crowded season--which promptly saw it shoved out of the way by other, more vigorous movies, through the rest of the summer and after it, and then in the next year.

In fact you probably already forgot all about it until I mentioned it just now, didn't you? (I know I did until recently happening on it when flipping channels.)

Why Does Everyone on American TV Know How to Use Chopsticks?

Watching TV these days one gets the impression that all Americans are adroit users of chopsticks. Suspecting this to be, to put it mildly, a gross exaggeration, I went looking for data on the subject.

An item at Statista claims that, based on self-reporting (people answered the question "How good are you at using chopsticks?"), 4 percent of Americans are "expert," 11 percent "very good," and another 19 percent "fair."

By contrast 43 percent are "not very good" or "terrible," and 24 percent say they have never even tried to use them.

Let us accept these numbers as a starting point in the absence of better. Considering them my guess would be that, because this is a matter of self-reporting, people overstate their proficiency--and even with that just one in three claims to be fair or better at chopstick use, meaning at least two-thirds are less than comfortable with them, with a quarter never having handled them at all.

Why does TV give such a different impression? Barring some deliberate effort to master these utensils as an end in itself, the vast majority of Americans--especially in the absence of strong exposure to East and Southeast Asian culture through personal heritage or travel, or very close connections with people who have had that--are probably only likely to learn chopstick use in the course of frequently going to restaurants serving East Asian cuisine, and there specifically insisting on learning to use the associated utensils. That is far more likely for affluent (not the 1 percent necessarily, but at least the top 10 percent) residents of big coastal cities than for working class folks, and especially working class people from rural, small-town, provincial areas. This seems all the more plausible given the extraordinary snobbery with which Americans have surrounded some aspects of such dining--for example, sushi consumption (in stark contrast with the very different Japanese tradition).

In short, it seems to me a matter of exactly that subject which remains anathema not just to the right but much of the "left," socioeconomic disparity and social class--and in this case, the social class background of the people who write for TV, and the kind of characters they happen to put on TV (overwhelmingly, affluent coastal types).

Do I think that the hacks in Hollywood and on Madison Avenue are deliberately signaling social class when they casually pack the screen with adroit chopstick use, however? The way that, for example, they so often present characters whom they wish to impress us as not just our social and economic superiors but also our cultural and intellectual superiors by having them play the piano with concert performer proficiency, or fence like masters (a visual shorthand that, condescending from the start, is now painfully cliched)?

Of that I am more doubtful. I take it, instead, as simply another reflection of the extreme cluelessness of the hyper-privileged of medialand about the existence of anything outside their painfully cramped little world.

Why Is There So Much Talk About NEETs in Japan?

I remember being introduced to the concept of NEETs (Not in Education, Employment or Training) when watching Welcome to the NHK.

Later I was surprised to find that the concept was not Japanese, but actually originated in Margaret Thatcher's Britain.

I was surprised again when I found out that Japan is remarkable not for a particularly high percentage of NEETs, but for perhaps as low a percentage as any other country's. The World Bank actually has the International Labor Organization's data for the "Share of youth not in education, employment or training, total"--in short, the share of the country's youth who are NEETs--up online. Japan's last recorded figure was 2.9 percent--compared to the 11.5 percent average for the "high income" countries, and the 28.3 percent figure for "lower middle income" countries. (For the obvious reason that poorer countries have fewer jobs to go round and more limited education systems, high proportions of NEETs tend to go with poverty, not wealth, and things are generally worse that way there than in the richer nations.) The U.S. figure is 13.1 percent, and in Italy, in exceptionally bad shape by First World standards, it is 18 percent, six times as high as the Japanese figure--but probably without six times as much talk about the issue. (In Britain, the home of the concept, the figure is 10.5 percent, three and a half times as high as Japan's, in spite of which, so far as I can tell, I cannot remember a single reference to NEETs in British news media or British pop culture, for whatever that may be worth.)

One might wonder if Japanese official statistics do not underreport the problem. Still, I am unaware of any evidence that Japan is going so much further on this count than other states, hardly less inclined to play such games, and so for the moment I will take the figure at face value--raising the question of just why all the fuss. One possibility that comes to mind is that right-wingers looking for scapegoats for the failure of their economic policies in other countries, while certainly quick to slander and libel young people as playing video games in their mom's basement all day, have a greater abundance of other targets for their venom. They can also slander and libel minorities, immigrants and ethnic Others generally as the source of their nation's woes. However, Japan's comparative ethnic homogeneity, insular geographic position, and historically tight immigration policies mean less in the way of such options for those pursuing such a political line. (The Ainu and Ryukyuan peoples together comprise about 0.4 percent of the Japanese population. And fewer than 2 percent of residents are foreign born. By contrast the population of Britain in 2011 contained some 12 percent or so in the minority category, while about 13 percent of the country was foreign-born.)

It might be added that the sense of Otherness in Japan's case in relation to the great majority of its minorities and foreign-born residents is, compared with many other nations, slighter for their overwhelmingly being from neighboring Asian countries, and often, ethnic Japanese from the Japanese communities in South American countries like Brazil. It seems worth noting, too, that Japanese right-wingers, like their counterparts elsewhere, are inclined to claim that, whatever one says about other people's imperialism, theirs was uplifting, and point to countries they colonized like Korea and Taiwan--the very countries from which so many of their immigrants come--as success stories testifying to the beneficence of their rule. And this, of course, makes claptrap about those "lazy, backward" Others and their "s---hole countries" less tenable. The result is still plenty of xenophobia in all its ugliness--but not much of a basis for its utilization in "foreigners are ruining our economy" scapegoating.

One result is that when it comes to the slowing of the nation's economic progress (these days, it should be admitted, not really so much worse than anyone else's; the twenty-first century has everybody hurting this way), the young catch that much more of the flak.


Thursday, July 23, 2020

Anyone Still Watching Basic Cable?

Remember when basic cable's scripted TV offerings seemed a fairly major part of pop culture? The Disney Channel had shows like That's So Raven and Hannah Montana (despite not watching the channel at all then, I knew there was something called a "Hannah Montana," even if I wasn't sure what exactly it was, an actual person or a character or maybe just a figure of speech), and Nickelodeon had iCarly. AMC's Mad Men was more a hit with Midcult-loving critics and their upmarket followers than the general audience, but Breaking Bad and The Walking Dead were both confirmed popular sensations--while at the more grown-up but still lighter end of the spectrum, USA had Monk, and TNT had Leverage.

As Brad Adgate acknowledged in Forbes recently, hits like these are pretty much nonexistent these days. Where iCarly creator Dan Schneider watched his show pull in 10 million viewers on its best nights back then, his more recent Henry Danger, which Nickelodeon touted in a recent commercial as the #1 Kid's Comedy on TV, was bringing in under 1 million--bespeaking an astonishing collapse in that market. And it seems pretty much the same story elsewhere in this part of TVland.

Of course, this raises the question--just how will basic cable respond? The cancellation of Disney XD's Kirby Buckets in hindsight seems to have been indicative of a shift away from original, scripted, live-action fare in favor of other material--animation, gaming and the rest. It may be that this was an easier course for this particular channel, with its slighter market and margin for failure, and connections with Marvel and so forth, but it seems possible that other channels will be walking the same path, looking for other ways in which to cater to a shrunken base of viewers. One way might be their similarly turning to nonscripted fare. Another might be their doing what the networks have generally done, passing on more "adventurous" material in favor of more conventional fare--like the procedurals they keep churning out, very profitably.

Still Waiting on '90s Nostalgia

I have, from time to time, remarked on the hints of a belated interest in '90s nostalgia (in the feature film version of Baywatch, or the sitcom Schooled, for example), but by and large these have proven just flickers (the Baywatch film was no blockbuster, and Schooled was cancelled after a mere two seasons), while a glance at the line-up of films initially scheduled for release in 2020 made clear that where nostalgia was concerned the '80s, after all these years, remained king.

And so I found myself thinking about the faintness of '90s nostalgia to date, developing some thoughts I had earlier a little further. It seems to me that one can simultaneously say several different things about the '90s--each true about different aspects of it, and altogether meaning that the era lends itself less well to nostalgia.

#1. The '90s Never Happened.
To the extent that people imagined the '90s would, in politics, values and the rest represent a break with the '80s--a reaction against Reaganomics and neoliberalism and neoconservatism and Cold War, a shift of our thinking in a more socially and ecologically conscientious, internationalist, de-militarized direction, or even a rebirth of countercultural radicalism ("Once we get outta the 80's, the 90's are going to make the 60's look like the 50's" '60s radical Huey Walker promised in Flashback)--the '90s never happened, amounting instead to a brief and quickly crushed hope that I suspect few even remember. What happened instead was that the '80s simply went on and on in this respect, the right pressing rightward rather than retreating amid the "end of history." (A glance at the Clinton administration's record of social and economic policymaking--government budget-cutting, welfare "reform," deregulation for finance, low taxes on the rich, Alan Greenspan at the Federal Reserve, free trade, etc., etc.--certainly, makes it all too clear that it was a "continuation of Reaganomics by other parties.") And we cannot be nostalgic for what never happened, can we? (Indeed, it seems to me to say a lot that those things of the '80s for which we see nostalgia are, by and large, the things of childhood--while those who express a dissenting impulse, for lack of anything else, still look back to before the '80s, as we see in films like Roman J. Israel, Esq., or Joker.)

#2. The '90s Did Happen, But Was a Transitional Period Rather Than a Distinct Phenomenon of its Own.
I have certainly had this impression thinking about film during this period--the action film in particular. We all know the distinctively "'80s action film" Hollywood turned out in that decade--Schwarzenegger and Chuck Norris and Rambo and Die Hard and the rest. And we all know the sort of action film that has prevailed in the twenty-first century--dominated by superheroes and their heroics, with Spider-Man, Batman and the Avengers delivering one billion-dollar hit after another. But what about the '90s? Close examination, time and again, has convinced me that it was a transitional era where people still flocked to see Schwarzenegger in Eraser, while Hollywood, in spite of the early success of the Tim Burton Batman franchise, was making mostly second-string superhero movies, and Independence Day was novel enough to be a major event (while two decades later its sequel was predictably lost in the ever-larger summer crowd).

#3. The '90s Happened--it's the Period After the '90s That Didn't.
Perhaps the most unambiguous change in the sort of details of daily life that nostalgia-hawkers interest themselves in is what people call "technology." Yet what in the heady '90s seemed to be the beginning of a new wave of digital wonders can, in hindsight, seem like the climax. Personal computing, Internet connections, cellular telephony all exploded then--and while they have got cheaper and more portable and faster, enabling fuller exploitation of their potentials, it seems that nothing since has been quite so radical. Indeed, much of what was expected to follow soon--bodily immersive virtual reality, telepresence, self-driving vehicles and other equally dramatic artificial intelligence aids to daily living--proved rather further off than imagined, things we are still waiting for now two decades on, with "digital personal assistants" like Alexa still for the most part a novelty, and many still feeling themselves in a position to express a "Git a Hoss"-like disdain for the idea that cars will drive themselves anytime soon. The result is that, rather than having left the '90s behind, we remain in them--and equally cannot get nostalgic for what never departed.

Of course, none of this means that there are not things that I look at and find to be particularly "'90s." Thus does it go with, for instance, those syndicated action hours one can still see rerun on channels like H & I (for now). They offer a style of entertainment not quite so available before, and clearly vanished since. Yet such items strike me as sufficiently minor as to leave those building entertainment out of nostalgia little to work with.

No Time to Die: Commercial Prospects

If memory serves, No Time to Die was the first blockbuster-grade movie to see its release deferred as a result of the pandemic--in this case all the way to late November (more than seven months after its initial release date). This will mean its coming out five years after Spectre--the longest gap in the series' history apart from the span of time between Licence to Kill and GoldenEye.

Given the surprises this year has had for us all, and for that installment in the franchise in particular, I am in no mood to make predictions--but still comfortable with hypotheticals. Let us assume the release goes ahead as now scheduled, in something like a normal film market, where a comparison with that older film is actually useful. GoldenEye was, of course, a success, the first Bond movie to break the $100 million barrier, which did not quite mean what it had before, but even adjusted for inflation reversed the generally downward trend in the series' grosses since the '70s, an especially welcome development after the disappointment of Licence to Kill. (I might add that years later it seems to be widely regarded as, The Spy Who Loved Me apart, the best of the Bond films to appear between the original crop of the '60s and the twenty-first century reboot.)

No Time to Die does not have so much to recover from--but at the same time, a generation later, the concept may be that much more worn, the market tougher, while as I remarked not so long ago, if the Bond series has been borrowing ideas rather than generating them for a half century now, the ideas it has been borrowing seem to me less appealing generally, and certainly less easy to assimilate to the character. (Moonraker's sending Bond into space was wacky, and Licence to Kill went too far in reinventing Bond as a "someone killed my favorite second cousin"-type '80s Hollywood action film hero, but all that was less problematic than origin story prequels and family drama and overlong running times and the rest it took from recent action films.)

In considering all that it seems worth acknowledging that there has been a certain amount of noise over the hints that Bond will get married, and be replaced by a female (and Black) operative. I have no idea how much it will mean amid all the flashy action stuff, and after all these decades of concessions to a gender politics less forgiving of the old conception (already by the late '70s things were changing, while I would say Casino Royale pretty much finished Bond off in this department all by itself), the bigger issue seems to me to be something more basic. Already by the late '70s the illusion that this was more than "just another action" series was beginning to fade, and by the '80s it was pretty much gone; if at times the series seemed to defy that fading with a movie that persuaded much of the audience it was an event (because of a longer than usual span of time, because of a new concept), as Casino Royale and Skyfall managed to do, it is an increasingly difficult trick to pull off--while pulling it off seems more essential than ever to selling tickets.

NOTE: Between the time I drafted this post, and my being able to post, there has already been new talk of No Time to Die getting bumped again, all the way over to the summer of 2021. (Things are moving too fast for anyone to keep up with these days.)

Tuesday, July 21, 2020

What Ever Happened to Tennis?

Recently considering how the image of luxurious, glamorous living that film and TV offer has changed, I happened to mention the scale of the houses we see--which seems to me a matter of the exploding fortunes of the wealthy. Still, not everything seems easily explicable in such terms, with the changing nature of pastimes an obvious case. In our movies and television shows the rich and glamorous still play plenty of golf--but, I think, rather less tennis, a game that once seemed ubiquitous in that kind of fare. Jonathan and Jennifer Hart, after all, were not unknown to be on the tennis court. In the opening montage of the first season of Charlie's Angels, during Farrah Fawcett's bit we see her swinging a tennis racquet. What did the original Bionic Woman do before becoming an OSI agent/junior high school teacher? She was a tennis pro.

And so on and so forth.

All that is rather less evident now.

It seems plausible that this is bound up with the decline of interest in tennis more generally.

The generally accepted version of the story seems to be that tennis was, for the broader public, fresh and new in the late '60s and early '70s--thanks to the airing of major tournaments on television as the sport newly went pro--while colorful personalities and gender politics (John McEnroe's temper tantrums, Billie Jean King's feminism) are held to have sustained interest for a time. That could not and did not go on forever, while the relative decline of American players' prominence at the sport's upper levels reduced interest in the U.S. (kind of like with boxing). Given that even at its peak tennis was never exactly football or basketball (the U.S. Open no Super Bowl or NBA Championship even at its long-ago height), all this meant it was vulnerable when "secondary sports" got squeezed out of press coverage and television broadcasts (also like boxing), leaving it a long way from its glory days--and TV writers and producers less expectant that an audience will be suitably impressed by the sight of their protagonist on the court.

Luxury in the Neoliberal Era

A prominent theme of discussion of the Bond novels and films--the more critical kind that rises above a poptimist denigration of earlier works to better promote the new--is the ways in which the original conception has dated. One of the more interesting but less talked-about aspects of this is the earlier Bond novels and films' presentation of what may be called "lifestyle fantasy"--their image of what was luxurious and glamorous. As has been noted time and again, much of what seemed so in the '50s and early '60s--getting on a jet plane, staying in a Caribbean resort--has come to seem commonplace to "middle-class people," and even for those not affluent enough to afford such vacations (often confusing the "lifestyles" they see on TV with how they live), less wildly fantastic than they seemed back in the series' heyday. Much else has simply come to seem old-fashioned--like nightclubs as places where one wears black tie and evening gowns. (Certainly the opening sequence of the first XXX movie offered a sense of what a young person of even twenty years ago was apt to think of all that.)

I have little to add to my earlier remarks about the Bond series here, but these days I find myself leaving on reruns of older television, and being struck by how, rather than some peculiarity of the James Bond series, this is generally the case. Looking at Hart to Hart's images of luxury, for instance, I find myself thinking about the house in which the Harts live. They have what can only be considered a very large and beautiful home. The vast majority of us, even those of us in the "First World," are unlikely to ever live in anything nearly so spacious or attractive. But today it does not give the impression of opulence that I suspect it was meant to; the grandeur one might associate with people who fly about in private jet planes.

This is most likely a matter of how the past few decades went. An era of spectacular economic growth they have not been for America, or the world. But the richest have got much, much richer. J. Paul Getty, in 1976 the world's richest man with a fortune of $6 billion, would, even with his fortune's size adjusted for inflation (to $27 billion in today's money), not make the list of the 25 wealthiest today. As will probably surprise no one, that list is now headed by Jeff Bezos, whose fortune (recently estimated at $146 billion) is more than five times as large as Getty's was then. And contrary to the drivel some "moralists" still spout about the Puritan virtues as the source of great fortunes, the billionaires have not hesitated to lavish the money on themselves, raising the stakes with regard to luxury--this age now seeing them take billion-dollar compounds as their personal homes.

In short, when it comes to houses, yachts and just about everything else where the expenditure of more money translates to more obvious opulence, the standard has changed, and changed profoundly.

Tuesday, June 23, 2020

Mr. Spock's Beard and the Hyper-Fragmentation of Pop Culture

Over the years I have time and again been struck by how single episodes of shows like the original Star Trek or The Twilight Zone have lingered in the pop cultural memory, so that a half century on TV writers can still make casual reference to even their smaller details, and expect a portion of their audience to remember what they were talking about, making the reference worth their while--as with Star Trek's "Mirror, Mirror."

That was the show that taught us all that if we see someone we knew well wearing a beard, it must be the evil alternate universe version of them. (Unless the one we know is evil, in which case the "evil" version of them is actually the good, like with Eric Cartman on South Park.)

That this has been the case seems natural enough in hindsight. People had fewer channels, enough so that anything in network prime-time almost automatically commanded a huge slice of the TV audience by today's standards. For much of the year a show's time slot was given over to reruns of that season's episodes, letting viewers catch anything they had missed the first time--or refresh their memories if they hadn't. And then if a show lasted for a while it made its way into syndication and, quite often, many more reruns apt to be watched by a significant share of the viewership, because there were only so many venues picking from a limited pool of really successful content. This seems to have especially been the case with fandom-inspiring genre television, to which the TV networks were only willing to go so far in catering, so that fans' options were limited indeed. Without trying, someone only casually interested in science fiction could easily end up seeing the same episodes over and over and over again, memorizing the more striking or popular of them along the way, with others doing the same--so that casually referencing them before a wide audience was plausible, even easy. That, of course, meant more such references, year in, year out (as the "bearded Spock" becoming a cliché demonstrates), which in turn reinforced their recognizability, translating into still more references in a virtuous circle.

In this "peak TV" era the situation is very, very different, with the quantity of outlets and the quantity of their programming exploded, and many of them less casually available (with the prime-time networks' marginalization followed by the basic cable TV boom going bust), or casually accessible to viewers even if they are available to them (because of the increased propensity for story arcs, because of shared universe complexity like with the Arrowverse Crisis on Infinite Earths); with the networks inclined to yank reruns off the air in summer in favor of reality TV garbage and the like; with channels favoring their original, exclusive content over reruns of other people's stuff as the mainstay of their content. (To give one small example I saw Smallville in reruns on TNT, but the current crop of DC Comics-based superhero shows airing on the CW does not turn up there or anywhere else; it's something you have to seek out, or not watch at all--and I'll admit I haven't, and don't expect to be doing it anytime soon.)

The result is the ever-more thorough hyper-fragmentation of the viewing audience--and, I think, even if you are sure that TV today is better than it has ever been, that even the best of what is out there is far less likely to give us a shared frame of reference. Certainly I do not think any show today, no matter how brilliant, will give us something that a half century on we will recognize the way we do such bits of those '60s-era shows as the evil-alternate-universe significance of a beard. And I can't help suspecting that, even if there is much good, or at least potential good, in the profusion of material and options, we are also losing something that way.

Saturday, June 20, 2020

The Problem with Grand Admiral Thrawn

Researching the second edition of Star Wars in Context I found myself delving into some of Lucas' well-known inspirations--like Bruno Bettelheim, and specifically his writing on fairy tales in The Uses of Enchantment. That book, which I suspect is not much read these days (Bettelheim has long since fallen from grace), seems to have profoundly oriented Lucas to the vision he followed in the first film--setting aside the relatively complex and political story of the early scripts in favor of a simpler, more fairy tale-like film.

As explained by Bettelheim, fairy tales are stories dealing with existential, life-and-death matters in a simple form that makes them accessible to children. Part of this is that the good are all good, the bad all bad; and precisely because the young are young, still forming a morality, and so less responsive to good and bad on their own terms than other attractions or repulsions, it matters that the bad are made unattractive, the good more so. (Thus the evil witch is also ugly.)

Darth Vader seems to me exemplary of this. Certainly in the first film there is no ambiguity about the character. This is not to say that Vader wholly lacks traits people could admire. He is perceptive, quick, cunning, determined, whether in handling subordinates, carrying out a raid, conducting himself in a fight, flying a TIE fighter. One can call him intelligent, decisive, forceful--which is why he holds the position of authority and enjoys the power that he does, as the right-hand man of the absolute ruler of a Galactic Empire (apparently, rather more meritocratic than any world we know). However, any admiration for these qualities is preempted by the more immediate concern we are made to feel for the fate of our heroes--and more generally the fact that he keeps killing people like flies (with Empire Strikes Back seeing him do this with subordinates who, in spite of their best efforts, let him down). That he does vile things, that he intimidates and threatens, that he is a menace--an embodiment of evil--overwhelms everything else.

It has struck me that Timothy Zahn's creation, the Imperial Grand Admiral Thrawn, is very different in this way. His good qualities are not only abundant, but foregrounded, and in such a manner as to make him admirable and appealing, even if he is the bad guy. Rather than his intelligence being something we only appreciate if we stop and think, Zahn makes a point of having Thrawn put on one bravura display of performative intelligence for the audience after another, with Admiral Pellaeon playing the sometimes skeptical but suitably impressed Watson to Thrawn's Sherlock Holmes as he uses his deductive skills to navigate one strategic or tactical problem after another (all without making Thrawn seem a tiresome know-it-all). Thrawn is cultured, too--an art-lover whose understanding of artwork is deep enough that he can derive militarily applicable insights from his examination of a particular culture's paintings. He can be very charming when he wants to be. He can be fair with subordinates--rather more so than most people we see--not only punishing incompetence, but rewarding intelligence and initiative with promotion even where that intelligence and initiative did not, on a particular occasion, produce the hoped-for result (as seen in The Last Command). And while Zahn does not probe into his motives too deeply, it does seem at least plausible that while Thrawn is personally ambitious, he does believe himself to be doing the right thing in striving to restore the Empire to its earlier stature.

Indeed, if we go by the fairy tale vision of the original Star Wars, a figure like Grand Admiral Thrawn can seem to have no place in the franchise, at least as understood by the sort of purist who was so quick and vehement in calling out the prequels for their breaches with the fairy tale simplicity established in Episode IV. Yet Thrawn proved hugely popular--perhaps the most popular of the Expanded Universe's creations. In fact, I cannot remember reading a single derogatory comment about him.

Why have so few sensed any dissonance here? I suspect it is because audiences were less critically-minded in regard to the books than they were to the films. Watching the prequels they were alert to these films not "feeling" like the original trilogy, not recapturing their (imperfect and perhaps romanticized) memory of the experience. They had fewer preconceptions about what a Star Wars book should feel like, while the fact that a book let them create the images for themselves in their minds (rather than presenting them with an abundance of CGI clashing with their visual memories) at the least made any issues less immediate, less visceral. And if the resulting books as a whole do not stay so fairy tale-simple as the originals (the story sprawls, the military detail is more pronounced), Zahn keeps to a minimum the increase in the sorts of complexity that so troubled many a fan (like more elaborate world-building, or the intricacy of the political premise, with all their Brechtian alienations), even as a good deal rings true. (Zahn's novels in a multitude of other ways made a point of evoking the original trilogy without too closely repeating it--Luke returns to Dagobah, significantly, while Lando is back on the make, and so on and so forth.) Moreover, precisely because the Thrawn character did appeal to readers, they were comparatively prepared to forgive the breach with the older conception--perhaps the more so after the broadening of the Expanded Universe accustomed them to a more complex, more "adult" fictional universe on paper, where this sort of thing may anyway "play" better than in a two-hour movie, even without such particular expectations as the fans had.

Friday, June 19, 2020

Watching Flatland

I remember that when the El Rey network began airing Flatland I was surprised not to have encountered the show before. Back in those years when I was most attentive to science fiction television I only intermittently followed The X-Files. (I had little patience for the mind-numbingly repetitive back-and-forth between credulous Mulder and "skeptical" Scully, or the artless head game that was the mythology.) And I took less interest in the young adult stuff on the WB. (I still haven't seen much of Buffy, and still don't intend to; while seeing Smallville was basically a result of its reruns winding up in a convenient time slot on a channel I often left on between waking up and heading off to work in the morning.) But I by and large kept up with the syndicated action hour-type stuff Flatland seemed to be. Even if I didn't see quite every episode of every one of them (there sure were a lot of them, back then, and some came and went almost before you knew it), I would have expected to at least have heard of this one.

I went online and found that there is virtually nothing on the show in the way of fan sites, wikis, or even data on the relevant Internet Movie Database pages (where there is not a single user review, and a mere 28 ratings in all--as compared with, for instance, 7 user reviews and 295 ratings for Adventure, Inc., another little-seen, quickly forgotten, one-season show of the action-hour type from that same year, 2002). To go by the Wikipedia disambiguation page for the word "flatland," the show does not even seem to have a page on the site. In fact I could not come up with evidence of its ever having aired in North America, or anywhere else for that matter--a run that I suppose would have been short, given how the boom in this format, and its most typical mode of distribution (syndication), collapsed shortly after the show's date of production and never recovered.

None of that made me less interested, of course, and I caught a few episodes. What can I say about the show at this point? Flatland clearly walks the road trod by the earlier show that helped launch that very boom in syndicated action hours, Highlander: The Series. Like Highlander (57 user reviews and 13,292 ratings, in case you're wondering), it is a story of a continuing millennia-old battle between immortal superbeings, set today in an old and glamorous world-city, largely as seen by the ordinary (or are they?) humans who have found themselves caught up in those events. Befitting the long past behind all this we get a good deal of intercutting between the contemporary world and a historical costume past, colorfully filling us in on the backstory of the loves and hates and other histories and relationships of these figures. We see those beings fighting things out by way of stylish, supernatural-tinged and visual effects-enhanced personal, physical combat. And in line with their "high concept" inspiration (the mid-'80s was a golden age for music video-TV commercial-style filmmaking, with Highlander right up there with classics of the form like Flashdance and Rocky IV and Top Gun), nearly everything is as plush and sleek and glossy as if it came out of an upmarket catalog ad, with an occasional grimy or "industrial" backdrop lending some visual texture.

Still, if the formula is a familiar one, it is a solid one, and this show's particular variations on it definitely have their appeal. From a metafictional standpoint the casting of Dennis Hopper as an immortal super-being with the fate of the world in his hands is fairly striking given his career of playing megalomaniacs--and if Peter Biskind's stories about Hopper are to be believed, Hopper was not just talking that way in his movie roles. (Claiming in the wake of his success with Easy Rider that his "may be the most creative generation in the last nineteen centuries," Hopper compared himself with Jesus, while he opens the episodes of Flatland by reciting, as if they were his own, the words of the Buddha as commonly rendered in contemporary English--"Do not dwell in the past, do not dream of the future, concentrate the mind on the present moment"--just before introducing himself as "Smith.") Un-metafictionally, Hopper and the rest of the cast do quite well, while Shanghai (the "Paris of the East") is most certainly a suitable backdrop for this kind of drama, and the airy Crouching Tiger, Hidden Dragon-style wire-fu makes a very acceptable alternative to Highlander's head-slicing swordplay.

I grant that it is not what the "peak TV" cheerleaders--who wet their pants over The Sopranos and Mad Men and Netflix's House of Cards remake, and expected all the rest of us to do the same--celebrate as great television. But I enjoyed what I saw of it, and was dismayed to see El Rey yank it, and all the other '90s action hours they had brought back to the TV schedule (Stargate SG-1, Starhunter, Xena: Warrior Princess, Relic Hunter), back off the air in favor of their mostly unscripted originals (while this viewer goes elsewhere in the meantime).

Reevaluating Christopher Nolan's Batman Films

Some years ago I offered my thoughts on The Dark Knight Rises--which were less than complimentary. But I had received its two predecessors more favorably.

One reason was that when those earlier movies came out I had been ready to give certain foolishnesses a pass.

Looking at Batman Begins, for instance, there was plenty that seemed off to me on even a first viewing. The worldwide journey Bruce Wayne is on when the film begins, undertaken for the sake of "investigating the criminal mind," struck me as a silly self-indulgence on the part of the billionaire--criminological priv-lit--a wealthy wannabe vigilante's equivalent of Eat, Pray, Love. And that silliness was tied in with a still bigger silliness. It is by no means a new realization that the biggest and worst crimes are not generally labeled crimes at all--while even if one does not, for example, consider the conquest and enslavement of countries, the exploitation of the disenfranchised and the ruination of the environment to be crimes, the line between ordinary business and what is normally recognized as crime in the narrowest sense has never been more than finely drawn, and has only got blurrier over time. However, Nolan did not seem to have caught on to this, Batman's world being one where, at least normally, the criminals were all clearly on one side, the "decent, law-abiding citizens" on the other. As it was 2005 there were those who saw in it something of a statement on the War on Terror, but people were always doing that kind of Zeitgeist criticism, and on the relevant level the film seemed to me too flimsy to take seriously.

Still, if it was risible on the level of sociology and politics, I did not expect much more from a superhero movie, and there was a little psychological interest, at least--which seemed novel enough in those days, and less of a downer than it appeared in, say, 1995's Batman Forever (where I experienced it as a pure and simple annoyance, out of place in that movie's lighter treatment of the character). It helped that the superhero boom was still fairly fresh, and this more superficially serious and artful approach still novel.

All of this carried over to my reading of the sequel, The Dark Knight, three years later. I enjoyed it as Jungian psychodrama--a level on which I think it is still interesting (even as I find that kind of thing less worthwhile than before)--and brushed off the talk of it as much else.

Alas, The Dark Knight Rises afforded no room for that--Nolan's insistence on making a point that I can least offensively call "questionable" impossible to shrug off, all while the movie was simply not as good a film as its predecessors in formal terms. (The third installment was, as so many third entries in film trilogies are, not just less fresh and outright repetitive, but less tight and coherent also, so that it was not just a fascist film, but a badly made fascist film.)

And if anything, the Ben Affleck-era take on Batman (what I saw in the perhaps not insignificantly Nolan-produced Batman v Superman, and its follow-up Suicide Squad) was still less pleasant. In fact, it soon had me thinking again of how in 2010 I wondered on this blog if it was not time to "Give the Superheroes a Rest?"

Ten years later, it seems fair to say that the studios didn't give us a rest--so I guess I just gave myself a rest from their not giving us a rest.
