Looking at the box office coverage recently, I have found myself considering the case of China again. While Mulan has gone straight to streaming in the U.S., it has come out in theaters in China--to results considered underwhelming, even given the industry's lowered expectations in the wake of the pandemic.
It is, of course, not the first Disney production to underperform there. There is also the notorious case of the Star Wars saga.
As Alan Yuhas' article observes, this was in significant part a function of the fact that, given when the originals came out (the late 1970s and early 1980s), China did not share in the moment. The result was that there was no pull from nostalgia. The article's assessment usefully touches on something else, too--that the series was relatively inaccessible and unengaging to newcomers (as compared with, for example, the information-lighter Marvel superhero films that have done so well in that country).
The disregard of all of this, of course, testifies to something bigger, even more apparent in the fumbling performance with Mulan. Much as we hear about the fuss over studio efforts to craft something that will appeal in that market, the reality is a profound laziness and obtuseness about the audience. Hollywood, and its sycophants in the entertainment press, seem to forget that while China represents a huge market where successful American films have taken in hundreds of millions of dollars (Avengers: Endgame took in over $600 million), China's own film industry is making plenty of blockbusters of its own, and is indeed king of its own market. In 2019 Avengers: Endgame was only the #3 earner in China, with the homegrown Ne Zha and The Wandering Earth each outgrossing it, and together taking in some $1.4 billion. Eight of the top ten films in China were homegrown productions, each of which grossed $200 million or more. Of the twenty-three films that broke the $100 million barrier, fifteen were domestic Chinese productions.
In short, China's market is not simply Hollywood's for the asking. And I suspect that with plenty of homegrown blockbusters having historical themes, pitching a Disney live-action Mulan--one really made in line with American sensibilities (or rather, Hollywood sensibilities), while putting on a very clumsy show of pandering to what the studio thinks Chinese filmgoers want, in astonishing disregard for innumerable basics of a story and a history that Chinese movie-watchers know very well (really, I suspect, one half-decent historian could and would have fixed up a lot of this for a half-decent consulting fee, had the people in charge been capable of listening)--is the sort of thing that sounds much more astute to occupants of boardrooms than to anyone of even median intelligence.
Monday, October 19, 2020
Mulan, Tenet and Tough Times for Hollywood
Ever since the advent of TV the Hollywood movie studios have struggled to keep people buying movie tickets. Measured in per capita terms, American movie admissions fell from thirty a year in the 1940s and 1950s to about four by the end of the 1960s (an astonishing 87 percent drop). The figure stabilized at that lower level, and Hollywood has managed to keep admissions there in the half century since--but not without increasing effort. Especially as edgier content became less plausible as a way of packing theaters (New Hollywood died, and between cable, the Internet and identity politics sex ceased to sell), bigger and bigger spectacle, backed by ever-more publicity, became the foundation of continued profitability, with single films now representing investments of a half billion dollars or more--and so only yielding an economically meaningful return if they can be a really big hit on a global scale rather than a merely national one. It has seemed to me that filmmaking has suffered for this, with the scale of the investment, and the needs of publicity, forcing studios to play it safer and safer, above all through ever more rigid adherence to the artistically suffocating requirements of "high concept," and even that in fewer and fewer ways. Indeed, when I was writing the last chapter of the second edition of Star Wars in Context it seemed to me that the decline of that kind of filmmaking was well advanced, and that the end of the theatrical feature film as we know it might come along with it.
That book came out back in the summer of 2018. Like everyone else I did not imagine how in two years the movie industry would be virtually shut down by a global pandemic. And now it appears the studio executives are floundering, to go by the desperation of which their latest moves smack.
I recall how when the pandemic first hit and the studios started releasing material direct-to-streaming they were only willing to let the small stuff bypass theaters, while sitting on the really big investments. Thus Disney let its adaptation of Artemis Fowl go to streaming--while holding on to Mulan. Even so, they ended up waiting longer than they planned, because the situation did not improve. Certainly going by the number of officially recorded daily infections August was no better than June, while given the fight over reopening schools one could hardly imagine people were eager to rush into theaters. And so for no clear reason save impatience there was talk of the theaters reopening for business in late August, with Warner Brothers rushing Christopher Nolan's Tenet to theaters, where it seems to have performed even below their lowered expectations, an outcome at which "everyone" pretended to be surprised. (Or perhaps didn't pretend? One should be careful not to give them more credit than they deserve, which is, at best, very little.) Disney, more pragmatically, sent Mulan to streaming--perhaps not so pragmatically with a $30 fee on top of the cost of a Disney Plus subscription. The result appears similarly underwhelming.
And now we are hearing of a new round of delays, perhaps the most striking of them being the second bumping (by Warner Brothers again) of the release date of Wonder Woman 1984, from early October to Christmas Day, with Jordan Peele's Candyman remake likewise bumped again.
The really big movies rescheduled from early summer to November still seem to be slated for November--Black Widow and No Time to Die are each still due out then (interestingly, even though there was word back in July of No Time to Die being delayed all the way to the summer of 2021). The studios are clearly resisting a second rescheduling. But what has already happened with October is not promising, and unless things get very much better very quickly (and again, I suspect that even if the virus were somehow brought under control it would take time before people were comfortable going to the theater again; and there is that economic downturn making people more careful with money--why does everyone forget that?) it is hard to picture them simply accepting their investments going the way Tenet's appears to be going. Especially given that Black Widow's release date is, at the moment, scarcely seven weeks away, my guess is that another bump is in store for that franchise--and for James Bond as well.
NOTE: This post was originally written in mid-September. The second delay of both Black Widow and No Time to Die has, of course, already happened. You can read my post about the latter event here.
Friday, October 16, 2020
The Writing Life: The Law of Supply and Demand
I recently suggested that an optimistic calculation of what the country's novelists writing for adult readers earn in actual royalties in a given year might come to something in the area of $1.4 billion.
By contrast the same population includes in its work force over 1.3 million lawyers, earning a mean income of $144,000 annually. One might extrapolate from this that the country's lawyers collectively make close to $190 billion a year.
In short, if we treat market incomes as indicative of demand then there is at least a hundred times the demand for lawyers that there is for novelists. Thinking about this I can't help remembering what David Graeber wrote in Bullshit Jobs--that the society we live in "seems to generate an extremely limited demand" for people in the arts, "but an apparently infinite demand" for people in fields in which, I suspect, people work only because of the money, like "corporate law." Alas, it is not a happy situation for artists, finding as they do so little effective demand for what they offer.
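For what it is worth, the arithmetic behind that comparison is simple enough to sketch in a few lines of Python, using nothing but the rough figures cited above (all of them estimates rather than hard data):

# A rough sketch of the "demand" comparison above; every figure is an estimate.
lawyers = 1_300_000              # U.S. lawyers in the work force
mean_lawyer_income = 144_000     # mean annual income, in dollars
novelist_royalties = 1.4e9       # optimistic estimate of novelists' annual royalties

lawyer_total = lawyers * mean_lawyer_income    # close to $190 billion
ratio = lawyer_total / novelist_royalties      # well over a hundred to one

print(f"Lawyers' collective income: about ${lawyer_total / 1e9:.0f} billion")
print(f"Demand ratio, lawyers to novelists: about {ratio:.0f} to 1")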
The Macroeconomics of Publishing
Not long ago I wrote about the possibility that we are entering a post-scarcity age when it comes to fiction. My analysis there focused on the collapse in pay rates for authors, and the explosion in the availability of reading material of all kinds, much of it free, and much more of it nearly so, without even taking into account the piracy about which we hear endlessly.
Right now I would like to consider the matter from a different standpoint, the size of the market as indicated by sales figures in the U.S., with what that implies about the number of authors who could under even the most favorable conditions make a living from it. (I freely admit that the figures I am working from, the estimates I can make, are rough at best, but in the absence of better . . .)
According to Forbes the retail sales of children's, young adult and adult books amounted to $8.1 billion in 2018. That sounds like a lot, but of course only part of that went to authors. In the realm of traditional publishing print royalties top out at about 15 percent of sales, while e-book royalties come to 25 percent--and the figures for both the royalty rate, and the sale price off which it is taken, can vary enormously for the self-published. In the absence of much more detailed data than anyone has bothered to collect, I know of no really satisfactory way to average things out, so let us, in the near-certainty that this overstates the royalty share, assume 20 percent of the revenue ends up going to royalties. This works out to $1.6 billion. If we assume the median U.S. wage--$32,000--then that $1.6 billion works out to such livings for about 50,000 U.S. fiction authors.
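To make the back-of-the-envelope arithmetic explicit, here it is as a few lines of Python--with the 20 percent royalty share, again, my own guess rather than a measured figure:

# Naive upper bound on the number of U.S. fiction authors the royalty pot could support.
retail_sales = 8.1e9       # 2018 U.S. retail book sales (children's, YA and adult), per Forbes
royalty_share = 0.20       # assumed blended royalty rate (almost certainly too generous)
median_wage = 32_000       # approximate U.S. median wage, in dollars

royalty_pot = retail_sales * royalty_share        # about $1.6 billion
supported_authors = royalty_pot / median_wage     # about 50,000 median-wage livings

print(f"Royalty pot: ${royalty_pot / 1e9:.1f} billion")
print(f"Median-wage livings supported: {supported_authors:,.0f}")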
However, this is a naively optimistic way of looking at the matter. After all, the reality is that not all this money is going to authors. After all, no author is collecting anything on books in the public domain, while books still in copyright but held by a deceased author's estate are, if supporting their descendants, nonetheless not to be counted here. It might be added that lots of books are by authors writing for an intellectual property owned by someone else, like the Star Wars and Star Trek franchises, which means a smaller proportion of the sales for the person who actually writes the thing. And of course, some of those sales are going to foreign authors (like, for example, J.K. Rowling and E.L. James), not U.S. writers. Thus the pot shrinks again, substantially--and while this would seem offset by the fact that American writers probably earn more from foreign sales than vice-versa, those U.S. authors benefiting this way are likely to be members of a relatively small club, the superstars, underlining the important fact that the pot is not divided at all equally. The superstars who sell millions of copies have a much larger share of the books sold, anywhere--which means that much less to go round for everyone else. And let us not forget that the literary agents are taking their cut from the marginal as well as from the rich and famous (15 percent off the top).
Tellingly the Authors Guild has 9,000 members, and reported in its Wages of Writing survey that the full-timers among its members were making perhaps half the $32,000 figure ($17,500), while the part-timers included in the number made much, much less than even that (a mere $4,500 a year), which implies that all but a small portion of even this relatively elite club are toiling in poverty to the extent that their income comes wholly from their writing. (And of course, only a portion of this group is making any of its money from fiction at all.)
Of course, one would never guess any of these things from the attention the media lavishes on the handful who achieve extremely high and unrepresentative sales; from the ubiquity of full-time "writers," and indeed "successful" and often wealthy writers, as figures in popular culture; from the relentless flogging of how-to books and courses and self-publishing services. And this starkly different reality raises the question of just how many of those positions are up for grabs in any given year. Realistically, it would only be a small fraction as some retire (though of course, the retired and even the dead go on writing via "coauthors," usually someone who is also already established, so, no, those are not opportunities for first-time novelists).
Still, taking a shot at even one of a very few positions might not seem so bleak a prospect if there was not so much competition. Alas . . .
Agents and editors constantly tell the public that while "no one" is buying books, "everybody" wants to publish books. They point to polls saying that eighty percent of the American public think they would like to be an author. I suspect this is an idle fantasy for most of them, just as being a talk show host or a film director or a YouTube celebrity would be for most people, if that. (If you have taught college writing--a day job many a would-be novelist works, the kind that brings them far closer to reality than the mandarins of Park Avenue ever get--you are painfully aware that a very large number of people would like nothing better than to never have to write so much as a single word ever again.) Mostly, I think, the constant mention of such figures is whining on the part of those who resent the efforts of the unpublished to become published, and the insiders' way of (lamely) excusing themselves for the nasty way they treat those who fill up the slush piles with their submissions, as well as plain old elitist snobbery and meanness of the Joseph Epstein variety, hardly unknown among them.
Still, the fact that those how-to books and self-publishing courses and the rest are out there, comprising a massive industry in their own right, indicates that the number of really interested people is not small. Let us posit that out of those 200 million or so would-be authors (eighty percent of the country's roughly 250 million adults) even one in a thousand--0.1 percent--are both interested in competing at the adult end of the market, and serious enough about the idea to be making some real effort to write and publish something commercially at a given moment. That would work out to 200,000 people competing not for the few thousand (or few hundred) positions already noted, but for the fraction of them open at any one time--likely, just scores in any given year--which again works out to something like a thousand contestants for each spot, each chance to, most likely, grind along in poverty as a writer, with the odds for most made worse by the degree to which connections overwhelmingly favor a few, leaving the rest to compete for the minuscule opportunity afforded by the slush pile. (It seems relevant that agents talk about receiving five or ten thousand queries and submissions in a year--and picking up the writer of maybe one of those, maybe none, for their list.) Even were I off by an order of magnitude, and it were just 20,000 people submitting work--only one in ten thousand of those who say they have a book in them--it would not make much practical difference for any one person's situation.
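The same caveat applies to the arithmetic here--every input below is one of the guesses described above, not a measured quantity--but the sketch shows how the thousand-to-one figure falls out:

# How crowded the field is, using the rough guesses discussed above.
would_be_authors = 200_000_000    # roughly eighty percent of U.S. adults
serious_fraction = 0.001          # one in a thousand actually writing and submitting for the adult market
openings_per_year = 200           # "scores" of viable full-time slots opening in a given year (a guess)

serious_writers = would_be_authors * serious_fraction         # 200,000
contestants_per_opening = serious_writers / openings_per_year

print(f"Serious contenders: {serious_writers:,.0f}")
print(f"Contestants per opening: about {contestants_per_opening:,.0f}")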
As I have remarked before, the ex-interns who once had charge of looking at the material in those slush piles, venting their hatred of aspiring authors in publications which should know better than to run the rants of snobs (like the Guardian, always showing its elitism when push comes to shove), insist with as much vehemence as viciousness that those in the slush pile are not worth bothering about (automatically meaning, as this does, that no one but the genuinely privileged people who tend to have the connections can ever produce anything worthwhile). But what I get from these numbers is very different--namely that it is less the possession of talent than getting one of the exceedingly few "breaks" that makes the difference between the "success" on the pedestal and the "failures" the world holds in contempt (and that it is very plausible indeed that a good many of those being endlessly rebuffed have far more to offer than those whose principal asset was having an "in").
I also conclude from these numbers that while Establishment authors malign self-published writers as making things harder for everyone by bucking the system, there is simply no way in which anything remotely resembling our current way of producing books, distributing books, and apportioning the economic rewards of that activity within the existing market can give any but the thoroughly connected or lottery-winner-lucky, no matter how talented and hard-working, even a slight chance at getting their work published and read, let alone the economic opportunity to write full-time.
Is there anything that might change that?
One possibility is that the market will get bigger--that people will read more contemporary fiction than they do now. This may not seem a particularly radical thing to hope for when there is so little such reading going on. But is that really plausible?
Consider how we are constantly told that life will soon get more convenient. Consider, for example, the claims we are hearing about self-driving cars, or better still a shift from private car ownership to ride-sharing in those self-driving cars. Many of us who spend hours at the wheel--and still more hours in the tedious tasks associated with car ownership, from filling up the tank to shopping for insurance--would suddenly have a lot more hours on our hands each week. Should ride-sharing prove cheaper as well as more convenient (as seems likely) we would also have more disposable income for other conveniences, translating to still more hours. And let us assume that something does not come along and muck up this improvement in our quality of life, and that people are at least able to use some of those hours for entertainment purposes. Those who already like to read might read more. Some of those who have not read much might find that they like doing so. But that they will necessarily devote that time to the fiction that writers are putting out now is not a given (they might, for instance, turn to the vast treasury of classics in the public domain--which they avoided precisely because they did not have the time for them), let alone do so in a way that affords the writers now struggling more opportunity. (We might, for instance, see the biggest bestsellers claim those new readers, while all the rest get none of the extra attention.) We might also see any increase in the number of readers overmatched by the increase in the number of writers competing for them, as some of that eighty percent of the population interested in being an author takes advantage of the time available to them to finish that novel and get it out there. And in any event, I do not think it overly cynical to suspect that YouTube and Netflix will be bigger winners than any purveyor of things people actually read.
Another possibility is that writers will be able to claim a higher share of the income from book sales as a result of book production getting cheaper. Think of it this way: at this moment we have brilliantly automated book printing and distribution. But editing, copyediting, design (to say nothing of marketing) remain laborious, and at that, labors best done by experts. This means that production costs for publishers of all kinds remain high--with the self-published especially suffering. But what if we could perform those tasks much more quickly and cheaply via an advance in artificial intelligence? Traditional publishers might find it economically attractive to publish more books than they do, the self-published would not be at such a disadvantage, and writers would at least potentially be able to pocket more of the profit on their efforts. However, that level of AI looks to be far off, and even if it is not, I suspect the arrival of AI which can edit a book will be accompanied, or even preceded, by AI which can write a book by itself--so I suspect writers will end up losing ground in the market rather than gaining it (as the likes of James Patterson take up computers as their coauthors).
The third possibility I see is writers not having to worry nearly so much about making a living at all from their books--because no one does. I do not think this would require full-blown post-scarcity, just some tolerable minimum of income which would let people take a chance on full-time writing for as long as they found it worthwhile, and let the really determined go on plugging away if prepared to make some sacrifices. Yes, we have been talking about Universal Basic Income of some kind for a half century now without anything of the kind happening, but we have undeniably seen an uptick in mainstream interest in recent years, while at the same time a sharp drop in the cost of living may make such an income scheme more practicable than before. The RethinkX think tank, notably, has just issued a fascinating report forecasting that in a decade's time everyone on the planet could have access to a "First World" living standard at Third World income levels due to drops in the price of food, energy, transport, information and shelter resulting from the mere continuation of the recently observed trend of technological advance. None of this is a done deal by any means (indeed, I think this will take even more managing than the authors acknowledge given the extremity of the economic disruptions involved), but all the same, if $3,000 a year could keep a person well-fed and clothed and housed and mobile and Internet-connected, or we even went only partway to that goal, suddenly what was impossible becomes possible in economic and political terms. And that, in turn, would open up new cultural possibilities, above all for those so long excluded from the arts by an accident of birth that in our allegedly democratic and meritocratic age can seem hardly less damning for such ambitions than not being of noble blood in an earlier era when poetry was the purview of the patricians, and those very few of the lowborn they personally patronized for so long as they were prepared to bow and scrape.
Thursday, October 1, 2020
BACK IN PRINT . . . GUARDIANS
The year is 1992, and the Soviet Union has just collapsed. Intelligence officer Nick McNab, wondering where his career will go with the Cold War over, finds himself approached by D.C.-based company Argus Consulting to go to Russia to try and infiltrate a secret Soviet-era military research program, the prizes from which may be up for grabs in the chaos.
The assignment at first seems like a snipe hunt, but while in Russia Nick learns of the ellipton, a mysterious weapon of mass destruction which agents from several countries are pursuing. His bosses at Argus promptly assign him to get it, embroiling him in a covert international competition that has already turned bloody, and unknown to him, is exhuming a secret war thought long over.
Available in paperback and as an ebook.
You can also check it out for free on Inkitt.
Get your copy today!
Wednesday, August 19, 2020
2015's Fantastic Four: Another Look
I recall being, if not pleasantly surprised, then at least intrigued by Josh Trank's Fantastic Four when I first saw it. There were aspects I was dubious about--the young adult turn in the characterizations, for one, and the inconsistency of the tone--and yet there were others that did interest me.
On a recent (if casual) re-viewing of the film I was more impressed by those aspects, not least the film's handling of the scientific research at the heart of the story. The film does make concessions to silly, outworn Edisonade convention in having a high school student working with scavenged junk happen on a key part of the solution to the problem of interdimensional travel. But afterward the film displays, to a far, far greater degree than most of what comes out of Hollywood, a striking awareness, and acknowledgment, of the realities of Big Science. Its agendas and politics, which have the Suits, not the scientists, calling the shots. And the reality that science is not nerd-magic, but a lot of hard work--collaborative, protracted, at times grueling hard work that builds on the bits of the puzzle others have solved, most of that by people unlikely to get any of the glory.
Such things, small as they seem, are rare enough and significant enough that they appear to me to rate an honorable mention.
A Techno-Thriller Revival?
In writing my history of the military techno-thriller I was concerned principally with the main line of the genre--its early flickerings dating back to the seventeenth century; its coalescence into the "invasion story" of the late nineteenth and early twentieth centuries; after the shock of World War I and its aftermath convinced many that war could only mean universal calamity, its giving way to other genres it helped inspire--spy fiction, military science fiction, post-apocalyptic war fiction, "war scare" fiction; and then by way of those remnants, the revival of the old-style invasion story in the 1970s and 1980s (begun by British writers like Frederick Forsyth, Craig Thomas, John Hackett, but most identified with Tom Clancy); before the end of the Cold War turned that boom to bust.
As a result my book handled the post-Cold War period as an epilogue to the main story. Today, however, the world appears a far more aggressive place than it did a decade ago, let alone two decades ago.
Even the '90s, of course, were not without their share of crises--the Norwegian rocket incident, the Third Taiwan Strait Crisis, the NATO-Russia confrontation at Pristina airport--yet today it seems the blood is actually flowing at a lengthening list of flashpoints where armies that once upon a time would not have fought each other are doing just that. The war in Ukraine is not over. In Syria and in Libya NATO member Turkey and Russia are engaging in regular shooting incidents, while that same NATO government engages in hostilities with Greece and confronts the French(!) at sea. Scarcely a year after India exchanged air strikes with Pakistan for the first time in nearly a half century (something the two avoided even in the 1999 war), Chinese and Indian troops have fought in the Himalayas (with clubs and rocks?), while China for its part remains at odds with its neighbors over its claims in the East and South China Seas.
Meanwhile everyone is chasing military capabilities and overseas bases they did not think they needed a short time ago. Not long ago most sat out the fifth generation of fighters, making do with older gear--but now everyone wants in on the sixth, while hypersonic cruise missiles may be the object of a new arms race (one perhaps responsible for a mysterious nuclear explosion in northern Russia). Britain is deploying the first full-deck carriers it has laid down since the '40s, while German opinion-makers talk about having a carrier too (and maybe much else). There are even hints of a revival of conscription, thus far limited to smaller nations like Sweden and Peru, but Sweden's move is strongly linked with the elevated fear of war in northeastern Europe, while the French President, who has continued what for that country is an extraordinarily large-scale and long-duration mission in Mali (compare this to its post-Algeria record of African interventions), gives the impression of inching toward the same in his country . . .
It feels as if we are living in a Tom Clancy universe these days. Or, rather, a Dale Brown universe. (Yes, Turkish super-drones attacking Russian mercenaries in Libya feels more Dale Brown.)
Does this portend a revival of the genre? I, for one, doubt it. If scenarios of high-tech, conventional military conflict look more plausible now than before, it should be remembered that the readiness of a broad audience to consider discussion of the "next big war" relevant is one thing, and that audience's readiness to accept the depiction of such a war as entertainment is another. Where the appeal of such entertainment is concerned, the audience seems most susceptible when it has had a comparative respite from war--some years, at least, without what it thinks of as a really major conflict, with decades better still. (It is worth remembering that the techno-thriller emerged as a popular genre not in the 1940s or 1950s or 1960s, but in the late 1970s, and only really took off in the 1980s, many, many years after the end of Vietnam.) This, of course, is something audiences in America and the West generally cannot be said to have had, and very clearly would like to have had. Indeed, even the political right gives an impression of weariness with foreign war, especially when one gets away from the professional hawks to whom the media gives so much press. (One might add that the withdrawal from Afghanistan and Iraq has been a very different thing from the withdrawal from Vietnam, with less division in the U.S., less anti-militarism, and at the same time no desire evident, or even plausible, on the part of the right for the kind of "redemption" we identify with "post-Vietnam," diminishing the interest in an imaginary "refighting" of the conflict.)
The public seems more open to the genre, too, when at least part of it is being led to believe that a big war is somehow thinkable, somehow winnable. This attitude never wholly recovered from the world wars, with the result that the two defining big war novels of the techno-thriller revival--Hackett's book, and Clancy's Red Storm Rising--had to come up with a dodge to let the West get the upper hand without the conflict going all-out nuclear. I see no evidence of such a mood these days. Rather than the '80s I think a better analogy would be the '30s in this, as in so many other ways.
So people may be thinking about the danger of war more than before--but not in such a way that I see them making a bestseller out of some latterday Tom Clancy. In fact, I wonder if the H.G. Wells tradition would not find a stronger echo--save for the fact that I imagine the mandarins of Park Avenue to be exceedingly averse to publishing a work of the type.
Tuesday, August 18, 2020
Remembering The Pretender
Watching The Pretender I felt that pretty much nothing about the show made sense--not that I expected much from a network TV show at that time--while I suspected an essential vacuity behind the opaqueness and the teasing. (Remember, these were the years of the original run of The X-Files.)
What was the Center? A think tank we were told--apparently a think tank that centered on the intellection of one captive savant. (The writers seemed to make up the biogenetic bits of intrigue--clones and the rest--just as they went along, with this suddenly becoming more important as the season finale approached, and then once renewal happened, receding into the background . . .)
And of course, that particular savant never made much sense, least of all in his abilities. "Genius" was an extreme understatement for what he was, even by the risible "dumb person's idea of a genius" standard to which TV adheres. He seemed instead a realization of the Faustian (I mean Goethe here, not Marlowe) will to do everything, know everything, go everywhere--as Gary Stu/Mary Sue as a figure gets. Between one episode and the next he was able to master a skill, a trade, a profession to which people devote entire lifetimes (pilot today, symphony conductor tomorrow, surgeon the next week), while having the time to contrive the fake paper trail and other deceptions necessary to pass himself off as really what he appeared to be--all with his brain not functioning as a palimpsest, with one set of skills displacing another, but one set of abilities added to the next without loss of what he had previously acquired so that his repertoire had him walking about with the equivalent of the capacities of hundreds of world-class experts of the most diverse types in his head all at once.
Of course, that complicated the writing of the show somewhat. The protagonist was in new surroundings, performing different tasks, surrounded by different people in pretty much every episode, pretending to be a different man with a different history--while he was somebody who didn't have a conventional past to begin with, making him harder to establish even as a Fugitive-type character. (It was, of course, one reason why the episodes spent so much time showing what the folks back at the Center--pretty much his only consistent acquaintances--were doing.)
Still, the writers knew enough not to make him a tedious, egomaniacal mass of quirks like so many of their "geniuses"--while also being astute enough to make his eccentricities elicit amusement and sympathy rather than grate (delight in some pleasure of which he had been deprived by the Center growing up--like ice cream). They did this the more easily because he was essentially written as a sympathetic, empathetic human being.
That little lesson, I think, TV writers have since forgotten.
Reflections on Independence Day: Resurgence
According to the data set over at Box Office Mojo, in 1996 Independence Day opened with a then-record $96 million July 4 extended (five-day) weekend gross. That made it #1 at the box office, of course, a position it held for the next two weekends; it then held on to the #2 spot for the two weekends after that, and stayed in the top five for two more, on the way to banking $300 million--a sum that made it the top-earning movie of a very vigorous summer movie season and a very good year for Hollywood overall, and that, not incidentally, launched Will Smith's long career as a sci-fi action star.
In 2016 the sequel Independence Day: Resurgence pulled in a mere $41 million, putting it at #2, and was down to the #5 spot a week later, after which it continued a fairly quick plummet through the rankings. At the end of its run it had about $103 million banked, which, in inflation-adjusted terms, is a good deal less than the original made in just its first five days, and under a quarter of that movie's overall gross. The result was that it barely made the #27 spot for the year, just ahead of the far lower-budgeted horror movie The Conjuring 2.
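(For those who like to see the arithmetic spelled out, here is a minimal sketch in Python of that comparison. The roughly 1.5x multiplier for converting 1996 dollars to 2016 dollars is my own round assumption rather than a figure from Box Office Mojo, and the grosses are the rounded figures cited above.)

```python
# A rough, back-of-the-envelope check of the comparison above.
# ASSUMPTION: the ~1.5x multiplier for converting 1996 dollars to
# 2016 dollars is a round approximation, not a figure from the post.
CPI_1996_TO_2016 = 1.5

id4_opening_1996 = 96_000_000        # Independence Day, five-day opening (1996 dollars)
id4_total_1996 = 300_000_000         # Independence Day, domestic total (1996 dollars)
resurgence_total_2016 = 103_000_000  # Resurgence, domestic total (2016 dollars)

opening_adjusted = id4_opening_1996 * CPI_1996_TO_2016
total_adjusted = id4_total_1996 * CPI_1996_TO_2016

print(f"Original's five-day opening in 2016 dollars: ${opening_adjusted / 1e6:.0f} million")
print(f"Resurgence's entire domestic run:            ${resurgence_total_2016 / 1e6:.0f} million")
print(f"Resurgence as a share of the original's adjusted total: "
      f"{resurgence_total_2016 / total_adjusted:.0%}")
```

Under those assumptions the original's five days come out to roughly $144 million in 2016 dollars, against Resurgence's $103 million for its whole run--about 23 percent of the original's adjusted total.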
In short, it could not but be considered a disappointment in light of the original's massive success--and, for that matter, its own massive $165 million budget--without even a measure of cult-y success as consolation.
That failure, of course, does not seem to require much explanation. The second Independence Day movie was a sequel that arrived long after the original had passed from the very short pop cultural memory of most--the more so because of the kind of movie it was. In 1996, a summer when we had about a half dozen really big would-be blockbusters, most of them quite different from one another (the others were Dragonheart, The Rock, Eraser, the first Mission: Impossible and Twister), Independence Day was something we hadn't seen before--the subject matter at least as old as H.G. Wells, but the handling of the spectacle still something that stood out with its grandiosely CGI-powered depiction of a massive alien attack on Earth's major cities.
Now it's entirely standard.
With the genre's scaling up, and its going science-fictional in the process, every action movie is now likewise a disaster movie--and a monster movie, and an alien invasion movie in some way or other--any "respectable" CGI-packed tentpole apt to have otherworldly creatures wrecking the core of at least one major city before the end credits (as was the case in the same year's X-Men: Apocalypse, Suicide Squad, Batman v Superman and Ghostbusters, to say nothing of the city-busting we saw in Rogue One). As with so many prior action movie phenomena--James Bond, Star Wars--what was once special about the movie became routine, except that Independence Day never really acquired the kind of brand name that a film franchise can go on trading on after it has ceased to be fresh and new (the dashing of which hope is all too evident in the closing scenes, with their desperate laying of groundwork for a third movie that is almost certain not to happen).
It did not help that ID: Resurgence was, in spite of some dazzling effects, a fairly mediocre example of the type, in a far more crowded season--which promptly saw it shoved out of the way by other, more vigorous movies through the rest of the summer, the months that followed, and then the next year.
In fact you probably already forgot all about it until I mentioned it just now, didn't you? (I know I did until recently happening on it when flipping channels.)
Why Does Everyone on American TV Know How to Use Chopsticks?
Watching TV these days one gets the impression that all Americans are adroit users of chopsticks. Suspecting this to be, to put it mildly, a gross exaggeration, I went looking for data on the subject.
An item at Statista claims that, based on self-reporting (people answered the question "How good are you at using chopsticks?"), 4 percent of Americans are "expert," 11 percent "very good," and another 19 percent "fair."
By contrast 43 percent are "not very good" or "terrible," and 24 percent say they have never even tried to use them.
Let us accept these numbers as a starting point in the absence of better ones. Considering them, my guess would be that, this being a matter of self-reporting, people overstate their proficiency--and even then only about one in three claims to be fair or better at chopstick use, meaning at least two-thirds are less than comfortable with them, with a quarter never having handled them at all.
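(A minimal sketch of that aggregation, using the self-reported Statista figures just quoted; the categories sum to roughly 100 percent, give or take rounding in the source.)

```python
# Aggregating the self-reported Statista categories quoted above.
# (They sum to ~100 percent, give or take rounding in the source.)
responses = {
    "expert": 4,
    "very good": 11,
    "fair": 19,
    "not very good or terrible": 43,
    "never tried": 24,
}

fair_or_better = responses["expert"] + responses["very good"] + responses["fair"]
uncomfortable_or_untried = responses["not very good or terrible"] + responses["never tried"]

print(f"Fair or better: {fair_or_better}% (roughly one in three)")
print(f"Less than comfortable or never tried: {uncomfortable_or_untried}% (roughly two-thirds)")
print(f"Never tried at all: {responses['never tried']}% (about a quarter)")
```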
Why does TV give such a different impression? Barring some deliberate effort to master these utensils as an end in itself, the vast majority of Americans--particularly those without strong exposure to East and Southeast Asian culture through personal heritage or travel, or very close connections with people who have had that--are probably only likely to learn chopstick use in the course of frequently going to restaurants serving East Asian cuisine, and there specifically insisting on learning to use the associated utensils. That is something far more likely for affluent (not the 1 percent necessarily, but at least the top 10 percent) residents of big coastal cities than for working class folks, and especially working class folks from rural, small-town, "provincial" areas. This seems all the more plausible given the extraordinary snobbery with which Americans have surrounded some aspects of such dining--for example, sushi consumption (in stark contrast with the very different Japanese tradition).
In short, it seems to me a matter of exactly that subject which remains anathema not just to the right but much of the "left," socioeconomic disparity and social class--and in this case, the social class background of the people who write for TV, and the kind of characters they happen to put on TV (overwhelmingly, affluent coastal types).
Do I think that the hacks in Hollywood and on Madison Avenue are deliberately signaling social class when they casually pack the screen with adroit chopstick use, however? The way that, for example, they signal that a character is meant to strike us as not just our social and economic superior but our cultural and intellectual superior by having them play the piano with concert-performer proficiency, or fence like a master (a visual shorthand that, condescending from the start, is now painfully cliched)?
Of that I am more doubtful. I take it, instead, as simply another reflection of the extreme cluelessness of the hyper-privileged of medialand about the existence of anything outside their painfully cramped little world.
Why Is There So Much Talk About NEETs in Japan?
I remember being introduced to the concept of NEETs (Not in Education, Employment or Training) when watching Welcome to the NHK.
Later I was surprised to find that the concept was not Japanese, but actually originated in Margaret Thatcher's Britain.
I was surprised again when I found out that Japan is remarkable not for a particularly high percentage of NEETs, but for having perhaps as low a percentage as any country's. The World Bank actually has the International Labor Organization's data for the "Share of youth not in education, employment or training, total"--in short, the share of the country's youth who are NEETs--up online here. Japan's last recorded figure was 2.9 percent--compared with the 11.5 percent average for the "high income" countries, and the 28.3 percent figure for "lower middle income" countries. (For the obvious reason that poorer countries have fewer jobs to go round and more limited education systems, high proportions of NEETs tend to go with poverty, not wealth, and things are generally worse that way there than in the richer nations.) The U.S. figure is 13.1 percent, and in Italy, in exceptionally bad shape by First World standards, it is 18 percent, six times as high as the Japanese figure--but probably without six times as much talk about the issue. (In Britain, the home of the concept, the figure is 10.5 percent, three and a half times as high as Japan's, in spite of which, so far as I can tell, I cannot remember a single reference to NEETs in British news media or British pop culture, for whatever that may be worth.)
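(Again, for those who want the arithmetic spelled out, a minimal sketch using only the ILO/World Bank figures quoted above.)

```python
# Comparing the youth NEET shares quoted above (ILO data via the World Bank).
neet_share = {
    "Japan": 2.9,
    "High income average": 11.5,
    "Lower middle income average": 28.3,
    "United States": 13.1,
    "Italy": 18.0,
    "United Kingdom": 10.5,
}

japan = neet_share["Japan"]
for country in ("Italy", "United Kingdom", "United States", "High income average"):
    ratio = neet_share[country] / japan
    print(f"{country}: {neet_share[country]}% -- about {ratio:.1f} times Japan's {japan}%")
```

The ratios come out to roughly 6.2 for Italy, 3.6 for Britain and 4.5 for the United States--which is where the "six times" and "three and a half times" figures above come from.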
One might wonder if Japanese official statistics do not underreport the problem. Still, I am unaware of any evidence that Japan goes so much further in this respect than other states hardly less inclined to play such games, and so for the moment I will take the figure at face value--which raises the question of just why all the fuss. One possibility that comes to mind is that right-wingers looking for scapegoats for the failure of their economic policies in other countries, while certainly quick to slander and libel young people as playing video games in their mom's basement all day, have a greater abundance of other targets for their venom. They can also slander and libel minorities, immigrants and ethnic Others generally as the source of their nation's woes. However, Japan's comparative ethnic homogeneity, insular geographic position, and historically tight immigration policies mean less in the way of such options for those pursuing such a political line. (The Ainu and Ryukyuan peoples together comprise about 0.4 percent of the Japanese population, and fewer than 2 percent of residents are foreign-born. By contrast the population of Britain in 2011 was some 12 percent or so minority, while about 13 percent of the country was foreign-born.)
It might be added that the sense of Otherness in Japan's case in relation to the great majority of its minorities and foreign-born residents is, compared with many other nations, slighter for their overwhelmingly being from neighboring Asian countries, and often, ethnic Japanese from the Japanese communities in South American countries like Brazil. It seems worth noting, too, that Japanese right-wingers, like their counterparts elsewhere, are inclined to claim that, whatever one says about other people's imperialism, theirs was uplifting, and point to countries they colonized like Korea and Taiwan--the very countries from which so many of their immigrants come--as success stories testifying to the beneficence of their rule. And this, of course, makes claptrap about those "lazy, backward" Others and their "s---hole countries" less tenable. The result is still plenty of xenophobia in all its ugliness--but not much of a basis for its utilization in "foreigners are ruining our economy" scapegoating.
One result is that when it comes to the slowing of the nation's economic progress (these days, it should be admitted, not really so much worse than anyone else's; the twenty-first century has everybody hurting this way), the young catch that much more of the flak.
The Hikikomori Phenomenon: A Sociological View
5/15/17
The Hikikomori and the Lost Decade That Never Ended
7/13/15
Thursday, July 23, 2020
Anyone Still Watching Basic Cable?
Remember when basic cable's scripted TV offerings seemed a fairly major part of pop culture? The Disney Channel had shows like That's So Raven and Hannah Montana (despite not watching the channel at all then, I knew there was something called a "Hannah Montana," even if I wasn't sure what exactly it was, an actual person or a character or maybe just a figure of speech), and Nickelodeon had iCarly. AMC's Mad Men was more a hit with Midcult-loving critics and their upmarket followers than the general audience, but Breaking Bad and The Walking Dead were both confirmed popular sensations--while at the more grown-up but still lighter end of the spectrum, USA had Monk, and TNT had Leverage.
As Brad Adgate acknowledged in Forbes recently, hits like these are pretty much nonexistent these days. Where iCarly creator Dan Schneider watched his show pull in 10 million viewers on its best nights back then, his more recent Henry Danger, which Nickelodeon touted in a recent commercial as the #1 Kid's Comedy on TV, was bringing in under 1 million--bespeaking an astonishing collapse in that market. And it seems pretty much the same story elsewhere in this part of TVland.
Of course, this raises the question--just how will basic cable respond? The cancellation of Disney XD's Kirby Buckets in hindsight seems to have been indicative of a shift away from original, scripted, live-action fare in favor of other material--animation, gaming and the rest. It may be that this was an easier course for this particular channel, with its slighter market and margin for failure, and connections with Marvel and so forth, but it seems possible that other channels will be walking the same path, looking for other ways in which to cater to a shrunken base of viewers. One way might be their similarly turning to nonscripted fare. Another might be their doing what the networks have generally done, passing on more "adventurous" material in favor of more conventional fare--like the procedurals they keep churning out, very profitably.
Still Waiting on '90s Nostalgia
I have, from time to time, remarked on hints of a belated interest in '90s nostalgia (in the feature film version of Baywatch, or the sitcom Schooled, for example), but by and large these have proven just flickers (the Baywatch film was no blockbuster, and Schooled was cancelled after a mere two seasons), while a glance at the line-up of films initially scheduled for release in 2020 made clear that where nostalgia was concerned the '80s, after all these years, remained king.
And so I found myself thinking about the faintness of '90s nostalgia to date, developing some thoughts I had earlier a little further. It seems to me that one can simultaneously say several different things about the '90s--each true about different aspects of it, and altogether meaning that the era lends itself less well to nostalgia.
#1. The '90s Never Happened.
To the extent that people imagined the '90s would, in politics, values and the rest, represent a break with the '80s--a reaction against Reaganomics and neoliberalism and neoconservatism and Cold War, a shift of our thinking in a more socially and ecologically conscientious, internationalist, de-militarized direction, or even a rebirth of countercultural radicalism ("Once we get outta the 80's, the 90's are going to make the 60's look like the 50's," '60s radical Huey Walker promised in Flashback)--the '90s never happened, amounting instead to a brief and quickly crushed hope that I suspect few even remember. What happened instead was that the '80s simply went on and on in this respect, the right pressing rightward rather than retreating amid the "end of history." (A glance at the Clinton administration's record of social and economic policymaking--government budget-cutting, welfare "reform," deregulation for finance, low taxes on the rich, Alan Greenspan at the Federal Reserve, free trade, etc., etc.--certainly makes it all too clear that it was a "continuation of Reaganomics by other parties.") And we cannot be nostalgic for what never happened, can we? (Indeed, it seems to me to say a lot that those things of the '80s for which we see nostalgia are, by and large, the things of childhood--while those who express a dissenting impulse, for lack of anything else, still look back to before the '80s, as we see in films like Roman J. Israel, Esq., or Joker.)
#2. The '90s Did Happen, But Was a Transitional Period Rather Than a Distinct Phenomenon of its Own.
I have certainly had this impression thinking about film during this period--the action film in particular. We all know the distinctively "'80s action film" Hollywood turned out in that decade--Schwarzenegger and Chuck Norris and Rambo and Die Hard and the rest. And we all know the sort of action film that has prevailed in the twenty-first century--dominated by superheroes and their heroics, with Spider-Man, Batman and the Avengers delivering one billion-dollar hit after another. But what about the '90s? Close examination, time and again, has convinced me that it was a transitional era in which people still flocked to see Schwarzenegger in Eraser, while Hollywood, in spite of the early success of the Tim Burton Batman franchise, was making mostly second-string superhero movies, and Independence Day was novel enough to be a major event (while two decades later its sequel was predictably lost in the ever-larger summer crowd).
#3. The '90s Happened--it's the Period After the '90s That Didn't.
Perhaps the most unambiguous change in the sort of details of daily life that nostalgia-hawkers interest themselves in is what people call "technology." Yet what in the heady '90s seemed to be the beginning of a new wave of digital wonders can, in hindsight, seem like the climax. Personal computing, Internet connections and cellular telephony all exploded then--and while they have become cheaper, more portable and faster, enabling fuller exploitation of their potentials, it seems that nothing since has been quite so radical. Indeed, much of what was expected to follow soon--bodily immersive virtual reality, telepresence, self-driving vehicles and other equally dramatic artificial intelligence aids to daily living--proved rather further off than imagined, things we are still waiting for two decades on, with "digital personal assistants" like Alexa still for the most part a novelty, and many still feeling themselves in a position to express a "Git a Hoss"-like disdain for the idea that cars will drive themselves anytime soon. The result is that, rather than being in the '80s, we remain in the '90s--and equally cannot get nostalgic for what never departed.
Of course, none of this means that there are not things that I look at and find to be particularly "'90s." Thus does it go with, for instance, those syndicated action hours one can still see rerun on channels like H & I (for now). They offer a style of entertainment not quite so available before, and clearly vanished since. Yet such items strike me as sufficiently minor as to leave those building entertainment out of nostalgia little to work with.
No Time to Die: Commercial Prospects
If memory serves, No Time to Die was the first blockbuster-grade movie to see its release deferred as a result of the pandemic--in this case all the way to late November (more than seven months after its initial release date). This will mean its coming out five years after Spectre--the longest gap in the series' history apart from the span between Licence to Kill and Goldeneye.
Given the surprises this year has had for us all, and for that installment in the franchise in particular, I am in no mood to make predictions--but still comfortable with hypotheticals. Let us assume the release goes ahead as now scheduled, in something like a normal film market, where a comparison with that older film is actually useful. Goldeneye was, of course, a success, the first Bond movie to break the $100 million barrier, which did not quite mean what it had before, but even adjusted for inflation reversed the generally downward trend in the series' grosses since the '70s, an especially welcome development after the disappointment of Licence to Kill. (I might add that years later it seems to be widely regarded as, The Spy Who Loved Me apart, the best of the Bond films to appear between the original crop of the '60s and the twenty-first century reboot.)
No Time to Die does not have so much to recover from--but at the same time, a generation later, the concept may be that much more worn, the market tougher, while as I remarked not so long ago, if the Bond series has been borrowing ideas rather than generating them for a half century now, the ideas it has been borrowing seem to me less appealing generally, and certainly less easy to assimilate to the character. (Moonraker's sending Bond into space was wacky, and Licence to Kill went too far in reinventing Bond as a "someone killed my favorite second cousin"-type '80s Hollywood action film hero, but all that was less problematic than origin story prequels and family drama and overlong running times and the rest it took from recent action films.)
In considering all that it seems worth acknowledging that there has been a certain amount of noise over the hints that Bond will get married, and be replaced by a female (and Black) operative. I have no idea how much it will mean amid all the flashy action stuff; after all these decades of concessions to a gender politics less forgiving of the old conception (already by the late '70s things were changing, and I would say Casino Royale pretty much finished Bond off in this department all by itself), the bigger issue seems to me something more basic. Already by the late '70s the illusion that this was more than just another action series was beginning to fade, and by the '80s it was pretty much gone; if at times the franchise seemed to defy that with a movie that persuaded much of the audience it was an event (because of a longer than usual span of time between installments, because of a new concept), as Casino Royale and Skyfall managed to do, that is an increasingly difficult trick to pull off--while pulling it off seems more essential than ever to selling tickets.
NOTE: Between the time I drafted this post, and my being able to post, there has already been new talk of No Time to Die getting bumped again, all the way over to the summer of 2021. (Things are moving too fast for anyone to keep up with these days.)
Tuesday, July 21, 2020
What Ever Happened to Tennis?
Recently considering how the image of luxurious, glamorous living that film and TV offer has changed, I happened to mention the scale of the houses we see--which seems to me a matter of the exploding fortunes of the wealthy. Still, not everything seems easily explicable in such terms, with the changing nature of pastimes an obvious case. In our movies and television shows the rich and glamorous still play plenty of golf--but, I think, rather less tennis, a game that once seemed ubiquitous in that kind of fare. Jonathan and Jennifer Hart, after all, were no strangers to the tennis court. In the opening montage of the first season of Charlie's Angels, during Farrah Fawcett's bit we see her swinging a tennis racquet. What did the original Bionic Woman do before becoming an OSI agent/junior high school teacher? She was a tennis pro.
And so on and so forth.
All that is rather less evident now.
It seems plausible that this is bound up with the decline of interest in tennis more generally.
The generally accepted version of the story seems to be that tennis was, for the broader public, fresh and new in the late '60s and early '70s--thanks to the airing of major tournaments on television as the sport newly went pro--while colorful personalities and gender politics (John McEnroe's temper tantrums, Billie Jean King's feminism) are held to have sustained interest for a time. That could not and did not go on forever, while the relative decline of American players' prominence at the sport's upper levels reduced interest in the U.S. (kind of like with boxing). Given that even at its peak tennis was never exactly football or basketball (the U.S. Open no Super Bowl or NBA Championship even at its long-ago height), all this left it vulnerable when "secondary sports" got squeezed out of press coverage and television broadcasts (also like boxing)--leaving it a long way from its glory days, and TV writers and producers less expectant that an audience will be suitably impressed by the sight of their protagonist on the court.