It seems that Lenny's Pizza in Bensonhurst, Brooklyn, is closing down.
I have to admit that I didn't even know about any Lenny's Pizza until the news story about its closure, occasioned principally by its appearance in Saturday Night Fever. Still, all the talk about the film did remind me of how the film struck me when I first saw it, long after its initial release.
I remember being surprised, very surprised, by just how bleak the movie was. The innumerable evocations of the film I'd seen in other films (like Airplane! or Short Circuit) all emphasized the disco stuff. But the movie was not just about Saturday Night. It was also about "Sunday morning," and all the other mornings, noons and nights of the rest of the week as experienced by a bunch of working-class kids leading dead-end lives in an America that, as we see even more clearly with the hindsight of nearly a half century, was facing diminished prospects after the end of a post-war boom that proved a singular event rather than a "new normal"--and the unexpected urban grit made more of an impression on me than anything else in the film.
Indeed, the fact that a movie like this was made as a big feature, and became a big blockbuster, struck me as itself indicative of its having come from a different time, and the bit of film history I have learned since has only affirmed that. In 1977 "high concept" film was already a-borning (Jaws had come out two years earlier, while Star Wars had made its debut six months before), but this was still the era of the "New Hollywood."
By contrast it was very different with that obvious counterpoint, the following year's screen adaptation of the hit stage musical Grease, in which John Travolta played a figure very similar to the one he played in Saturday Night Fever (Danny Zuko, of similar social background and not much more intelligence) in a far more cheerful tale. The grit scrubbed out of the picture, it was a bright, upbeat, nostalgic story in which the things that went wrong for the characters in Fever went right for them here. (Unlike Fever's Bobby C., Kenickie turns out not to have gotten his girlfriend pregnant after all--and even if she had been, he was ready to "make an honest woman" out of her, rather than performing suicidal stunts on a bridge and falling to his death.)
I suppose Hollywood was still able to take a hard look at the less pleasant facts of contemporary life--but it was increasingly inclined to make a principal product of looking at the past through rose-colored glasses, the New Hollywood starting to give way to the "New Old Hollywood" increasingly dominant since. Now anything like the working-class drama of Fever is apt to be the stuff of little indie movies almost no one will see, or feel-bad prestige TV that a certain kind of cowardly ideologue will sneer at as "poverty porn"--while we are barraged with commercials for the upcoming Paramount+ prequel to Grease, Grease: Rise of the Pink Ladies.
Thursday, February 23, 2023
Baywatch as a Piece of Pop Cultural History
Baywatch had its premiere on NBC back in 1989 and actually was a network show for its first year, but after its cancellation by the network's execs it continued in first-run syndication.
It didn't begin the syndicated TV bonanza of the late '80s and '90s; Star Trek: The Next Generation, which launched in 1987, deserves more credit for really getting that trend going. But Baywatch was an especially large success (the "most watched show in history"), which, besides inspiring a good deal of direct imitation (like a four-season revival of Flipper), more broadly encouraged the boom (helping pave the way for such hits as Xena: Warrior Princess).
Watching it now one is reminded of other aspects of how TV used to be: the lush opening credits sequences whose theme songs would become pop cultural standards; the episodic plots; the lightness of tone, even amid danger-filled action--not taking itself too seriously but also not being obnoxiously flippant. Even more striking in comparison with today's lighter fare is what the show lacks: the "edginess," the meanness, the pretension and the self-satisfaction that attend all these things we take for granted even in network sitcoms, the makers of the show having endeavored to please rather than offend. All this made it the kind of easy viewing one finds only in certain niche sorts of programming these days (like the movies Hallmark and its rivals put on the air, which are generally not quite like this). There is, too, the profound L.A.-ness of the show--and L.A.-ness from when L.A. was the embodiment of the American fantasy of "the good life"--which, again, befit the light entertainment approach, people wanting to imagine themselves someplace more alluring than where they happen to be.
Owing to how well it all went over, the show stayed on the air long enough for its trajectory to become bound up with the changes in television a decade after its launch. Amid the increasing crowding of the market, followed by its peaking--and, one might add, the general decamping of "Hollywood" Hollywood from what had so long been the default setting for American film and television production, and even from the Lower Forty-Eight--the production relocated to Hawaii. This bought it no more than an additional two years, what was now called Baywatch: Hawaii airing its last episode on May 19, 2001.
In its way that date seems to me another little marker of the end of the "'90s" and the beginning of the twenty-first century, with the franchise's fortunes since confirming the distance pop culture has traveled--by and large, not for the better. Unlikely material as it was for a big summer movie, that was exactly what Paramount offered up in 2017--the script's writers opting to make the characters look like idiots in what was (mostly but not always) a parody of the original, heavy on frat-boy gross-out humor and rather less heavy on the appeal that drew viewers back week after week. (It was not even made in L.A., but instead in Florida and Georgia, and set in Florida's Broward County!)
It was no surprise that even the poptimists and outright claqueurs who now pass for "film critics" were unimpressed--and, so far as I can tell, it was no hit with fans of the old show either, the movie enjoyed or not enjoyed on its own terms rather than as anything to do with that original. (Mostly "not enjoyed," to go by the ratings I have seen on the Internet Movie Database.)
Similarly a sign of the times would seem a comparison of Baywatch with that more recent fantasy of life in L.A., Entourage. With Entourage we have the celebration not of "golden Los Angeles" so much as "golden Los Angeles as experienced by Hollywood's favored few," courtesy of some of those few, who again showed us the distance pop culture had traveled--and again, not for the better. The way this sort of thing is supposed to work is that we the audience feel ourselves there, vicariously living it up in this fictional world. Instead Entourage always made its viewers feel as if they had their noses pressed to the glass, looking in on someone else enjoying themselves at a party from which they were barred--those they watched a pack of thoroughly unlikable idiots. Indeed, one of the more perceptive comments about Entourage described it as "the most taunting show in TV history . . . show[ing] us what we're missing, and rub[bing] our faces in" it. For all the sun, the glamour and the rest, it made the viewer, just as much as any episode of Mad Men, think "I hate the world, and everything in it, myself included," depressing them without having earned the right by at least trying to say something important. So it goes when a show comes from the "mind" of Mark Wahlberg, I guess, though in fairness he is a product of his time and milieu--even as he does that time and milieu no credit.
Tuesday, February 21, 2023
Remembering Baywatch
It has been a third of a century since Baywatch first hit the airwaves, and on streaming it is back, remastered and as bright and glossy as ever.
The show was unpretentious, light entertainment, and absolutely unashamed of the fact--indeed, it made the most of it, in a manner not unique to it (one finds that even action shows like this one, the original Magnum or MacGyver for instance, tended in the same direction), but arguably more successfully than its contemporaries. Helpful here was the setting and its associated imagery--the sun, the sand, the surf, and of course, the Baywatch beauties, every bit of it redolent of the mystique southern California had not yet begun to lose, and all of it shot in that glorious music video style that the makers of television for the small screen were just then mastering, as the makers of film for the big screen had before them. Helpful, too, was the fact that if the show did not wholly shut out reality (indeed, it shut it out rather less than a good deal of "serious" drama praised to the skies by the critics), it rarely got gritty or brutal. It made the viewer feel that the characters inhabited a world where yanking Darwin Award aspirants out of the water was the biggest problem with which the world had to contend, and being a lifeguard on the beach accordingly the most important job in that world--a situation that, I think, most of us would regard as a vast improvement over the polycrisis-ridden world in which we had even then been living for as long as anyone could remember.
It was a winning combination--according to the legend, at any rate, drawing well over a billion regular viewers globally at its height and making it the most-watched show in history. That is a title no show would seem likely to claim from it in this crowded and fragmented media universe, all the more so as the strategy for surviving in the ever-more brutal attention economy seems to be to aim for intense appeal at the expense of wide appeal--and as much of what made this show such a hit is now so unfashionable as to be virtually impossible (and in some cases unlikely even to be tolerated were it attempted).
Still, the fact that it isn't quite like anything they make now is the more reason for its lingering in the public consciousness the way it has, its stars still pop cultural icons. Thus, looking at a list of the 500 most searched-for keywords, I found that, a generation after her departure from the show, Pamela Anderson is still on that list (at #435), making her one of the most searched-for celebrities in the world. Meanwhile, one of the very few ahead of her is her fellow Baywatch alum Carmen Electra--at #87, putting her behind only Kim Kardashian!
Looking over her list of credits there can be no doubt about which of them contributed most to that standing.
Of Greedy and McTeague
I remember how back in March 1994 the film Greedy came out with what seemed somewhat more than the usual fanfare for a late winter/early spring release in those days.
As it happened the critics received the film as less than a complete success, while the box office was not boffo. (According to Box Office Mojo it was not even among the top hundred earners at the U.S. box office in 1994.) Unsurprisingly the movie seems to have been quickly forgotten afterward--and, never having been rediscovered and acquired a cult following the way some films do, it has remained in the obscurity into which it passed. Still, I enjoyed it at the time and think it had its good points, not least a twist-filled plot more than usually ambitious for comedies of the day, and some memorable performances--not least those of screen legend Kirk Douglas as the scheming family patriarch, Olivia d'Abo as his "nurse," and Phil Hartman as the nastiest of his would-be heirs.
The film's cinematic and literary allusions, which went over my head at the time, seem at least a point of interest. The title Greedy was a play on Erich von Stroheim's classic silent film Greed (1924)--itself an adaptation of Frank Norris' novel McTeague (1899)--with all this, of course, reflected in "McTeague" being the name of the family in question.
Of course, that--and the fact that greed is indeed a theme of the film--is as far as it goes. Norris' novel, to the extent that it is remembered today, is recalled as a classic of Zolaesque naturalism in all its social insight, and the forcefulness, even brutality, with which it conveyed that insight. (McTeague centers on a love triangle involving two friends and the woman they both want, the three members of which may be fairly described as ruining and killing each other--with their deaths prefigured by the destruction of another couple before them.)
By all accounts Von Stroheim's Greed was faithful to that. By contrast Greedy was a high concept comedy, and not an especially dark one, which delivers the expected happy ending--with the result that in the end the title of the film and the choice of McTeague as the family name are just allusions, nothing more, with even a relatively favorably disposed viewer of the film like myself not ready to claim any clear evidence that the makers of the film had anything much deeper on their minds.
Mike Davis' City of Quartz: Excavating the Future in Los Angeles
The novelist Frank Norris remarked that the United States had only three "story" cities, and identified them as New York, New Orleans and San Francisco. I'm not sure that I would have ever wholly agreed with that list's being so limited. (What about, for example, Boston or Chicago, where Norris was actually born--and to which he did devote a major novel, the second installment in his Epic of the Wheat, The Pit?)
However, I certainly tended not to think of Los Angeles as on that list--even though it was the default setting of our screen stories from the sitcom to the action movie. Compared with those metropolises of which Norris wrote, and the other older American cities with them (like the now nearly four-century-old Boston), L.A. seemed not so much "young" as "new," and if in portions glittering, essentially hollow--a spot on the map rather than a place, one that (apart from a handful of Hollywood landmarks I knew to be only a very small part of the whole sprawl, and remote from the everyday lives of almost everyone living there) lacked local character, or even simple visual distinctiveness. (I remember reading a review of 1997's disaster movie Volcano at the time of its release which lovingly detailed the succession of L.A. sights devastated amid the chaos. It just looked like generic modern cityscape to me.) Reflecting this, when toiling away on thriller plots in those days I put it very far down my list of potential settings. (Where North America was concerned I inclined to New York--with legacies of that to be seen in both Surviving the Spike and The Shadows of Olympus.)
It was only reading Mike Davis' City of Quartz that made me look at Los Angeles differently. As with many of Davis' books it is less a neatly unified text than a loose collection of essays examining such varied subjects as the emergence of the city's "power structure," the views European visitors took of the place, the penchant for fortification evident in the local architecture, the counter-insurgency-like policing of the Daryl Gates era, the locale's homeowners' associations, the Catholic Church's role in local life, the economic fortunes of nearby Fontana, California--and of course, how the city has been "marketed" nationally and internationally. Still, together they covered a very great deal of territory, from a great many angles, and in the process gave me a sense of Los Angeles as a "place" it had not had for me before, and a better appreciation of it not just in itself but as American myth, dream, fantasy and nightmare all at once--and in every one of those respects important in American history, culturally as well as materially.
Of course, others remember the book differently. Davis, whose work since that time includes such titles as Late Victorian Holocausts, Planet of Slums and, in his other writing on L.A., Ecology of Fear: Los Angeles and the Imagination of Disaster, is the sort of public intellectual who gets little mainstream attention. (In line with its centrist prejudices the media is so much more fearful of the left than of the right that, where the left is concerned, it hews to the view that "any publicity is good publicity"--and so denies it even that, smothering it with silence.) And what attention Davis has got has often not been positive--sneers about a proneness to "catastrophism" being a cliché of the dismissal of his work.
Also becoming a cliché in its way has been the tendency for those dismissals to look foolish later on. Having anticipated a "social explosion" in Los Angeles in City of Quartz, Davis looked inspired rather than scare-mongering in the wake of the 1992 Los Angeles riot. Likewise, as COVID-19 upended the world in ways few have the courage to acknowledge even three years on, his book about the danger of a pandemic meant a spike in demand for his comment. One would imagine, too, that the ongoing spread of avian flu--which seems to have played a part (but only a part) in the soaring price of eggs, conspicuous even in this moment of soaring prices for everything--would have made him similarly in demand now were he still alive, that particular threat having been the focus of the aforementioned work.
The California Dream and its Waning
Reading Mike Davis' City of Quartz I was surprised to find the extent to which Los Angeles is a product of real estate boosterism--the vast metropolis emerging before its industrial base, rather than the industrial base emerging first to become the foundation for a metropolis (with, contrary to what we may imagine given California's association with aerospace, computers, and so much else, manufacturing blooming only late in the state). Surprising, too, was the extent to which that boosterism was politicized in the late nineteenth and early twentieth centuries. Southern California was promoted as a "refuge" for old-stock WASPs from the newer waves of immigrants, with their very different ethnic and religious backgrounds, with an element of health faddishness mixed in to produce a vision of "racial revival under a Mediterranean sun." It was also marketed as an "open shop" alternative to trade-unionized San Francisco.
In short it was a thoroughly right-wing vision, compounding a multitude of Victorian prejudices. And indeed, while the "Left Coast" stereotypes have a long history, southern California has long had its right-wing streak, with this evident not only in San Diego or the "Inland Empire," but in Los Angeles itself (certainly evident when one considers its elites, its social divisions, its policing).
Of course, for many on the right the vision of southern California as an ideal has long since given way to a view of it as a "Blue" wasteland exemplifying all they dislike about the direction in which the country has been moving, such that, rather than a refuge, it has in their view become a place to flee (for "Redder" territory inland). Meanwhile they have not been alone in watching, wondering and worrying, if for different reasons. Davis, coming from the opposite end of the political spectrum, called into question the viability of a metropolis built in that precise geographical location, with its susceptibility to earthquake and fire. He pointed, too, to the distinctly human-caused problem of the deindustrialization of what had once been one of the country's, and the world's, greatest industrial centers (at its height reputedly the second-biggest carmaker in the world after Detroit in its fabled heyday), with all that implied for its social sustainability. Amid an epoch of (likely climate change-amplified) drought, with water short and each summer's fires more horrific than the last--as deindustrialization and its associated social stresses have mounted--one wonders if that writer, so often sneeringly dismissed as a catastrophist but vindicated by events, was not as right on this score as he has been on so many others.
A Zombie (Firm) Apocalypse?
The term "zombie firm" apparently goes back to at least 1987 but, unsurprisingly, acquired its current popularity with economic and financial commentators amid the twenty-first century's pop cultural obsession with the phenomenon.
In considering it one should note that there is no generally accepted definition of the term "zombie firm." However, the usage generally denotes a firm that is "economically unviable" in a particular way, namely its surviving on the extension of credit it has virtually no hope of ever repaying--such that the firm may persist beyond the end of its natural life in a condition that, less than fully "alive," may be better understood as simply "undead."
With zombie firms' survival dependent on easy credit, the concern that there are many more of them about than usual (and, perhaps, that firms of greater weight than usual fit the zombie profile) naturally grew during the protracted period of extremely low interest rates which followed the 2007-2008 financial crisis--and with it the concern that should those unsustainably low, sub-market interest rates go up these firms will suddenly find credit no longer so easy, and the undead become simply the dead, all the more so as just such a surge in interest rates has been underway since early 2022. Should it be the case that there are more zombies about--and that those zombies are larger and more central to the economy--then those zombies' being cut off from credit and finally dying out will mean that much more economic pain as a result of the tightening of credit.
Of course, trying to assess the matter in any rigorous way means trying to precisely establish a standard, and here there is, for the time being at least, room for individual judgment--which is to say, also interest and prejudice. Those looking to justify the monetary policy of the last two decades--to say that the ultra-loose monetary policy that prevailed for almost a decade and a half has been beneficent, and that the current rise in interest rates is also essentially beneficent--seem to be inclined to think that zombie firms are not a very big problem. (Indeed, that is what a 2021 Federal Reserve study of the matter argues, while I have seen nothing from this direction to indicate otherwise.*) By contrast others leerier of some part of this trajectory--for instance, Wall Street figures for whom the interest rate can never be low enough, unhappy with its current upward direction, perhaps especially those concerned with sectors that may be more vulnerable than the average (as with venture capital-funded tech, or the real estate sector that may be the country's "most zombified")--display less equanimity about zombie firms' economic impact.
For the time being the U.S. is not, officially, in recession, and indeed there has been a recent burst of optimism about the possibility of a "soft landing" in the wake of a decade and a half of monetary chaos--but the present trend (which includes a further ratcheting up of the interest rate) suggests that in even the most optimistic view (to say nothing of a less optimistic one that would seem better founded) such companies will find their path getting steeper, in the process revealing just how many zombies really are walking the Earth among us.
* The cited Federal Reserve study used for its standard the classification of a zombie firm as one with "leverage above the sample annual median, interest coverage ratio . . . below one, and negative real sales growth over the preceding three years," which means a company in the unenviable situation of being in the top 50 percent with regard to the ratio of debt to equity; earnings before interest and taxes that are less than the interest it must pay; and shrinking sales for three years running; such that it can't even keep up with its interest payments, while its earnings are moving in the wrong direction from the standpoint of its ability to pay, let alone get out from under its debt. (For what it is worth this seems to me a fairly reasonable standard.)
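The study's three-part screen is simple enough to sketch in a few lines of code. To be clear, this is my own illustration of the criteria as quoted above, not the Fed's actual methodology or data; the field names and sample figures are invented for the example.

```python
# Hypothetical sketch of the zombie-firm screen described above:
# leverage above the sample median, interest coverage ratio below one,
# and negative real sales growth over the preceding three years.
from statistics import median

def flag_zombies(firms):
    """firms: list of dicts with 'leverage' (debt-to-equity ratio),
    'icr' (earnings before interest and taxes / interest expense),
    and 'sales_growth_3y' (real sales growth over three years)."""
    lev_median = median(f["leverage"] for f in firms)
    return [
        f for f in firms
        if f["leverage"] > lev_median      # top half by indebtedness
        and f["icr"] < 1                   # earnings can't cover interest
        and f["sales_growth_3y"] < 0       # shrinking real sales
    ]

# Invented sample: only firm B trips all three criteria.
sample = [
    {"name": "A", "leverage": 0.4, "icr": 3.0, "sales_growth_3y": 0.05},
    {"name": "B", "leverage": 2.5, "icr": 0.6, "sales_growth_3y": -0.10},
    {"name": "C", "leverage": 1.8, "icr": 1.2, "sales_growth_3y": -0.02},
]
print([f["name"] for f in flag_zombies(sample)])  # → ['B']
```

Note that the leverage cutoff is relative to the sample, so by construction at most half of any sample can be flagged--one reason different studies using different samples can arrive at very different zombie counts.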
Friday, February 17, 2023
How Would Dostoyevsky's Underground Man Be Living Today?
Reading one's way through the classics of nineteenth century literature one can scarcely avoid Fyodor Dostoyevsky--and while he is perhaps best known through his thousand-page epic The Karamazov Brothers (one big book that justifies its length far, far better than most in this reader's opinion) I suspect many more who actually read him (rather than lyingly say they do) know him from shorter works. Among the more notable is his Notes From Underground--the first-person confession of a man who, today, might be described as a nineteenth-century hikikomori who has used a modest inheritance to isolate himself in an apartment in the Russian capital of Saint Petersburg for as long as his money will hold out.
Naturally I have found my thoughts turning back to it when taking up the subject of social isolation today. In the course of that I found myself thinking about how technology has changed the situation of someone of similar means attempting to live in such a manner--a point that can seem trivial next to some of the issues with which Dostoyevsky struggles, but which has some importance.
In the book the "Underground Man" (as the character has come to be known) was obliged to keep a servant to relieve his household cares (somebody had to go out to get food). Today modern household appliances and online delivery make that rather less necessary.
Meanwhile, Dostoyevsky's Underground Man had little to do in his apartment but read. Today he would have the brave new world of electronic entertainment to keep himself busy--including all the books anyone could handle if he opted to have an e-reader, though I suspect he would have spent a lot of time playing video games. On those occasions when he got the itch to interact with others (which in the book means his barging into the company of people he hates, and who do not want him around, time and again leading to dreadfully humiliating episodes) he could have just done it online--while he would have probably opted for cybersex over his trips to a brothel.
All of this, I think, makes it easier for people to live in isolation--to minimize their contact with others while keeping themselves entertained. And I suspect that many more now do so because of that.
In saying so I render no judgment on "technology"--the availability of which I consider to at least have given people some measure of choice. What seems to me a problem is the condition of a society that makes so many find becoming an Underground Man the least-worst of the options open to them--which is absolutely no testament to its health.
Top Gun 2 vs. Avatar 2: A Note
In 2022 fans of Top Gun crowed over the reception of the sequel, with some justification. The movie was the #1 hit of the year at the North American box office.
But it was a different story in the global market, where Avatar 2 has been the biggest hit, by a long way. At this point the sequel seems likely to finish with a take about fifty percent higher than Top Gun: Maverick ($2.2 billion+ versus the other film's $1.5 billion), with the difference overwhelmingly made by the international market. (Avatar 2, closing in on $1.6 billion outside North America, made more than twice as much as Maverick in the international market, and more in those markets than Maverick made altogether.)
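The arithmetic behind these comparisons can be checked quickly. Note that the text does not give Maverick's domestic/international split; the roughly $0.72 billion North American figure below is my own rough estimate, supplied only to make the example run.

```python
# Rough check of the box office comparison, in billions of dollars.
# Figures for worldwide totals come from the text above; Maverick's
# ~$0.72B domestic gross is my own approximate figure, not the text's.
avatar2_total, maverick_total = 2.2, 1.5
avatar2_intl = 1.6                       # Avatar 2 outside North America
maverick_intl = maverick_total - 0.72    # ≈ 0.78

print(round(avatar2_total / maverick_total, 2))  # ≈ 1.47, i.e. ~50% higher
print(avatar2_intl > 2 * maverick_intl)          # more than twice as much abroad
print(avatar2_intl > maverick_total)             # more abroad than Maverick's whole run
```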
It is a reminder that, even as the world economy seems to be fragmenting, the international markets still matter--and that Hollywood will come out ahead offering movies with broad global appeal, with Avatar arguably the easier sell worldwide than the flag-waving Top Gun.
The principle would seem to apply elsewhere, with a Chinese film industry offering up jingoistic action-adventure and war movies playing well at home (Wolf Warrior 2, The Battle at Lake Changjin), but probably not helping Chinese film very much in its reaching that bigger, global market, adding obstacles to a path already cluttered with them.
The (D)Evolution of Keir Starmer's Rhetoric: From the Ten Pledges to His 2023 New Year's Speech
Back during the Labour Party's leadership contest in 2020 Keir Starmer presented to the world a social democratic platform of "ten pledges," promising higher taxes on the rich, the end of college tuition, "common ownership of rail, mail, energy and water," the repeal of the Trade Union Act, and "the Green New Deal at the heart of everything we do."
He won the contest of course--and then quickly distanced himself from the ten pledges. Already by the time of his speech in answer to the government's budget a year later he limited himself to milder criticisms than would have been expected given his purported vision for Britain, and the "effective opposition to the Tories" that was itself a promise (#10). Not long afterward he officially scrapped the pledges.
Subsequently, as his 2022 Labour Party Conference keynote speech made clear, Starmer has, even while sprinkling his speeches with radical touches (throwing around references to "the 1 percent" as if he were an Occupy Wall Street activist), consistently stressed "fiscal responsibility" (for which purpose he repeatedly declares himself ready to defer the accomplishment of "good Labour things"), and only grown more emphatic in his expressions of a desire for "partnership" with business and his exaltation of entrepreneurs--about whom this leader of the Labour party has much more to say than he does about labor, toward which he is at best insultingly aloof. Even his most ambitiously leftish project, 100 percent "clean energy" by 2030, accomplished partly through a state-owned firm which could facilitate the transition, appears a pious desire rather than a firm promise (whatever that would mean from the same man who not only made it clear that everything else is secondary to austerity economics, but so casually discarded his prior pledges). And to read his subsequent speeches is to be numbed by the repetitiveness of what have come to seem Starmer's clichés.
The result is that Starmer early on more than confirmed any suspicions that he was a Blairite neoliberal-centrist who posed as a leftie to neutralize a leftward challenge, and then, as soon as it was convenient, reverted to his actual politics because, as he keeps saying, "I want to be Prime Minister." Indeed, in his justification of his scrapping of the pledges he exemplifies that particularly neoliberal habit of seeing the correct response to every crisis as a right turn (i.e. "disaster capitalism"), of always dealing with the troubles made by neoliberalism with . . . more neoliberalism.
Of course, Starmer might still "lead" Labour to victory in the next General Election for all that. At this point Britain has already seen over twelve years of continuous, catastrophic Conservative misgovernment (austerity, Brexit, a badly bungled handling of the pandemic in all its facets, a post-war era-style currency crisis that drew a public rebuke from the IMF, and now, as always, more austerity in its wake) under a string of unfortunate Prime Ministers about whom it seems almost impossible to say anything positive--David Cameron (who staged the Brexit referendum as a bluff on behalf of the regulation-haters in the City--some bluff that), Theresa May (whose tears were only ever for herself), Boris "let the bodies pile high in their thousands" Johnson (even if he didn't actually utter the words, his actions spoke them), Liz Truss (who thinks imported cheese is a disgrace and in the wake of presenting her fiscal vision "got outlasted by a lettuce head"), and now Rishi Sunak (who in his youth bragged about having no working-class friends, and probably isn't making many now with his austerity on steroids). In the process they have made it very, very easy for the official opposition to beat them--especially when it is led by a neoliberal who will be treated far more gently by business and the press than his predecessor, after which victory he will, of course, get hard at work disappointing whatever slight hopes he managed to raise in the voters in another turn of the widening gyre.
Wednesday, February 15, 2023
On the Entertainment Media's Coverage of "Bond 26"
It may seem odd that I am writing about the coverage of "Bond 26" rather than about the (as yet nonexistent but widely anticipated) movie itself. However, that coverage is itself odd--sufficiently so as to warrant some remark.
The reason is that the entertainment press keeps talking about the project--even though there is no actual news for them to report, let alone analyze. Either the producers are doing nothing at all, or the preparations are an extremely well-guarded secret. In either case the entertainment press is left with nothing to offer. Yet they keep talking--and in the process recycle the same speculations over and over and over again.
"Who will be the next James Bond?" they ask, and run through (mostly) the same list of candidates over and over again.
"What will the tone of the movies be like?" they ask. Will they stick with the more grounded material and darker, more "character"-oriented tone of the Daniel Craig era, or revert to the more flamboyant adventures of yore?
It was all actually talked to death around the time of Spectre--back in 2015. (You know, before the Trump presidency, before Brexit, before #MeToo, before the pandemic, before lots of things that can not unreasonably leave one feeling that the world has changed.) But the extreme delay in the release of that movie that from the first was expected to be Craig's last, No Time to Die, meant they just had nowhere else to go--as has the fact that in the year since that movie hit theaters there has been so little news for those claqueurs-posing-as-journalists tasked with sustaining public interest in the franchise.
Might that start to change soon? Certainly the film industry seems to be stabilizing, and along lines not much different from what prevailed before--small dramas and comedies ever less likely to be a theatrical draw, and ever more consistently relegated to streaming; but the big franchise action-adventure doing well, as demonstrated over the last year by hits like Spider-Man: No Way Home, Top Gun: Maverick and Avatar: The Way of Water in particular, and likely to be further reconfirmed as we proceed through what looks like a thoroughly blockbuster-packed 2023. The result is that Hollywood will feel more confident about pouring its money into such projects as a new Bond film.
Still, that has only been part of the problem. It seems that No Time to Die, to the extent that it did less well than some hoped (especially in North America), was held back not merely by the pandemic (undeniable as that factor was), but also by the franchise's plain and simple running down. This most venerable of high-concept action-adventure franchises--the one that can truly be credited with "starting it all"--is showing its considerable age, which, if it were a person, would have it looking forward to its pension a few years hence. Audiences were less interested, with this going especially for the younger age cohorts for whom nostalgia plays less part, and the tropes are less resonant (having grown up on superheroes rather than secret agents), in an ultra-crowded market where 007 appears ever less special (indeed, perhaps even less their favored secret agent hero than the now similarly-approaching-retirement-age gang from the Fast and the Furious franchise). This has likely played its part in the producers' hesitations about proceeding--and I have no idea how far along they may be in moving past that to settle on just what approach with Bond 26 will get the public excited about 007 again (assuming, of course, that is not a bridge too far), all the more so as the budgets, and the grosses that alone can justify them, make such movies ever more a preposterous gamble for their backers.
Robert Heinlein's Friday, Again
In Robert Heinlein's novel Friday the character identified as the Boss remarked at one point in the novel that "a dying culture invariably exhibits personal rudeness. Bad manners. Lack of consideration for others in minor matters," and that this "symptom" of a dying culture "is especially serious in that an individual displaying it never thinks of it as a sign of ill health but as proof of his/her strength."
The remark was not a throwaway line, but relevant to the book's theme, precisely because the titular protagonist's problem was that of making a life for herself in a world that really was supposed to be falling apart.
Of course, people have always complained that their "culture was dying"--and as part of the package always complained about the decline of manners. It has especially been the case that the old have always complained about the young in this manner. And for precisely that reason I ordinarily discount this sort of thing.
Still, the passage caught my eye partly because there was a bit more nuance here--in particular the observation that this is something that worsens, with the bit about mistaking rudeness for "proof of strength" especially telling. A culture in which people take such pride in being "assholes" as we see today--in which people who act like "assholes" are worshipped by people whose greatest aspiration in life seems similarly to be to become "assholes" themselves--is not necessarily "dying," but it is far from healthy.
The Screen DC Universe Reboots: What Can We Expect?
The superhero movie genre has been booming from the very start of this century, the entire period seeing the major studios barrage audiences with A-grade, big-budget blockbusters about the biggest household-name characters from the pages of Marvel and DC Comics--accompanied by smaller but still significant traffic on the small screen (in the age of streaming, increasingly an extension of the big-screen stuff, such that you had to watch WandaVision to fully appreciate Dr. Strange 2).
Unsurprisingly, after being so heavily exploited for so long, the genre is not looking its best.
Consider where things now stand. The Marvel Cinematic Universe (MCU) has gone from looking unstoppable in Phase Three to underwhelming in Phase Four, while the DC Extended Universe has been a letdown again and again, such that Warner Bros. Discovery is officially giving up on the old vision (all as it likewise seems to be giving up on the small-screen Arrowverse). Amid all that, lovers of superhero film and television desirous of more would seem to have had, as their best hope for something fresh and interesting, the reports of James Gunn and Peter Safran's work on some grand reboot of the DC Universe for after 2023 (excluding, at least, the decision to make a sequel to The Batman, due out in 2025).
This month they have revealed part of what audiences can expect, including:
* A TV show all about Waller (the character Viola Davis first played in 2016's Suicide Squad), inventively named "Waller."
* A Green Lantern show "in the vein of True Detective."
* A Game of Thrones-style show (blood-soaked soap opera?) set on Wonder Woman's home island of Themyscira.
* A "much more hard-core" Supergirl, her character pointedly shaped by a horrific childhood spent getting to Earth.
* An "impostor syndrome"-themed show about Booster Gold.
* A Batman film (not starring Robert Pattinson) utilizing the content of The Brave and the Bold, complete with Damian Wayne in a "very strange father-and-son story."
That's not all of it, but I think you get the picture.
The entertainment press, of course, is responding with its usual enthusiasm.
I will admit to not sharing the feeling. Admittedly I have been losing enthusiasm for the genre for rather a long time--the genre seeming to me to keep going mainly on the form's convenient fit with the studios' particular marketing requirements rather than on its having anything fresh and new to offer (depending overmuch on such superficial tweaks as the "edginess" of Deadpool and the like for any appearance of newness). But there is also something else evident in the pattern that has me not looking forward to any of these--namely that (as we ought to have expected from the writer who decided "Let's make Scrappy Doo the villain!", in a textbook example of the principle that the Teen Titans Go! episode "The Return of Slade" had to teach, and which the unteachable folks in Hollywood seem absolutely incapable of learning) Gunn is going all-in on "dark," "edgy" and frankly pretentious. (A Green Lantern show "in the vein of True Detective"? Really?) Irksome enough in itself to those of us who think big dumb action movies should actually be fun, it is the more annoying because, rather than some dramatic shift of course in that respect, it actually sounds like just more of the same stuff that I, for one, didn't particularly care for in the first place.
Saturday, February 11, 2023
What Do The Fabelmans, Batgirl and Rust Tell Us About the Future of the Movies?
Especially as two of the three films I name in the title of this post have not even been released--one of them, in fact, unlikely to even be completed, let alone see the light of day--while the other movie has actually been very little-seen, it may be best to offer a word or two about each one of them.
The Fabelmans is a drama Steven Spielberg made on the basis of his experiences growing up--a film that, even carrying the highly salable Spielberg name and much loved by the critics, was probably never going to play like Indiana Jones or Jurassic Park, or even Schindler's List or Lincoln. Still, box office-watchers had expected the film to do better than it has. (Its first weekend in wide release saw it take in a mere $3 million, while, even with the bump from its slew of seven Oscar nominations--including Best Picture, Best Director and two for the cast's performances--all it has taken in at the North American box office is a bit under $17 million, less than half the reported production budget of $40 million.)
Batgirl, as the reader has probably guessed, is a Warner Bros. Discovery production, intended for streaming, starring just that character. The production, which ran into trouble partly on account of the pandemic, and underwent a change of directors, saw its budget swell from $70 million to over $90 million. Subsequently there was a change of leadership at the company, which judged the resulting movie's prospects in either the streaming market or in theaters too poor for the movie to be worth completing, and, according to the prevailing reports, decided to just bury the unfinished movie and take the tax write-off.
Rust is an upcoming Western starring Alec Baldwin (who also has story and producer credits)--and if you have heard about this one it is probably because Baldwin, and the movie's armorer, Hannah Gutierrez-Reed, are up on charges of "involuntary manslaughter" as a result of his having fatally shot someone on the set with a gun he did not know to be loaded.
These are quite different films, with not much in common between them, but each has been recognized as indicative of a significant trend in film production. The Fabelmans has been taken as an indication of the continued decline of the salability of drama as a theatrical experience--people generally opting to watch such movies at home. Batgirl has been taken as an indication of the media companies' drawing back from their earlier splurging on streaming in recognition of its limits as a revenue stream (no substitute for selling $15-$20 tickets in theaters, and indeed unlikely to justify the cost of a mid-to-high-budget production like Batgirl). And Rust--in talk of whose on-set tragedy one constantly hears reference to extreme low budgets and corner-cutting (which, for his part, actor Michael Shannon has called out as part of a bigger pattern)--is indicative of the penuriousness with which those films that are not "the biggest" are now financed.
The result is that, while the pandemic period saw some experimentation with streaming in unprecedented ways--Disney making grade-A blockbusters like Black Widow and Mulan available at home for a surcharge on their opening weekend in theaters--any radical shift in the making and marketing of such content seems to have been made implausible by the results, the studios focusing on those theatrical releases and their as yet irreplaceable $15-$20 a ticket revenue stream. These blockbusters (the enduring place of which is demonstrated not just by failures like The Fabelmans but by undisputed successes like Spider-Man: No Way Home, Top Gun: Maverick and Avatar: The Way of Water) are ever more dominated by a very few genres, above all action-adventure (and to a lesser extent, splashy animated features dominated by musical comedy), to the point of everything else not only being crowded off the multiplex screens, but people falling out of the habit of catching it there (arguably, the more easily as the theatrical, giant-screen experience provides less "added value" with other movies than it does with the noisy spectacles).
Meanwhile, with producers disabused of their illusions about streaming, everything else is being made more penuriously, not only as Rust was, but such that WBD and its rivals would probably not authorize a project like Batgirl today (too costly to be a practical for-streaming project, but still too small a production for theaters, with this logic seemingly confirmed by the bigger budget for a Blue Beetle movie intended for theatrical release), while a movie like The Fabelmans, especially when not made by a Spielberg, has ever less chance of being authorized as a production on such a scale.
In short--giant action movies and cartoons at the theater, and the relegation of pretty much everything else to cut-rate streaming, with very little in between. For the film-maker and the film-lover alike it is not a happy situation--a hardening of the already oppressive constraints on what can be made. However, it seems far better to acknowledge that forthrightly than to indulge in the mindless boosterism so tiresomely standard for the entertainment press--the more so as it is really just a confirmation of where most of those with any observational skills at all knew things were going well before the pandemic.
Are Commercials More Annoying Than They Used to Be? Yes, Yes They Are.
A certain sort of person shrugs off other people's observations about the ways in which the world may be changing with the view that everything is always the same, and any perception of change subjective and meaningless. The bias of contemporary culture toward respect for ideas that look like "eternal verities," the frequency with which people offer superficial observations that are easily debunked, and the "air of superiority to the world at large" with which such statements tend to be made, allow that shrugging banality and cop-out to pass for great wisdom. But all the same, the objective, material world is out there, and we are undeniably part of it and affected by it. Yes, there are times when life really is getting harder for most people--and where such evidence goes it is hard to argue with something like life expectancy, falling life expectancy being proof of genuine hardship. Consider the trend of things in the U.S. this past decade: according to CDC/World Bank data, male life expectancy (already eroding before the pandemic) fell by over three years in the 2014 to 2021 period alone (from 76.5 to 73.2 years), and female life expectancy by a more modest but still significant two years (from 81.2 to 79.1). All of this is the more the case as it is virtually undisputed that the deaths lowering that expectancy (the pandemic apart) are "deaths of despair" (alcoholism, drug abuse, suicide, etc.), while the averages conceal significant differences, the poor suffering more than these figures indicate. Even before the trend had progressed so far the richest 1 percent of men lived 15 years longer than the poorest 1 percent--while the erosion of life expectancy in the years since would seem to have hit the poor harder than the rich (the portion of it related to the pandemic included) . . .
"Wait, wasn't this going to be about commercials?" some readers are doubtless asking.
Yes, in fact it already is. Specifically it seems to me that in a world where people are suffering so much--the actual deaths only the most extreme outcome of the untoward larger trend--even the little things are weighing on us more.
Like the annoyance caused by commercials.
Still, I do think the actual experience of those commercials is relevant. There is the way advertisers strive harder and harder to penetrate an ever-more cacophonous media universe to reach a hyper-saturated audience that wants absolutely nothing to do with them, and has developed defenses against them. There are such disgusting "innovations" as autoplay for videos no one would choose to play, so that before you have even spotted the video you hear the commercial blaring, and are left hurriedly trying to locate it and shut off the distraction so that you can get on with what you came for. There is the way that, when we go online to some website, we are hit in the face with a popup demanding we disable our ad blockers--the element of coercion, and the fact of our putting ourselves to the inconvenience of disabling our protections (and then having to enable them again as soon as we leave), acquiescing in their harassing us, making it the more distasteful. And there is advertisers' readiness to make their content annoying as the price of keeping us from shutting them out mentally--one tactic being the design of commercials so that after seeing them a hundred times you are still not sure what they are about, or even what they are selling, this approach of catching the viewer's attention to insidiously imprint some subconscious association unbelievably cynical and obnoxious (while in the case of some of us it makes us wonder just what the commercial was about, so that we consciously pay attention, manipulated yet again).
Meanwhile there is the ever-greater attention people are encouraged to devote to "status politics." The result is that the cultural traditionalist will not only be annoyed by what they experience as a ceaseless barrage of "representation," "counter-stereotype" and the like, but the sight of it will be the more charged because of that larger context--every commercial, like everything else, a battle in the culture war. Others who are not averse to such politics may still be annoyed by what they object to not for "wokeness," but simply the "corporate kind." And still others who have no objection to even corporate wokeness will be the more annoyed by the occasions when a commercial is not so "woke." (In short, everyone walks away unhappy, which frankly is the point of elevating such politics.)
One may add that the politics can seem to have had a subtler impact on the commercials. Traditionally commercials relied on the casting of physically attractive persons. They relied on an element of glamour and, yes, sex appeal. Humor was important, too. It is not always clear that all this actually helped sell more goods, and often it annoyed as many as it pleased. But for those who were amenable to the content--frequently to the point of, every now and then, actually liking the commercial in question, even when they didn't like commercials generally--it made living in an advertising-soaked media universe easier to take, a single good commercial buying a measure of forgiveness for a lot of bad ones. Now--and I admit this is not the sort of opinion you would see uttered in the mainstream media--in the view of a non-negligible portion of the market, judged by those traditional, conventional criteria, advertising has retreated from the use of attractive models, from glamour, from sex, while altering its use of humor (much of which now seems to revolve around subverting the old expectations by delivering precisely the opposite of what people would have expected--and, where they would have preferred the old way, frankly annoying them rather than making them laugh). For those who feel that way all this removed the sweetener, leaving just the bitter pill that was an unwanted sales pitch being shoved down their throat--while those who saw the now-vanished content as objectionable merely saw something objectionable diminished, the commercials made less unpleasant in certain specific ways, which is a different thing from their actually being made pleasant, the bitter pill itself remaining just that. The result, of course, has been an audience with still more reason to be annoyed by the multimedia onslaught on their senses and their wallets.