No one will ever admit it at a meeting of composition faculty, but it is no secret that students commonly find taking a composition course a painful experience. Moreover, this is not some recent development that cranky old people can chalk up to "these lazy kids today," but rather seems to have been the case for as long as composition classes have been around. Thus did William H. Whyte quip off-handedly in his 1956 classic of sociology The Organization Man that "As anyone who has ever tried to teach composition knows, the student who has yet to master it would give anything to be done with the chore."
It seems to me worth discussing just why that is. For the time being let us set aside the familiar problem of the teacher who gets underprepared, lazy students cynical toward schooling in general; and the student who gets an undertrained or unmotivated teacher working with a questionable textbook and questionable teaching strategies. These situations are, of course, ever present. But even more fundamental is the essence of how composition classes work, what they demand of students, and how students experience them--when they are taught at their best to receptive students as well as when they are taught badly to the underprepared and unwilling. As it happens, even here one does not have to search very far for factors that can make even a reasonably willing and able student feel ready to "give anything to be done with the chore."
Five strike me as particularly noteworthy.
Reason #1: Composition Classes are Mandatory.
First and foremost, composition tends to be mandatory. This may not seem terribly unusual. After all, the core curriculum in colleges tends to feature many different requirements. However, in fulfilling a math, science or humanities requirement students usually have a range of options. In practice, at least, these tend to include options which are easier than others, giving those students unenthusiastic about course work in a subject area a relatively simple way of meeting the requirement and moving on with their college career. (Not a math person? You can take "Finite Math." Or maybe even just a computer course.)
By contrast, everyone is likely to have to take the same "Composition 101" course, and often a "Composition 102" after it, with no wiggle room. This means that a great many students who would not choose to take composition--because they do not see it as relevant to their later studies and personal career plans, or simply because they do not like it--are forced to do so, and for not one but two semesters.
All by itself, this is enough to fill composition classes with students dubious about the course. The fact that many find the class more difficult than they were led to believe does not help.
Reason #2: Composition Classes are Skills-Centered.
Composition is not a subject in which students amass facts and gradually learn to process them, the way they might in a science or history course, for example. A well-taught science class would teach them, alongside some of the basic facts of the field, something about thinking scientifically (a firmer grasp of the scientific method, for example), while a well-taught history class would teach them something about thinking like a historian (as by teaching them to evaluate sources and use the information they yield to reconstruct what happened in the past, and how and why). However, at the more basic stages the base of facts would be primary, the subtler skill secondary.
Composition is the extreme opposite, the mental skill almost everything. This means that the emphasis is on brain-work that, while demanding and essential, proceeds in a slower, fuzzier fashion than the memorization of facts.
For example, a student cannot learn to write a good thesis on demand just by memorizing a textbook definition of a thesis (which tends to be badly written anyway). Instead they have to assimilate that knowledge, typically by coming to a deep understanding of how thesis statements work through close reading of many of them, and then practicing the writing of thesis statements over time. Some will learn to write more quickly and master writing more thoroughly than others, but the principle holds for everyone.
Making matters more complicated, writing skills are not separable from other skills like reading and critical thinking. A student who is not an able reader and critical thinker is unlikely to improve their writing without also acquiring those skills, and unfortunately K-12 education devotes even less specific attention to those other skills than it does to writing. Indeed, a student's first serious encounter with the terms "close reading," "critical reading" and "critical thinking" is likely to be in that composition classroom. Naturally, having their grade depend on their ability to do these unfamiliar things in which they have not previously been trained (rather than repeat what they have been told on demand) is likely to come as something of a rude shock.
Reason #3: Composition Classes Require Students to Be Intellectuals.
In a composition class a student may be presented with a text such as, for instance, a brief article about the marketing of American food brands in Europe in the 1990s, and told to read it closely and critically, discuss it, write about it.
The situation may seem odd. What does the marketing of food brands have to do with composition?
Strictly speaking, nothing whatsoever. But one must have something on which to exercise, to practice, the reading, thinking, writing skills the class is intended to teach, and an article on the marketing of food brands is not necessarily worse than any other for the purpose--the intellectual activity involved in engaging with the material being the real point of the exercise.
In that stress on reading, thinking, writing for the sake of reading, thinking, writing, students are asked to "be intellectuals"--to take an interest in subjects that may not have immediate practical use to them--for the purposes of mastering the skills a composition class sets out to teach.
This seems reasonable enough to college instructors, but it is something relatively new to many college students. Once again, their prior academic experience required more listening and note-taking, more memorization and recall on homework assignments and tests where there is a clear, correct answer (even when they may have had to give essay answers), than this more exploratory and less obviously utilitarian activity. Additionally, besides being different from their prior experience, having to read, think, write like this can also be experienced as more demanding, and less certain; more is asked of them, while they are less sure of what is wanted. And that, frequently, makes students dislike it the more, especially if they are inclined to see the demands made on them in school as essentially annoyances on the way to getting the diploma that would enable them to get a "good job." The reality that anti-intellectualism is a very significant force in contemporary culture also encourages cynicism toward this kind of course work, to which classes in the arts and humanities (like composition) are particularly vulnerable.
Reason #4: Grading in Composition Classes is Not "Subjective"--But Can Look That Way.
As might be guessed, students' perceptions of how their work is graded in such courses hurt rather than help matters. Students commonly think of English, and especially its writing component, as an area where the grading criteria are subjective, because there is often no single right answer to an essay assignment, and because quantifying the assessment of writing seems such an uneven process.
Yet this is not at all the case. Anyone will concede that such matters as spelling, grammar and the conventions of citation are not subjective. Whether one uses the spelling "sale" or "sail" in a piece of writing is not a matter of personal preference--and the objective element in the assessment of a piece of writing does not stop there. Even aspects of prose style can be judged objectively. For example, there is unlikely to be any doubt about whether a sentence (say, "The door was opened") is written in the passive voice or the active, or about whether the use of the passive rather than the generally preferred active voice was justified by some practical advantage in communication. And of course, contrary to what some may think, even the more "intellectual" aspects of the work can be assessed in a fairly objective way--as with the matter of whether a paper has a thesis at all, or whether that thesis is supported by evidence rather than merely asserted.
In fact, composition can be rather like those famously "objective" subjects, science and math. A student writing an argumentative paper has to consider a possible position about the world, examine the evidence for and against it, and draw a conclusion--just as in the scientific method, where one forms a hypothesis, tests it, and evaluates the result of the test. They have to explain their reasoning to their reader--"showing their work," like a science student writing up a lab report, or a math student demonstrating how they arrived at the solution to a problem. And in organizing what they have to say they are likely to rely on clear-cut and rigorous thought-structures like the three-pronged thesis statement, the five-paragraph essay, or the "comparison and contrast" pattern of development--in a word, formulas.
The fact that composition is skills-centered and places a high stress on rigorous reasoning, rigorously assessed, would by itself be enough to make it an unpopular subject--just as math so frequently is. However, there is an added complication here, namely that while students find the class difficult because it is rigorous in these ways, they do not recognize it as being so.
Despite what students are supposed to be learning in these courses, they are likely to persist in the view that composition, English and the humanities in general are "soft" subjects, in which grades are handed out arbitrarily, or close to it. This is partly because of the manner in which students are assessed, and grades handed out. In a math class the teacher's choice and wording of test questions, or their readiness to award credit for partial work or grade on a curve, are as much a matter of individual judgment as anything an English teacher does in their class. However, the apparently right-or-wrong nature of the answers on the test, and the quantitative scoring, do much to diminish the sense of arbitrariness--a 65 on a math test being (superficially) harder to argue with than a "D" on a term paper.
Reason #5: Composition Classes Require Students to Revise Papers.
It is bad enough to get a grade one does not agree with. It is still worse to get a detailed, critical examination of one's work--and be forced to acknowledge that examination by having to go back and modify the paper in substantive ways.
That may not sound like a big problem to those who don't write, but really, it is. The truth is that even professional writers with lengthy experience of being edited and published often find revision a painful, even wrenching experience (often much more so than producing a new piece of writing to replace the old one entirely), to the point of resisting it when possible. It is therefore not really surprising that first-year college students are unenthusiastic about revision, and likewise resist it. Asked to revise their paper they will limit themselves to the most actionable adjustments the instructor ordered. If told their paper lacks organization, for example--that its thesis is less clear than it might be, that it presents its supporting arguments in no particular order, and that it contains much that is irrelevant to anything they might be saying--they will often ignore all that and just fix the indicated comma splices, thinking of the instructor as their editor, not their grader. (Indeed, they often get angry if fixing the comma splices does not get them the "A" which, they insist, every teacher they ever had gave them.) Meanwhile even those who make an honest, serious effort may still fall short of the grade they hoped for, which makes them still more frustrated with and resentful of the process. And in the end, an instructor is likely to find that no aspect of their interaction with their students is more rancorous than this one.
So there you have it, five major reasons--the mandatory, skills-centered, "intellectual" nature of the class, with its supposedly "subjective" grading and burden of revision--that make students dislike it as much as they do. Having said all this, this would seem to be the point at which the person who has offered such comment is expected, to quote another celebrated '50s-era sociologist, to set about "softening the facts into the optimistic, practical, forward-looking, cordial, brisk view," not least by offering some solutions to the problem they have raised--preferably solutions they can apply all by themselves, without help or support from anyone else, to miraculous result. Alas, not only is that demand reflective of a desire to avoid facts rather than face them, but the expectation of solutions of that type is often unreasonable, and this case is no exception, especially when we are discussing those problems that are most "built into" this kind of class.
Still, that is not to say that we could not be doing better. The most important thing we could do to make such classes more useful would be cultivating a societal attitude more respectful of verbal skills--recognizing that reading, thinking and writing go together, that these are skills to be assimilated rather than injunctions to be memorized, that performance here is not "subjective," that what looks at a glance like "useless intellectualism" can be a valuable training tool--not least as part of better equipping students for this training through an improvement of teaching in grades K-12. Of course, such a change can hardly be produced overnight, even if people who have a real say in these matters care about them, which they do not. (At least in the mainstream of our media and our political life all the pious chatter about "education" is almost invariably about a plethora of other agendas, not actually securing better learning outcomes, even as a means to an end.) However, it does seem to me that clarity on the reality--the instructor understanding what they are asking of students who may not have been properly prepared for all this and may quite understandably be doubtful about the value of the class, and being frank with their students about it--can go some way to bridging the gap between them, and make a better result possible.
Wednesday, September 21, 2022
Tuesday, September 20, 2022
Why Have We Heard So Little About Bond 26?
The shooting of the last Bond film, No Time to Die, wrapped up back in 2019, and the film itself came out in late 2021. That is almost three years since the completion of what was known to be the last film of the Daniel Craig cycle, and a year since that film's release.
One might have expected by this point to hear something about the plans for the next iteration of the franchise--indeed, to have heard much about it (however much of it may have been rumored, tentative, outdated or simply the usual PR drivel). However, just about every story we hear about the project basically tells us that nothing has been decided, and the next Bond film remains far away (a June piece in Variety telling us that no new actor is in the running for the lead of a film whose shooting is two years away, implying a release date, by no means firm, in 2025).
Just why has so little been said--and if we are to believe what is said, decided--after all this time?
There actually seems no shortage of factors. Some two-and-a-half years into the COVID-19 pandemic the box office is recovering, but not by any means recovered, which fact has been plenty to make decision makers in the industry drag their feet on big decisions. Given that amid that chaos No Time to Die was, at best, an acceptable performer rather than a really stellar one--an underwhelming, highly publicized exit for a crop of Bond films that, the official line went, had been rapturously received--one would think them particularly sensitive to that mood. The suggestion that young people in particular were losing interest (seemingly confirmed by how, just the weekend before No Time to Die came out, they flocked to the debut of Venom 2, and then a scarce two months later came out for Spider-Man in record-breaking numbers) must have been particular cause for anxiety. There are also the question marks over the international market that seem likely to remain even after the pandemic fades, like the receptivity of the hugely important China market, while those spinning plots for the series may be ultra-cautious about any Russian element in the story, for that reason and others. (Will even Western audiences embrace a story about a "new Cold War," or be repulsed by it?) Moreover, all of this is happening as interest rates resurge, with all that means for the costs of shooting a big movie on an uncertain schedule.
Still, that far from exhausts the matter, with one factor much on my mind being the franchise's long-time survival strategy, to which the current environment has not been conducive. After the '60s, during which the Bond films set the trend in pop culture in numerous ways (pioneering the high-concept movie generally and the action-adventure film specifically, feeding "spymania," etc.), the makers of the Bond films kept up public interest by shamelessly seizing on whatever trends came along.
In the 1970s there always seemed to be something out there they could use (blaxploitation, kung fu, underwater and outer space adventure, etc.), and on the whole it worked. However, like so much else this practice offered diminishing returns on effort, as the trend-chasing of the '80s in particular showed--the attempts to follow in the footsteps of contemporaneous Hollywood action movies yielding particularly weak grosses at that commercial low point for the series (epitomized by the response to Licence to Kill). The '90s, a decade pop-culturally more devoted to recycling the '60s and '70s than to any new ideas, left even the Bond films' trend-chasing looking like a recycling of earlier efforts (the Asian/martial arts elements of Tomorrow Never Dies, like that film's essential bait-and-bleed plot, a reworking of material from the '70s-era Bond films, in that case The Man with the Golden Gun).
Moreover, the twenty-first century has been less fertile still. In its early years it offered the approach for which Batman Begins is the model--and the makers of the Daniel Craig films made full use of it, taking what Ian Fleming himself called a "cardboard booby" with utmost seriousness--frankly, pretentiousness--in concocting an origin-story-telling prequel, more "grounded" in its action and darker and more downbeat in tone, with said tone the more conspicuous as running times got far longer. Some didn't care for this (myself included), but the claqueurs who review movies at the least put on a good show of being enthusiastic, and whether because of the turn or in spite of it, the resulting movies did sell a lot of tickets.
Now, looking at the latest iteration of the Batman franchise in this very year's The Batman, it seems we still have . . . seriousness and pretentiousness in the making of a dark, downbeat, origin-story-telling prequel about a "cardboard booby" that goes on and on and on for three hours. Two decades later, we are still in more or less the same place with regard to the whole approach to storytelling, while even the more superficial elements storytellers can use to stoke up wide audience interest have not changed much. Superheroes and zombies were the two big ideas circa 2002, and so they remain in 2022--and I don't think the makers of the next Bond film feel quite desperate enough to make Bond wear a cape, or have him fight zombies (yet). The result is that any reboot of the series made with the prevailing trends in mind is likely to look a lot like the last reboot of the series, the same thing we have had for two decades. And what all that means for those trying to repackage Bond as interesting and contemporary and relevant should not be underestimated by those wondering about the protraction of the process. Putting it bluntly, this time it isn't just the makers of the Bond movies who are out of ideas, it's everyone else, too, leaving them with nothing to "borrow" for their purposes, and that much more reason to go on dithering for the time being.
Monday, September 19, 2022
The Institution of the Claqueur
Those who read Honore de Balzac's Lost Illusions might be surprised to encounter in it the employment of the claqueur--the professional clapper, hired to applaud. Having invested heavily in a play, a producer made the logical business decision to buy some insurance for its success--not trusting the material to win a standing ovation from the audience, but planting among it people whose uproarious applause would hopefully prove infectious, get the rest clapping, and convince them that it was after all a great play, one they should see again and tell everyone else to see also--and these were the people who did the job.
As it went in theaters, so it went elsewhere in the media of the time, as with publishing, Balzac making it clear that in the "theatre of literature," in which the public "sees unexpected or well-deserved success, and applauds," the audience and the world at large are oblivious to the setting of the stage for that success--which includes its own "claqueurs hired to applaud."
In this as in so much else the writers of that earlier day, the great Balzac by no means least among them, were infinitely more honest than those of our own, who never breathe a word about the seamier side of their business even as it takes "claquing" to heights scarcely dreamed of by the sleazy publishers we see in Balzac's Human Comedy, with places on the bestseller list all but up for sale.
Indeed, the importance of the claqueur in our time is underlined by what has happened in book publishing since the technological rush of the late '00s democratized its most fundamental element, namely the mechanism for the commercial production and distribution of a work of fiction. Thanks to services like Amazon's, anyone could take a digital copy of their manuscript and, at no cost and within days, convert it into a print book and e-book available at retail outlets all around the world. And a great many people did just that.
But the idea that this kind of self-publishing would revolutionize the publishing business never really panned out, because even if the apparatus for physically producing copies of a book, and distributing them, was opened to all, production and distribution are only part of what a publisher provides an author in practice.
There is also the matter of getting the reader to look at what is now offered to them.
But was that not also democratized? What about the Internet? Book bloggers? Social media?
Alas, anyone who thought that could compete with the budgets that buy ad space in the New York Times and commercials on TV and fund book tours, and land reviews and interviews in high-profile publications and other forums, learned otherwise. (And of course the going got rougher, not easier, as, in the name of defending the public from "fake news," Big Tech favored "authoritative sources" at the expense of the less authoritative, and suppressed efforts at self-promotion for anyone who did not buy an ad, to the disadvantage of the self-published writer with zero dollars to spare on such.)
The more astute learned, too, about what else the publishers can command that the self-published generally cannot--respectful consideration from opinion-makers. Those with a big publisher behind them, even if they were not a Somebody before, are a Somebody now by virtue of having the patronage of one of the Big Five, and their book is treated as a book by a Somebody--a far different thing from a book by a Nobody, which, regardless of what it may contain, gets treated as a book by a Nobody. Innumerable stupid and easily manipulated prejudices play their part--but along with them so do the quid pro quos that, even when the critic has not personally received a payoff in "quid" form, reduce the critic to claqueur. After all, Jay Sherman may have been a knowledgeable and tough-minded critic of the cinema--but he had to answer to network chief Duke Phillips, who regarded it as a critic's duty to "rate movies from good to excellent," and, you may also remember, squeezed out of him an obscenely overgenerous two-and-a-half-star rating for the atrocity against film that was his remake of The Dirty Dozen. (By contrast a self-published book worthy of five stars would be very lucky indeed to get the two-and-a-half.)
In short, even as the means of publication were opened to vastly more people than had ever enjoyed access to them in history, the means of publicity were ever more the purview of "the big battalions," who made the most of their advantage--which was quite enough to head off any prospect of self-published writers elbowing the Establishment folks aside to become the new rulers of the domain.
Friday, September 16, 2022
What is COVID-19's Legacy for the Film Business?
Two-and-a-half years ago the COVID-19 pandemic dealt the film industry a truly unprecedented blow, with the (correct) anticipation of collapsing business prompting a rush of delays of major movie releases--starting, as the readers of this blog may remember, with the Bond film No Time to Die.
Of course, big movies were starting to come out again, and a portion of the public was still willing to go see them, but the pandemic had its effect nonetheless--with the result that those movies consistently made far less than they would have been expected to take in during "normal" times. Thus did Pixar's animated epic Onward, after seeing its earnings cut off at a mere $62 million, still end up the seventh highest-grossing movie of the year, while the sequel to 2017's $400 million hit Wonder Woman pulled in a mere $47 million, and the big Christopher Nolan thriller Tenet just $59 million. The following year Black Widow and Fast and Furious 9 failed to break $200 million--a barrier no film broke until September's Shang-Chi and the Legend of the Ten Rings. And the aggregate impact was colossal. Where in 2015-2019 the movie industry averaged $1 billion in ticket sales per month, it averaged under $100 million a month for the whole year after the pandemic's outbreak, with things only slowly picking up, the May-August period usually good for $5 billion+ pulling in a mere $1.6 billion in 2021, all as the inflation beginning to bite cut into the real value of that money.
Of course, things got better in the fall, particularly the late fall. If Venom: Let There Be Carnage did not quite realize the promise of its $90 million opening weekend, the fact that it had that weekend (and followed Shang-Chi through the $200 million barrier) was still something, while if the debut of No Time to Die was a bit of a disappointment, many were prepared to chalk it up to other problems (the confusion caused by the film's multiple delays, the lukewarm critical response, and frankly the fact that younger moviegoers seemed to be just plain losing interest in old 007). Still, it was undeniable that those who were upbeat about the situation were "grading on a curve" until December's Spider-Man: No Way Home pulled in an indisputably sensational $800 million (quadruple Shang-Chi's take). Luck can seem to have played its part here--specifically the movie's narrowly beating the resurgence of the pandemic (the seven-day average of new cases surging from about 125,000 on the day of release to an extraordinary new peak of 800,000 a day a month later). Still, it showed that a big movie could pull in a gross as big as what was seen pre-pandemic, and afterward the release schedule, and the grosses, increasingly normalized, with the earnings for summer 2022 about two-thirds the pre-pandemic average (which is to say, twice as good as what we saw the preceding summer), while at least some of the shortfall was due to there being fewer big movies out than usual (e.g. just four big live-action action movies, as compared with the usual eight-plus). Indeed, one may reasonably expect that, barring some really unpleasant surprise, we will see a return to the pre-pandemic trend (the more so as 2023 is looking really packed with would-be blockbusters).
All the same, amid perhaps the leanest times in the industry's history there were real questions about the fate of a film business already long seen as under duress, reflected in its playing-it-ever-safer strategy of rigid production of stereotyped "tentpoles" from the same franchises over and over and over again. In particular there was the question of whether the studios would shift away from their longtime model of a theatrical release followed by the release of a movie on video at some unspecified later date--usually many months later--in favor of other arrangements. Above all there was the alternative, long discussed but until then never attempted with a really major production, of "simultaneous" theatrical and home video release (the latter at a premium price, of course), a format with which they experimented on Mulan, Black Widow and other films. However, it seems the revenues reaped from people willing to pay $30 to stream those movies at home opening weekend came nowhere near matching the ticket sales from a robust theatrical opening--and certainly in its hosannas over the success of Top Gun 2 the entertainment press has rushed to acclaim Paramount's assiduously sticking with the old model as a key reason for that success. I think there is room for argument over whether it was so important--but that they are so insistent on it is likely reflective of Hollywood's thinking about the matter, just like Warner Bros' decisions with some of its recent superhero films, specifically its burying Batgirl while (because there was still time in which to do it, I suppose) upgrading Blue Beetle, making what had earlier been a smaller streaming project worthy of the theatrical releases on which the studio will now be focusing.
In short, the old model endures--while if anything the disappointment in same-day streaming and the like (part of a bigger disenchantment with making big-budget movies for streaming, evident in Netflix's own backing away from the practice) allows one to speak of that model's having been reaffirmed, with any further movement in such a direction likely put off for quite some time to come. At the same time, as the decision to bury Batgirl as too expensive for a streaming release but too "small" to compete in the theaters suggests, the increasing emphasis of Hollywood on big-screen, big-budgeted, spectacle-oriented high-concept action-adventure has also been reaffirmed. The result is that the pandemic's legacy (thus far, at least) has not been to change what Hollywood has been doing, but rather to reinforce it--a testimony to the strength of the structural changes in the film business ongoing since the chaos of the studio system's collapse and the advent of television, and to the studios' recognizing in George Lucas' Star Wars the basis of the strategy with which they have endeavored to stay profitable ever since.
Friday, September 9, 2022
Reflections on Balzac's Standing in the English-Speaking World
All of us have gaps in our knowledge, and the contents of Balzac's Human Comedy were one of mine until relatively late. It has seemed to me that this was partly because we hear so little of him in the English-speaking world--and indeed it is probably telling that it was not a writer in English who directed me to Balzac as worthwhile, but the French economist Thomas Piketty, who in Capital in the Twenty-First Century usefully drew on examples of nineteenth-century literature to support his arguments about capital, income and wealth.
In considering that slighting of Balzac in the English-speaking world, Balzac's utterly unflinching vision of the stupidity, vulgarity and brutality of the money-ruled modern world, already clearly emergent in his own time, would seem to be of great importance. After all, brutal as Balzac's vision of life can be, it is neither the cheap, mindless misanthropy of the confirmed nihilist, nor the even cheaper "look at how cool I am" edgelordism that is equally celebrated by Midcult-gobbling middlebrow idiots incapable of telling the difference between one and the other, but the genius and courage to see, acknowledge and convey what others miss, ignore, or cut out of the picture.
At the same time in the "Anglosphere" we seem to see a greater insistence than in other places on having Establishment poets in the pantheon and no others--and Balzac, even if pure Establishment in his personal politics (not merely Royalist, but Bourbon-supporting Legitimist!) was as an artist far too much the truth-teller to ever be considered that (with many of his particular truths, perhaps, especially unwelcome in a country where, as James Kenneth Galbraith remarked not too many years ago, the word "market" cannot be used in public without the speaker "bending a knee and making the sign of the cross").
Indeed, even in Balzac's own native land the makers of a recent film version of Lost Illusions could not stomach their own chosen material. While for the most part praising that adaptation as faithful to the original (far more than he was able to credit the adaptation of that other Balzac masterpiece, Cousin Bette, with being), David Walsh remarked in his review that the director, by his own admission, "found some of Balzac's writing 'harsh and punitive,'" and that the final film "'softened' the novel's attitude toward certain figures," not least Madame de Bargeton, "and added a more hopeful conclusion."
Of this Walsh remarks that "[i]t's possible that something has been gained in the process, and perhaps something has been lost"--with the latter striking me as the more likely, and alas, also testimony to what we have been losing for so many decades in terms of the readiness to face certain truths, in culture as in life.
Hollywood Keeps Becoming Ever More a Parody of Itself
Looking at the schedule of Hollywood releases for 2023 I was struck by just how packed it is with sequels, reboots and the rest--perhaps even more so than in recent years. Alongside the summer blockbuster-type action movies (at least ten live-action Marvel and DC superhero films,* plus new editions of Transformers, Indiana Jones, Star Wars, Star Trek, Ghostbusters, Mission: Impossible, etc., etc.), even the movies that are not of that type are do-overs, or at least trade on old names and ideas, with the horror movie genre exemplary. In 2023, it seems, we will have more Scream, more Evil Dead, more Insidious, more The Nun, more Saw, more Exorcist (not to be confused with The Pope's Exorcist, also due out in 2023), while even old Count Dracula will be coming back to the screen in the form of Nicolas Cage (in Renfield). This even seems to be the case with the minute share of theatrically-run filmmaking that now passes for "drama," with Creed III on the way.
Of course, point all this out and a certain kind of suck-up Establishment critic will claim that you've got it all wrong, that the theaters are chock full of original and highly diverse content--so much of it that they don't need to give you any examples, they are all around you--and in the process remind you of the adage that you can't win an argument with an idiot. If they prove themselves other than idiots (or at least, other than complete idiots), they will take a different tack and say that Hollywood is merely giving people what they want--that, yes, it is as you say, but very few people are ready to go to the theater for anything else.
I think that the "giving people what they want" claim has its limits--the consumer will not get what business does not offer, and in this case we are talking about a high-capital industry in a financial milieu where it can make more sense to bury a nearly complete $100 million film than to release it, because the tax write-off is more attractive than the revenue. Still, it is only fair to admit that there is some truth to audience behavior having an effect on what the studios are ready to invest in--insofar as the function of the theater has changed since the era in which a motion picture could be seen only in the movie house. In an age of small screens everywhere, affording greater convenience and control and far lower cost than the trip to the theater, only big-screen spectacle of a kind for which the bar has only gone on rising, and the sense of the release being an event, will get people to make that trip--for which nothing else substitutes financially as a revenue stream for Hollywood. (Yes, video, TV and merchandising are important, but the studio's cut of the $15 or $20 ticket is critical, and selling a lot of those tickets is what paves the way for big money from all the rest.)
All the same, it is a reminder that in Hollywood, just like everywhere else in the business world, there is far more (insufferably pompous) talk about INNOVATION! than there is the actual thing.
* Yes, ten, count them, ten live-action DC and Marvel superhero movies. The live-action Marvel films are: Kraven the Hunter (January); Ant-Man and the Wasp: Quantumania (February); Guardians of the Galaxy Vol. 3 (May); The Marvels (July); Madame Web (October); and Blade (November). The DC films are Shazam! Fury of the Gods (March); The Flash (June); Blue Beetle (August); and Aquaman and the Lost Kingdom (December). The reader should also note that there will be an animated Spider-Man: Into the Spider-Verse sequel, Spider-Man: Across the Spider-Verse (June), while from outside the Marvel-DC domain there will also be an animated Teenage Mutant Ninja Turtles film, Teenage Mutant Ninja Turtles: Mutant Mayhem (August).
Thursday, September 8, 2022
The Summer Box Office 2022: A Commercial Consideration
Labor Day has come and gone, marking the end of summer, and making it possible to appraise the season in full.
Doing so, it looks like there have been no big surprises since the last post on the subject--the box office is recovering, but not all the way back to normal, with a particularly weak August underlining the fact. In 2022 dollars the average August take in 2015-2019 was a little over $1 billion. By contrast, the August of 2022 took in less than half that much ($467 million). Added to the preceding months this made for a take of about $3.4 billion for the May-August period--as against $5 billion or so in a typical year, after one adjusts the old figures for inflation.
In short, box office revenue remained at least a third down from the usual for the four-month period. (I emphasize "at least" because in so many of those prior years a mega-release in very late April took in hundreds of millions that would ordinarily have been counted as part of May's take--not a factor in 2022.) However, it seems worth reiterating that this seems a reflection of the year's weaker than usual slate--not least in the season's having had a mere four big live-action action-adventure blockbusters rather than the usual eight or more, three of which, because of their various disadvantages, turned out to be lesser performers than some might have hoped. Two of those four, belonging to the Marvel Cinematic Universe, bespoke the decadence of a franchise already heavily mined for more than a decade. (Thus was it with the self-referential Dr. Strange 2, the plot of which may have been more bound up with a small-screen annex of the saga than was a good bet for what had to be a general audience entertainment, while, as with so many worn-out series, Thor proved self-parodying enough to likewise leave many less than pleased.) Meanwhile the third installment in the second Jurassic Park trilogy ended that once-mighty saga with a whimper rather than a bang.
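For those who want to check the arithmetic, here is a minimal sketch in Python using the rounded figures cited above--illustrative inputs only, since the precise totals vary a little by source:

# Back-of-envelope check of the season's shortfall, using the rounded
# figures cited in this post (all in 2022 dollars).
aug_norm = 1.0e9      # average August take, 2015-2019, inflation-adjusted
aug_2022 = 467e6      # August 2022 take
summer_norm = 5.0e9   # typical May-August take, inflation-adjusted
summer_2022 = 3.4e9   # May-August 2022 take

print(f"August 2022 down {1 - aug_2022 / aug_norm:.0%} from the 2015-2019 norm")    # ~53%
print(f"Summer 2022 down {1 - summer_2022 / summer_norm:.0%} from a typical year")  # ~32%

That 32 percent is the "about a third" of the shortfall--"at least" a third once one allows for the late-April mega-releases that padded May's take in those earlier years.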
The small number of films meant not only that there was less of what people expect to see in the summer but that the roll-out had run its course by early July with Thor 4's debut--whereas in other years it continues down into late July and often early August (one reason why August 2022 was so lackluster compared with its predecessors, not least the August of 2016, which had such a big boost from the early August launch of Suicide Squad). Of course, those films could have benefited from the lack of competition by individually getting that much more custom than they would have in a more crowded summer--but this likely only really happened for one of them, Top Gun 2, to go by the extraordinary legs that added so powerfully to its take (the movie made $42 million in its third month of theatrical play, counting toward a mighty $700 million haul by Labor Day--almost four and a half times what it took in over its holiday weekend debut, an extreme rarity these days). Again, that movie proved that single films can still be hits on as grand a scale as in the pre-pandemic period, but all the same, there is that gap in the earnings--plausibly because Hollywood's release schedule, after all the chaos to which production and distribution have been subject, and all the disappointments of 2020 and 2021, has not yet caught up with the public's readiness to go to the movies (quite understandably, I think).
Still, it seems plausible that the summer of 2023, which looks like it has the first really "normal" summer release slate since 2019 (Guardians of the Galaxy 3, Fast and Furious 10, Transformers 6, Indiana Jones 5, Mission: Impossible 7, Captain Marvel 2, Meg 2--plus new DC Comics-verse movies about the Blue Beetle and the Flash, not counting animated superhero epics that include a Spider-Man: Into the Spider-Verse sequel and yet another big-screen Teenage Mutant Ninja Turtles cartoon, and, in the opposite pattern, a live-action version of the animated classic The Little Mermaid, etc., etc., etc.), could, in the absence of further disasters, bring in the kind of overall gross to which the studios and theater chains have grown accustomed.
And of course, coming on top of the other virtually indistinguishable films squeezed into the preceding four months (more Marvel films in Kraven the Hunter and Ant-Man 3, another DC film in Shazam 2, John Wick 4, yet another attempt to make a hit movie out of Disney's Haunted Mansion ride, another Super Mario Brothers movie three decades after the virtually forgotten first attempt), and the four months to follow (from Marvel, Madame Web, a reboot of Blade and a sequel to Aquaman; a Willy Wonka prequel, a sequel to the recent Ghostbusters sequel, a fourth installment in the rebooted Star Trek series, a Star Wars film from Taika Waititi, etc., etc., etc.), it will give those who are weary of the torrent of sequels and prequels, remakes and spin-offs abundant opportunity for, as Frank Costanza might call it, "the airing of grievances."
Those so inclined of course have my deepest sympathies--and may feel entirely free to do such airing in the comment thread.
Wednesday, September 7, 2022
The Schoolteacher as Failure: A Perennial Pop Cultural Theme
With a new school year upon us it seems natural that the talk is turning again to the matter of the schools--with that talk lent an additional edge by widely circulated reports of a teacher shortage approaching crisis levels, and of a mass exodus from the teaching profession looming as a possibility in the near term, or even already in its beginnings.
Whatever one makes of the actual severity of the present situation, and the likelihood of its escalating into a collapse of the educational system, the fact remains that it has meant some renewal of attention to teachers' discontents.
Not the least of them is how little respect teachers, especially primary and secondary school teachers, get in the United States.
There is a lot to talk about here, but one aspect of the matter that has caught my eye as meriting comment while getting little attention is how pop culture tends to portray teachers as having failed in their lives somehow. Breaking Bad's Walter White is an excellent example--the fact that he is teaching high school chemistry in itself indicative of how far the onetime Sandia Labs scientist and almost-billionaire has fallen in the world, with his having to wash his students' cars to supplement his meager teaching income compounding the indignity.
Others were making the point long before that--and more explicitly--with Eugene Burdick and Harvey Wheeler's classic thriller Fail-Safe worth citing. There the political scientist Groteschele, who is haunted by the memory of his onetime surgeon father's having been forced to make his living as a butcher after his immigration to the United States, in a moment in which he has become fearful for his career, speculates about the "academic equivalent of his father's butcher-shop job" and concludes that it would be "grade-school teacher to a bunch of idiot children."
One even sees the attitude presented in rather lighter fare, a notable example being the Dan Schneider sitcom Victorious episode "Tori Tortures Teacher," in which former actor and high school drama teacher Erwin Sikowitz's students take him to a play for his tenth anniversary as a teacher--and the play turns out to be about a man miserable in his life because, instead of realizing his dreams, he has been a high school teacher for ten years, right after which Sikowitz displays fairly extreme signs of distress.
Unlike White, and (probably) even more unlike Groteschele had worse come to worst, Sikowitz tells a now anxious and guilt-ridden Tori (truthfully) that he is very happy being a teacher, that he actually loved the play and that his distress was actually caused by something else (his girlfriend breaking up with him by text during the play, which, as it is Sikowitz, left him saddened for reasons other than what would normally be expected). But all the same the idea that a teacher is "just" a teacher is there, and even in this age of supposed hypersensitivity to every social concern and alleged hand-wringing over every offense to some group or other it seems likely, and telling, that no one thought anything of putting it into a show (at least ostensibly) made for children who will be dealing with teachers every day for many years to come.
Of course, if all this depicts disrespect, do I think the writers were being disrespectful? Quite frankly, no. They were simply, and I think realistically, acknowledging the disrespect society at large feels toward the profession--which, in its various manifestations (including the very material manifestation of low pay compared with other jobs in which people put similar education to use), makes it a "last resort" for far more people than acknowledge the fact, in a culture which deflects any talk of the hard facts of work--its conditions, and the monetary compensation without which people cannot live--with smarmy "Do it for the children!"-type talk of idealism and service that no one would ever dare to pull on medical doctors.
Tuesday, September 6, 2022
Ray Kurzweil's Predictions About Education
A generation on, Ray Kurzweil's The Age of Spiritual Machines still stands as perhaps the most influential single book to come out of the '90s-era boom in transhumanist, posthumanist and Singularitarian thought. This is partly a function of its author's having had a particularly high mainstream profile compared with his peers (it is one of the ironies of life as a public intellectual that only those who are already receiving attention have their remarks noticed, never mind treated with any respect--the media being just like the man looking for his lost car keys under the streetlight), but also a testament to the book's relative breadth and comprehensiveness, which included explicit forecasts for 2009, 2019 and later dates that were much talked about at the time, and remembered as sufficiently interesting to be revisited since.
In considering those forecasts one must acknowledge that the apparent thought put into them varied greatly. Kurzweil's predictions regarding concrete technological developments only a short way removed from his field of specialty in computing tended to be solidly grounded, even when they were not correct--Kurzweil premising them on neural nets enabling increasingly high-quality pattern recognition--such that they still have some interest years later. (The apparent explosion of progress in artificial intelligence in the '10s, after all, was largely a function of this--a belated consequence of researchers getting faster computers and easy access to vast content via broadband Internet, but still a validation of his case for the potential of neural nets to achieve the ends he described.) He was less impressive when making predictions in the social realm, where he seems less expert and frankly rather conventional, such that this side of his work is more interesting as a relic of the '90s than anything else. (Thus does it go with his forecasts regarding the international political system and war, Kurzweil having rather complacently predicted interstate conflict finished in a world of enduring U.S. hegemony, with armed conflict reduced to computer security and drone wars against terrorist groups in the next decade, and only continuing to wither after that.)
Other predictions fell in between those two extremes, like Kurzweil's discussion of education in that book, which, given the issue's higher than usual profile right now, seems more than usually timely.
Revisiting the predictions in Spiritual Machines I find that, in line with his other anticipations, Kurzweil predicted that by 2009 electronic displays--generally of the tablet type--would be well on their way to supplanting paper documents of all kinds, and speech would be the primary mode of text creation; software would increasingly handle the actual teaching (imparting information, explaining what has not been understood, testing and reinforcing learning); and human instructors would "attend primarily to issues of motivation, psychological well-being, and socialization." Ten years on, in 2019, direct-eye displays would have replaced the tablets in a situation where paper would have become all but residual, while human teachers would be "monitors and counselors" instead of "sources of learning and knowledge."
Of course, as in so many other ways, our actual 2019 proved to be far behind Kurzweil's 2009. The tablet was not successfully commercialized until years after that--and, as a schoolroom standard, proved a flop anyway, the cell phone remaining the student's standard computational device, which has been all the more reason why paper has remained so important. Meanwhile, the incorporation of educational software has yet to go anywhere near so far, especially at the higher levels of instruction, in part because of its limited capacity for really individualized response. As a result the primary job of the instructor is not acting as a monitor or counselor, but still actually teaching.
Indeed, underlining the distance between Kurzweil's predictions and our reality has been his treatment of the matter of remote learning, which he treated as increasingly the norm over this time frame. Instead in early 2020 we all discovered how little had been done in this area when, in the face of the COVID-19 pandemic, much of the world switched its school systems over to this mode--with results that have been undeniably awkward and deeply controversial, and a large body of opinion demanding a "return to normal" regardless of the dangers possibly entailed for children, school staff and society at large.
One aspect of what one might flatteringly call the "discourse," not often foregrounded, was the situation's bearing out the more cynical view of what schools do--that rather than impart education, they keep children from getting into trouble while their parents are at work. If the schools don't do that job someone else has to sit and watch the kids, a task apt, in an age of atomized nuclear families and single-parent households, to fall on mom or dad--with all that means for an adult work force that, to the displeasure of employers, switched over to remote work to a significant degree and has been resistant to the move back to the old norm bosses so clearly want to make happen. The result is that even were remote learning as good as, or even better than, the old way in its purely academic results, it would still fail to be an adequate "babysitter"--and it can seem all too telling that Kurzweil failed to consider that.
Thursday, September 1, 2022
The World of Publishing in Balzac's Lost Illusions
Not long ago I had occasion to discuss Jack London's classic novel Martin Eden here, and in particular its brilliant conveyance of what it is actually like to try and make a living as a writer. I could not help but think of that aspect of the book as I read my way through Honore de Balzac's classic three-decker novel Lost Illusions (1837-1843), one of the narrative threads of which is Lucien Chardon's attempt to make a career for himself as a poet and novelist in the Bourbon Restoration era.
Of course, important differences exist between the works. Where London captured the experience of being a writer, Balzac, if affording some memorable bits regarding that end of things (not least the reaction of a callow young author to the discovery that publishers claim to have read work they have not even looked at), offered a broader and fuller vision of that business, of which Martin was never in a position to see very much. In this it helps that Lucien is, in the relevant portion of the story, physically present at the heart of the French publishing world in Paris, dealing with its leading figures face to face on many an occasion, so that, unlike Martin, Lucien never need wonder whether those to whom he would submit his manuscripts are actual people rather than some form-rejection-letter-generating piece of machinery. The result is an unequaled treatment of "the realities of the craft, the practical difficulties of the trade," as effective as it is because of its attentiveness to the "social machinery at work" "behind the scenes in the theatre of literature," which the newcomer to this world has to "learn to see by bumping against the wheels and bruising [himself] against the shafts, and chains" of that machinery.
As Lucien learns from the publishers' own mouths publishers are not in the business for "amusement," and certainly no "stepping-stone to future fame" for aspiring authors like Lucien, but "speculators in literature" who are out "to make money"--"the monument reared with your life-blood . . . simply a good or a bad speculation for a publisher" that is only "so much capital to risk." And due to the necessities of "bring[ing] out a name and . . . induc[ing] the public to take up an author and his book" the mediocrity, or worse than mediocrity, who nonetheless has a "reputation" that can be "bought ready-made," however expensively, is a better risk than even an obviously brilliant unknown asking for a pittance, "[t]he manuscripts for which [such] give a hundred thousand francs pay[ing] better than work by an unknown author who asks six hundred." (After all, as increasingly becomes apparent to Lucien and those who follow him on his journey, the broad public sees "success," but not what alone permits it to happen--"the preparations, ugly as they always are," like "the claqueurs hired to applaud"--the good review, the favorable "buzz," all up for sale, and their purchase an ugly necessity.)
Even early on Lucien can only ask one of these men "how could a man publish his first book at all?" when that is the case--to which the man lecturing him sneers that "That is not my affair." The answer is that one has to depend on chance, which has the most depressing of implications for those in whose lives "chance plays no part," which is no random thing, for "there are no chances except for men with a very wide circle of acquaintance; chances of success of every kind increase with the number of your connections; and, therefore, in this sense also the chances are in favor of the big battalions"--those who know the rich and powerful and highly placed because they are themselves rich and powerful and highly placed; those who already have "distinguished names" who can make names for themselves, those who already have money who can hope to make (more) money in this line; so that the author who is not of the elite, who hopes to make a name and a living as an author on the basis of only their talent and work--and certainly to get rich and famous by being an author rather than becoming an author by being rich and famous--faces a pretty bleak prospect.
Moreover, the intelligent reader cannot take comfort in the thought that the system will at least permit "cream to rise" if it can get past these (gigantic, for most virtually invincible) barriers, for the author "of talent rises above the level of ordinary heads," and only comes to be appreciated with time--too long for the speculators, "dealers in printed paper" who "do not care to take real literature, books that call for the high praise that comes slowly," and "would sooner take the rubbish that goes off in a fortnight than a masterpiece which requires time to sell." (Indeed, lest one imagine that the unsalability of poetry is a latter-day phenomenon, young Lucien personally hears the fact declared by one of those publishers himself: anyone who comes into his establishment saying they bring him a work of poetry is to be "show[n] . . . the door at once," for "Verses mean reverses in the booktrade.")
If Balzac could seem hard on publishers, he is just as hard on authors--"the most brutal bookseller in the trade . . . not so insolent, so hard-hearted to a newcomer as the celebrity of the day," terrified as such celebrities are of a "possible rival"; no, no help is likely to come from this direction either. Nor, even, can one hope for much solace from one's nearest and dearest, for just as "the world at large declines to believe in any man's superior intellect until he has achieved some signal success," it is the case that "[i]n the home circle, as in the world without, success is a necessity"--family, by whom "[a] man with a career before him is never understood," being indeed second to none in the severity of their judgment on a lack of "success," looking at the relation who has not "made it" not merely as someone who perhaps did not get their chance, but as one who has been weighed and measured and found wanting, the world right in its assessment and their dear one wrong, a failure who deserved to be so. Amid all that it is far less likely that the caterpillar will become a butterfly, so to speak, than that the "life is crushed out of the grubs before they reach the butterfly stage"--their efforts wasted, their potentials unrealized, and very likely their souls damaged or even destroyed, even if they do happen to make it. ("Perhaps it is impossible to attain to success until the heart is seared and callous in every most sensitive spot," Lucien realizes in a moment of clarity.) Indeed, Lucien's aspirations to a career as a poet are crushed out so forcefully so early on that he opts to make his career as part of the "machinery" instead. ("Unconsciously he gave up the idea of winning fame in literature, for it seemed easier to gain success" as a hack journalist--and alas, he discovered the way here, too, so treacherous that he ended up sitting before his sister defeated, setting the stage for even worse to come. Much, much worse.)
Of course, it is a truism today that publishing is a business--but most who say this do so in brushing off some other person's painful experience, excusing the brutality with which they have been treated and so participating in it, worsening it, and all that usually without understanding what it means for publishing to actually be a business. By contrast it would be a very stupid reader indeed who came away from having read all of Lost Illusions without "getting it" (that publishing is a capitalist enterprise to publishers and only that, regardless of what they, their spokespersons and their sycophants in the media say; that publishers have no interest in cultivating new authors; that for an "ordinary" person however talented the way in is nearly impossible, critical acclaim another commodity to be purchased, and not even their loved ones likely to support them in what is almost certain to be a futile struggle, etc., etc., etc.). For the book gives us so much truth so plainly and forcefully that if those who ordinarily give the public their information about such things ever cracked open a book rather than just talking about them they would denounce it in a rage exceeded only by their stupidity, just as they denounce anyone who would speak any of this book's truths in public today, the same way the conventional denounce anyone who challenges the propaganda for the world as one great big meritocracy where talent and hard work pay off and those who have not done well have no one to blame but themselves. Which, I suppose, makes it not such a bad thing that the self-important but actually ever less important opinion-makers of such type never do crack open a book, let alone any book that is really worth reading.
Keir Starmer, Centrist
Robin McAlpine recently penned an interesting piece on the politics of current United Kingdom Labour Party leader Keir Starmer.
Not so long ago I addressed those politics on the basis of two of Starmer's major statements to the public (his February 2020 leadership contest pledges, and his February 2021 "New Chapter for Britain" speech). Here McAlpine offers a broader assessment with the benefit of greater hindsight (if, alas, not too much in the way of specifics).
McAlpine writes that Starmer, who has apparently endeavored to make himself known as "Mr. Rules," "views society as a transaction undertaken between an elite ruling class and a vast class of plebs and the job of a politician as to keep well out of that relationship except in the case of emergency," with that defined not as "starving children," but rather "enough starving children to risk the stability of the system in a way which might cause a breakdown in the relationship between the elite and the plebs," which is to say that "[h]e is there to protect the powerful, which means once in a while they might need to be protected from themselves, briefly."
This transactional view--with the politician as a rule-minded mediator among contending interests, whose highest priority must be maintaining stability via minimalistic adjustments that, at their most feather-ruffling, may occasionally entail compromises the elite may not love, rather than any vision for the country (of making it more just, etc.)--is the very essence of "civil," "pluralist," centrist politics--which, of course, are fundamentally and deeply conservative in the political-philosophy sense of the term, whatever those who get their information about what words like "conservative" mean from cable news may think.
McAlpine's rather concise summation of that centrism (even if, admittedly, he never uses the word) is, appropriately, rounded out by what centrism has come to in this era, with Britain since Tony Blair no exception, namely a "supply-side" mentality in which said politician "should be facilitating the elite as the way to get 'the best deal' for the plebs" (or at least, that's what they tell everyone), and one might add, his observation that commentators for a publication like the Guardian demand exactly that when they use punditocrat weasel words like "credibility."
Others, however--McAlpine among them--find this kind of credibility not very credible at all, especially in these crisis-ridden times in which centrism, for all its highly touted pragmatism and knack for forging compromises, looks more rigidly ideological than ever before.
Wednesday, August 31, 2022
Why It's So Hard to Make Sense of the Reports of a Mass Exodus From the Teaching Profession
As a new school year begins the education system is again in the headlines, with much talk of fed-up teachers leaving the profession on such a scale as to warrant the term "crisis"--and the press doing its usual lousy job of explaining the topic. Various pieces have given us various numbers to support their assessment of the situation, but, per usual, little or none of the context that would clarify their meaning. We are told again and again that X number of teachers quit, that there are so many thousand vacant positions in this or that state. But we can only really know whether that signals a crisis if we have something to compare those numbers with--numbers that would tell us what the situation is like in "normal" times.
Indeed, I had to spend a lot of time trudging through a lot of the coverage before I came across Derek Thompson's piece in The Atlantic baldly asserting that "[c]omprehensive national data on teacher-turnover rates (the share of teachers who quit each year) . . . are simply not available, or don't go back far enough to tell us whether this year is different."
In other words, Thompson reports that the statistics that would give us the standard just don't exist because no one has bothered to collect the data, or at least, compute the relevant information.
However, I think the key word there is "comprehensive." While a rigorously constructed time series for the country as a whole would be ideal, I have found it easy to locate patchier data--covering particular years that may or may not be representative, and particular localities--which suggest a rate of perhaps 8 percent a year. As there are some 3.5 million primary and secondary school teachers in the U.S., that would work out to something in the range of 250,000-300,000 teachers leaving the profession in a normal year. This pattern would seem confirmed by a Bureau of Labor Statistics projection that some 270,000 primary and secondary school teachers would leave the profession annually in 2016-2026.
If we go by those estimates then one would expect the standard for a post-pandemic elevation of the rate to mean 300,000+ departing each year, and perhaps many more.
For what it is worth, the Wall Street Journal reports the departure of a mere 300,000 public school staff of all kinds (and therefore, not all of them teachers) in the entire February 2020 to May 2022 period.
It is possible that there are other, different estimates--but so far I have not seen any other statistics purportedly addressing the matter as delineated in precisely that way (departures from K-12 teaching over the whole time frame).
In their absence, taking that claim at face value, it would seem that quits have actually been few in number relative to a normal period.
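For clarity's sake, a minimal sketch in Python of the comparison being made here--every input is one of the rounded, back-of-envelope estimates above, not an authoritative statistic:

# Normal-year teacher departures versus the pandemic-era figure
# reported by the Wall Street Journal, using the estimates above.
teachers = 3.5e6      # approximate number of U.S. K-12 teachers
normal_rate = 0.08    # ~8 percent leave the profession in a normal year

print(f"Normal-year departures: ~{teachers * normal_rate:,.0f}")  # ~280,000

# WSJ: ~300,000 public school staff (not all of them teachers) left
# between February 2020 and May 2022--roughly 27 months.
wsj_departures = 300e3
print(f"Annualized pandemic-era departures: ~{wsj_departures * 12 / 27:,.0f}")  # ~133,333

Even attributing every one of those departures to teachers, the annualized figure comes to less than half the normal-year baseline--which is just how far the "mass exodus" framing is from the available numbers.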
I have to admit that I find what may actually be a decline in departures from the profession on the part of teachers deeply counterintuitive: given the elevation of quit rates in general that has given rise to talk of a Great Resignation; given the reality that even before the pandemic there was considerable discontent with conditions in the teaching profession; and given that the stresses of the pandemic must have had some negative effect on the willingness of those in the job to stay in it.
However, it may be the case that other factors are offsetting all that.
One may be what has been said by some analysts of the Great Resignation--that much of it is about people leaving a job they don't like to take another, more attractive, job, in the same line of work. Teachers who quit a job at one school and take up a job at another school they think will be more congenial would not count as leaving the profession--even as their departures create difficulties for the schools they left, with the less attractive places to work plausibly suffering disproportionately (and indeed, we are told that poorer school districts, and rural and urban districts, are suffering relative to affluent suburban districts offering better pay and conditions).
Another factor may be that, as their relatively high unionization, and perhaps rising militancy, suggest, many teachers have some inclination to try and bargain collectively for a better deal rather than take their individual chances with the market (certainly as against private-sector computer programmers).
Moreover, one should remember--even if not every writer of such articles seems to do so--that the sense of the situation being one of crisis is as much a matter of what people have been told might happen as of what actually has happened. If there has as yet been no "mass exodus" from the profession, some, pointing to an abundance of polling data in which teachers report that they are seriously thinking about quitting, warn that these accumulating stresses will come to a head in exactly that fashion. The significance of the data is difficult to ascertain, as it is far from clear how many can or will act on such thoughts--but it seems safe to say the entry into this more speculative territory has permitted that much more room for the play of fear and prejudice about the matter, with results all too predictable across the ideological spectrum.
Indeed, I had to spend a lot of time trudging through a lot of the coverage before I came across Derek Thompson's piece in The Atlantic baldly asserting that "[c]omprehensive national data on teacher-turnover rates (the share of teachers who quit each year) . . . are simply not available, or don't go back far enough to tell us whether this year is different."
In other words, Thompson reports that the statistics that would give us the standard just don't exist because no one has bothered to collect the data, or at least, compute the relevant information.
However, I think the key word there is "comprehensive." While a rigorously constructed time series for the country as a whole would be ideal I have found it easy to locate patchier data covering particular years that may or may not be representative, and particular localities, which suggest a rate of perhaps 8 percent a year. As there are some 3.5 million primary and secondary school teachers in the U.S., that would work out to something in the range of 250,000-300,000 teachers leaving the profession in a normal year. This pattern would seem confirmed in a Bureau of Labor Statistics projection that some 270,000 primary and secondary-school teachers would be leaving the profession annually in 2016-2026.
If we go by those estimates, then the standard for a post-pandemic elevation of the rate would be something over 300,000 teachers departing each year, and perhaps many more.
For what it is worth, the Wall Street Journal reports the departure of a mere 300,000 public school staff of all kinds (and therefore not all of them teachers) over the entire February 2020 to May 2022 period.
It is possible that there are other, different estimates--but so far I have not seen any other statistics addressing the matter delineated in precisely that way (departures from K-12 teaching over that whole time frame).
In their absence, taking that claim at face value, it would seem that quits have actually been few in number relative to a normal period.
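For the reader who wants to check it, the back-of-the-envelope arithmetic can be set out in a few lines of Python--a rough sketch only, with the 8 percent rate, the 3.5 million teacher count and the Journal's figure treated strictly as the loose estimates described above:

teachers = 3_500_000          # approximate number of U.S. K-12 teachers
normal_rate = 0.08            # the ~8 percent annual departure rate suggested above

normal_departures = teachers * normal_rate
print(f"Departures implied in a normal year: {normal_departures:,.0f}")
# -> 280,000, consistent with the 250,000-300,000 range and the
#    BLS projection of ~270,000 a year for 2016-2026

# The Journal's figure covers February 2020 to May 2022, about 27 months.
span_years = 27 / 12
expected = normal_departures * span_years
reported = 300_000            # all public school staff, not just teachers
print(f"Expected at the normal rate over that span: {expected:,.0f}")
print(f"Reported by the Journal (all staff): {reported:,}")
# -> roughly 630,000 expected for teachers alone, against 300,000
#    reported for all staff--less than half the normal pace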
I have to admit that I find what may actually be a decline in teacher departures from the profession deeply counterintuitive--given the elevation of quit rates in general that has given rise to talk of a Great Resignation; given that even before the pandemic there was considerable discontent with conditions in the teaching profession; and given that the stresses of the pandemic must have had some negative effect on the willingness of those in the job to stay in it.
However, it may be the case that other factors are offsetting all that.
One may be what has been said by some analysts of the Great Resignation--that much of it is about people leaving a job they don't like to take another, more attractive job in the same line of work. Teachers who quit a job at one school and take up a job at another school they think will be more congenial would not count as leaving the profession--even as their departures create difficulties for the schools they left, with the less attractive places to work plausibly suffering disproportionately (and indeed, we are told that poorer school districts, and rural and urban districts, are suffering relative to affluent suburban districts offering better pay and conditions).
Another factor may be that, as their relatively high unionization, and perhaps rising militancy, suggest, many teachers have some inclination to try and bargain collectively for a better deal rather than take their individual chances with the market (certainly as against private-sector computer programmers).
Moreover, one should remember--even if not every writer of such articles seems to do so--that the sense of the situation being one of crisis is as much a matter of what people have been told might happen as of what actually has happened. While there has as yet been no "mass exodus" from the profession, some, pointing to an abundance of polling data in which teachers report that they are seriously thinking about quitting, predict that these accumulating stresses will come to a head in exactly that fashion. The significance of the data is difficult to ascertain, as it is far from clear how many can or will act on such thoughts--but it seems safe to say that the entry into this more speculative territory has permitted that much more room for the play of fear and prejudice about the matter, with results all too predictable across the ideological spectrum.
Tuesday, August 9, 2022
Top Gun 2's Box Office Run: Further Thoughts
With the second week of August already upon us it seems fair to say that Top Gun: Maverick has taken the box office crown for the season (and perhaps the year). Not only did it have an extraordinary opening weekend, exceeding even the high expectations for it by taking in over $160 million over the 4-day period; its legs have also been extraordinary, especially late in its run, the film's grosses eroding only slightly from week to week. The result is that where merely doubling its opening weekend would have been respectable, it has already grossed four times that sum, with its take still climbing--according to Box Office Mojo, some $662 million in the bank as of last Sunday, after a $7 million gross in its eleventh weekend in theaters (a mere 17 percent down from the $8.4 million of the tenth weekend). Where even fairly late into its run I had expected it to top out at a (spectacular) $550 million, that late-stage resilience now makes it appear quite capable of finishing north of the (even more spectacular) $700 million mark.
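To make that projection concrete, here is a toy calculation in Python. Treat the assumptions that weekend grosses keep decaying at the eleventh weekend's ~17 percent rate, and that weekends account for about 65 percent of a week's take, as illustrative guesses on my part, not reported figures:

gross_so_far = 662_000_000    # per Box Office Mojo, through the eleventh weekend
last_weekend = 7_000_000      # the eleventh weekend's gross
r = 1 - 0.17                  # assume weekend grosses keep shrinking ~17 percent a week

# Sum of all future weekends as a geometric series: 7M*r + 7M*r**2 + ...
remaining_weekends = last_weekend * r / (1 - r)
print(f"Remaining weekend grosses: ${remaining_weekends:,.0f}")   # ~$34 million

# Weekends are only part of a week's take; assume (again, purely for
# illustration) that they account for about 65 percent of it.
projected_final = gross_so_far + remaining_weekends / 0.65
print(f"Projected final gross: ${projected_final:,.0f}")          # ~$715 million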
Just why has this film proven such an outlier? Certainly it has helped that the media has been very much on its side. Nevertheless, the public had to be responsive to the push--and it was rather more responsive than I expected, given what seemed to me the film's many liabilities in the present market (the sheer passage of time since the first Top Gun, the exhaustion of '80s nostalgia after so many years of its exploitation, the habituation of the public to more fantastical and CGI-driven blockbusters, etc.). Where this is concerned some have made much of the fact that Paramount eschewed the recent practice of announcing a streaming date for the film, discouraging a critical part of the audience from just waiting and catching it at home a few weeks later, and thereby increasing theatrical attendance. Perhaps. However, it seems to me that there is at least one obviously important factor generally getting overlooked, namely the weakness of the competition this summer. Instead of the eight-plus big action movies we typically saw each summer through 2019, the summer of 2022 had just four, with the other three less than stellar performers by summer champion standards--helping clear the way for Top Gun 2 to do as well as it did by encouraging repeat business that would probably not have happened had there been more choices for fans of big-screen action.
The Summer 2022 Box Office: A Hopefully Not-Too-Early Assessment
With August upon us the summer film season is fast drawing to a close--the more so in that the last really big release, Thor: Love and Thunder, is already a month behind us--making it not too early to offer an assessment of how the season has gone, one that will hopefully not look premature and wildly off the mark come the "official" end of summer on Labor Day weekend.
A bit of simple math indicates that in the months of May, June and July the North American box office took in some $2.9 billion. How does that stack up against prior years? Simply adjusting the Box Office Mojo numbers for inflation as reported by the Bureau of Labor Statistics, one gets an average of about $4.2 billion for the May-July period in 2015-2019--with that figure arguably conservative, because bumping the first summer release back into late April, as an ever more aggressive Disney-Marvel did with Avengers 3 and 4 in 2018 and 2019, takes a big chunk out of the summer's earnings. (Avengers: Endgame picked up $427 million in the last five days of April 2019, which if added to that year's total would have raised it by about an eighth--from $3.35 to $3.8 billion, before the adjustment for today's higher ticket prices--and of course 2022 saw nothing of the kind.)
The result is that this year the box office did about two-thirds as well as the average for those years. It might be added that July did better than that--its $1.13 billion take about three-quarters of the $1.46 billion July average for 2015-2019. That is far superior to how it did back in 2021, when MCU movies, F9 and the rest made at best half their expected earnings (and the whole three-month period, with just under $1.2 billion in current dollars, barely equaled this past July's earnings). But to claim that the situation has returned to the pre-pandemic norm would still be exaggerating things a good deal.
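For the reader who wants to check those percentages, the arithmetic is simple enough to lay out in a few lines of Python, using only the totals just cited:

summer_2022 = 2.90            # May-July 2022 North American box office, $ billions
summer_avg = 4.20             # inflation-adjusted May-July average, 2015-2019
print(f"Summer 2022 vs. the pre-pandemic average: {summer_2022 / summer_avg:.0%}")
# -> 69 percent, i.e. roughly two-thirds

july_2022 = 1.13
july_avg = 1.46
print(f"July 2022 vs. the pre-pandemic July average: {july_2022 / july_avg:.0%}")
# -> 77 percent, i.e. roughly three-quarters

endgame_april = 0.427         # Endgame's take in the last five days of April 2019
summer_2019 = 3.35            # summer 2019's reported take before that addition
print(f"Endgame's April take as a share of summer 2019: {endgame_april / summer_2019:.0%}")
# -> 13 percent, i.e. roughly an eighth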
Of course, in considering those figures one has to admit that the release slate was not quite the same as in those prior years. Where 2015-2019, on average, saw eight really big, brand-name, live-action action-adventure films playing in theaters in the relevant period (DC/Marvel superhero stuff, the spy-fi of the Mission: Impossible/Jason Bourne/Fast and Furious franchises, etc.), 2022 had only four by my count (Dr. Strange 2, Top Gun 2, Jurassic World: Dominion, Thor 4), and of these only Top Gun 2 went "above and beyond" expectations (Dr. Strange was merely respectable, Jurassic World 3 the weakest earner in its trilogy, and Thor arguably an underperformer). And it seems to me there is at least an argument to be made that the comparative weakness of the summer slate reflects justifiable caution on the part of the studios as much as it constitutes a cause of the summer's lackluster grosses by pre-pandemic standards.
Has Top Gun 2 Really "Saved the Movies?"
In Indiewire Tom Brueggemann made the case that Top Gun 2 has "saved the movies." The argument goes that in a Hollywood which has thrown over stars for franchises--and these overwhelmingly of the CGI-loaded "media sci-fi" type (comic book superheroes, Star Wars, etc.)--Top Gun: Maverick has scored far and away the summer's and the year's biggest success with an old-fashioned star-driven vehicle without superhero or similar trappings, suggesting there is still room for other kinds of content.
It is an interesting idea. But I couldn't help noticing that where those other films Mr. Brueggemann cited as having previously achieved the feat--Easy Rider, Jaws, Star Wars, etc.--generally brought something new to the screen, Top Gun 2 was pointedly old-fashioned--and, I think, hardly proved that its old-fashioned success is replicable. It seems telling that we are talking about a movie starring not some newly minted star (there aren't any, and it's far from clear that there can be) but a star of the '80s who has sustained his career in part through franchise films (the Mission: Impossible sequels keeping his name on the marquee through thick and thin, while in 2017 he got involved with Universal's Dark Universe via the remake of the remake of The Mummy), without which he might not still be a star. It also seems telling that the film is itself an action movie sequel milking '80s nostalgia, not so different on that level from, for instance, many of Michael Bay's Transformers movies--what is old here is less completely a throwback than it may seem at first glance.
Indeed, I suspect that rather than bringing back the old-style star-driven film the movie will be remembered as a last hurrah for that type of film--not least because I do not see the studios rushing back to the old star-driven model, put off by its comparative unpredictability as well as by the lack of prospects as promising as a Tom Cruise-starring sequel to Top Gun. (As we have seen time and again, putting other '80s stars into follow-ups to their hits of that era--Stallone in a new Rambo, Schwarzenegger in a new Terminator--does not get the studios very far, and putting Cruise in Rain Man Revisited or Jerry Maguire II would not either.) In fact, its principal legacy will probably be to induce Paramount to convert Top Gun into yet another franchise while it is still hot--with what result, I cannot say.