I object to the role the term "doomscroll" plays in our conversations about the news.
The term's users claim that the unhealthy effect of online news perusal is a matter of the Internet user seeking out the negative. This slights the reality that an obscenely crass and criminally irresponsible news media is engaged in relentless fear-mongering for the sake of exploiting the fear and anger it creates--to say nothing of promoting its assorted political agendas. It slights, too, the reality that the media has much to work with in fostering this situation, in part because it has served those agendas so well, and created such confusion and apathy in those who have anything to do with it. (Indeed, it is quite happy to give platform space to doomists of various kinds who keep insisting that the public is not scared enough, effectively defending its exploitative and apathy-extending behavior.) To the extent that individuals do end up chasing the negative, they have been conditioned to do so by the ceaseless barrage of bad news intended to terrify, to infuriate, and ultimately to break them.
The criticism could and should go where it is due--though those who like to accuse Internet users of doomscrolling instead prefer to adhere to the eternal principle of the sniveling conformist, that with all of the power comes none of the responsibility.
Saturday, April 20, 2024
Is the Internet Making People Dumber? (Short Answer: Yes)
Many have long expressed concern that the Internet has had a negative effect on its users' habits and mental capacities, with the obvious issues being the capacity to concentrate on anything at length, and to cope with text, particularly of more complex kinds--for various reasons (screen-based devices reward attention less well than paper books, the Internet encourages the skim over the close read, etc.).
All this seems to me very plausible--and it also seems plausible that, rather than technology progressing so that the disadvantages of screen-based devices were reduced, as I had hoped, the opposite has happened, because of how the Internet has changed. It has become more oriented toward shorter items over longer ones (as with the way "microblogging" through Twitter replaced real blogging), and less text-based and more audiovisual (as with how blogging generally has been replaced by vlogging), with all that implies for users' habituation to doing anything but reading with attention for long (even the "six paragraph blogpost" that the "genius" Sam Bankman-Fried demanded everything be reduced to likely getting beyond many of them).
And as if that were not enough, there is the way in which any prolonged, deeper usage has come to entail ceaseless interruption, ceaseless diversion, ceaseless breaks in any train of thought by endless popups extorting from users "agreement" to allow cookies, disable ad blockers, log in, pay up, etc., for the privilege of looking at what they were likely pushed to look at thanks to search engine optimization, "sponsored" results and the rest--in spite of the fact that the search results are ever more likely to be pure garbage. (Never mind the kind of research I do for books such as these; simply following the news has become far more of a grind than it was before.)
This endless assault on the Internet user's thought process, with all its intellectual and even psychological effects, seems all too likely to corrode their ability to think at all should they subject themselves to very much of it.
But one should not expect the (apropos of a more fitting but far more offensive term) courtiers of Big Tech who write the news--the kind of people who said Sam Bankman-Fried was a genius--to admit it. Instead they dutifully defend everything that has been done, and will be done (they are always sure paywalls work, always sure the problems must be the work of something other than the vicious exploitation of the web user, etc.), while sanctimoniously attacking everyone who thinks the Internet ought to actually be useful as simply scum who want "free stuff."
Average Shot Length, Beyond the Action Movie
Considering the development of cinematic action, one finds that a major aspect of the cinematography of the action film is the use of shorter shots, and closer shots, intended to intensify the effect of fight scenes and the like.
Still, it struck me, while watching a Hallmark film--a simple made-for-TV production remote from any pretension to being an action film, or to technically dazzling the viewer in any way--that the shot length was pretty short, mainly because of the use of close shots. All that happened in the scene was two people talking--but the camera cut from one face to the other whenever each of the actors said their line, because rather than frame the two faces together in the same shot there was just a close shot of one face at a time all the way through the dialogue.
One can achieve certain effects that way--but one can also achieve certain effects with a shot of the two faces together as the conversation goes on.
It seemed to me that, rather than there being a compelling narrative or visual reason for preferring one cinematographic approach to the other, the makers of the movie opted for the alternation between close shots of whoever was speaking out of habit--habit formed in an era in which short, close shots had simply become standard.
Power, Responsibility and "Personal Responsibility"
It is a truism that with power comes responsibility. (Uncle Ben's formulation is particularly popular, but only one relatively late wording of an age-old theme.)
Most people speaking it (who are not Uncle Ben) are simply paying the tribute to virtue that is hypocrisy.
After all, the person of conventional mind is very comfortable with a hierarchical and unequal order of things in which power is concentrated at the top, but it is those at the bottom endlessly held responsible.
It is the latter, not the former, for whom phrases like "personal responsibility" are meant--Cory Doctorow being all too right when he notes that "personal responsibility" is code for "total lack of empathy," for "respectable" opinion is never short of explanations for why those at the bottom deserve none.
The Criteria for Claqueurs
Where being a claqueur is concerned it seems that two traits are of particular help.
One is having no memory at all, because with no point of reference one has no standards--and thus finds it much easier to pretend that everything put in front of them is a staggering work of heartbreaking genius, then totally forget it so that they can praise the next thing just as highly, pouring forth hyperbolic praise upon hyperbolic praise.
The other trait is the eternal sneer of the insufferable "cool" person when it is useful for them to put something down--because that is the nature of their hire in this case (the reviewers know when a hatchet job is expected, and deliver it accordingly), but also because when everything is a remake it is necessary to put down the old thing for the sake of elevating the new thing. ("Yeah, it's a remake. But the old one was lame." Like with how they beat up on the pre-reboot James Bond movies for the sake of promoting the Daniel Craig-starring movies.)
These two traits, independently and certainly in combination, generate endless stupidity--but it is of course the promotion of stupidity that the claqueur mainly serves in this era of debased standards.
Ted Gioia's Odyssey as a Facebook User
I have in the past found Ted Gioia's music journalism quite worthwhile--as with his pieces on the degeneration of music criticism into lifestyle reporting, and on the recent trend of old music outselling new. However, his recent item about the decline of Facebook usefully took up rather a different subject. Where the word "enshittification" has come into wide and useful usage, here we have an in-depth personal account of the enshittification of a major platform from the standpoint of a longtime user, one that does that much more to clarify just what enshittification is all about--well worth the time of anyone interested in the subject.
The Irony of Jaws 19
In its first act, Back to the Future Part II was set in the then-futuristic-seeming year of 2015. Some aspects of that future (mostly the attractive ones) have come to seem fairly dated--like the idea of flying cars running on garbage, hoverboards, and weather reports accurate down to the minute. Others have proven more perceptive--like how '80s nostalgia would be big in the '10s, down to a fondness on the part of some for retro gaming.
Others were a mix of both. In its depiction of "Jaws 19" playing at theaters the film was entirely right about how shameless Hollywood would become about indulging its addiction to franchises. But it was, ironically, wrong about the Jaws franchise being the epitome of that. After Jaws 4 flopped back in the summer of 1987 (not the only fourquel to do so that box office season) Hollywood pretty much left Jaws alone, putting an end to one of the franchises that launched it along its current, high concept-dominated path, even as it stuck like glue to just about every other franchise of those same years, so that the Star Wars and Superman and Star Trek and Alien movies just kept on coming.
"Realism" in a Misanthropic Era
There is some disagreement about exactly when literary realism as we know it began, and what it includes, but it is safe to say that the term at least refers to a modern literary and broader artistic movement premised on a scientific view of life, one that bases its writing on a rigorous and copious use of observable fact rather than on supposedly transcendent truths, romanticism, myth, or artificial conventions.
These days, however, the term has taken on a particular baggage, with film critic David Walsh, as is often the case, worth citing here. Writing of one film that had been described as "realistic" he observed that it had become common to simply use the term to mean "presenting humanity in the dimmest possible light." Thus "[f]ilm writers and directors seek to outdo each other . . . in their depictions of people's depravity and sadism"--while displaying an "almost grateful acceptance of the filthiness." One can add to what Walsh says that they delight in the display, rubbing what they present in the viewer's face with a "Welcome to the real world!" swagger as they endeavor to destroy any hope the reader, the viewer, may have of there ever being anything better, not least by implicating them in the evil they see. (Presenting, for example, the horrors of war, and what some think must be done to prepare for them, they say "You want me on that wall." Presenting horrific forms of economic exploitation they say "You are a beneficiary of this, too, and so forfeit your right to say anything about it.")
There are certain political assumptions there--very dark assumptions about human beings and the society and universe in which they live (which, past a certain point, may actually be quite contrary to the scientific, rationalist character of realism)--and, one might suggest, at least grounds for an accusation of political manipulation (with such manipulation, again, at odds with any genuine realism).
In an era in which a modicum of comprehension of political ideology (and political rhetoric) could be taken for granted among mainstream cultural commentators, and a meaningful range of views existed among them, those assumptions would be recognized and acknowledged and challenged, certainly by critics not sharing them. (Just read Roger Ebert's old reviews and you will see what we used to get in even a publication like the Chicago Sun-Times.) However, as we are reminded again and again, this is not such an era, but the extreme opposite of one--as Walsh makes clear in the rest of a review whose importance extends far beyond the appraisal of one movie.
The Distastefulness of Slang
It seems a thing to be expected that, as life goes on, humans face new realities, and situations that make them reckon with old realities they may not have noticed before, or look at those old realities in new ways--and that they will come up with terms to go with those new or new-old realities, situations, states of being, etc. Some of these terms may be coinages by intellectuals engaged in scientific, academic or philosophical inquiry, but many are likely to emerge from the bottom up, in the colloquial usages of persons with no formal standing of that kind--in a word, slang.
Still, there is something very off-putting to many about a great deal of slang, with this arguably extending beyond the distaste of the old and stuffy for the new, or the annoyance of those who are not "with it" at constantly being bewildered by such neologisms.
Why is that?
One explanation may be that, of the colloquialisms people come up with, and even those that come into wide use, a very high proportion are unsatisfactory for whatever reason--perhaps not actually conveying very much, or becoming irritating in a hurry. Certainly this would seem to account for how much slang proves ephemeral, the usage seemingly everywhere for a while and then disappearing, such that it afterward appears only as a retro or ironic phenomenon. (I certainly don't miss the '90s tendency to make a false statement and then add "NOT!" at the end.)
Another explanation would have to do with what many find the repellent quality of slang in our particular day. One possibility is that slang reflects its era, and that an illiterate, intellectually stultified, mean, crass, crude era of backsliding and backwardness and barbarism--of sadism and stupidity and self-satisfied poshlost regarding the whole thing, such as naturally goes with the worship of personal power and its brutal exercise so emblematic of our time--will produce slang to match. This is reflected in, among much else, the extreme disrespect that has become our default mode of dealing with each other, with every use of "Whatever!" as a dismissive one-word sentence reminding us of the fact, and making those who have any intelligence or sensitivity left in them shrink in disgust. Meanwhile those of us who can understand the meaning of such things may remember what Captain Kirk had to say about the use of language in 1986--since which year things have grown far, far worse for those struggling for cultured speech.
'90s Nostalgia: A Personal Note
I remember living through the '90s and finding them appalling--an era of backwardness and stupidity as the worst of the "Greed is good" '80s just went on exploding, and hope of anything better seemed at a historic low point, all as far too many opted for the stupid self-satisfaction of ironic poses in the face of it all. (I have said in the past that the irony at least indicates an acknowledgment of something wrong, as against the sheer obliviousness we have seen since, but it was not the healthiest reaction--and did its part to make things worse.)
But hey, there were The Simpsons, and Seinfeld, and any number of other such pleasures--and like many others I have occasion to remember them and laugh all these years later, a good deal more than I am able to do at what pop culture offers me these days.
Nostalgia: A Few Words
There is a well-known tendency on the part of critical commentators to criticize feelings of nostalgia. They point out that the past was not a kind place to most, and attraction to that past can seem to them to elide or even validate that unkindness--with misanthropes of the postmodernist variety perhaps especially likely to point out such things.
Still, consider the common objects of nostalgia--typically bits of pop culture. Films, shows, music and the like. One can see in them not a forgetfulness of the badness of the past, but fondness for those things that helped get us through the bad times, remembering which can be a reminder that we will get through the bad times in which we are living now.
There seems to me something life-affirming in that.
Things Composition Instructors Do That Annoy Students: Shadow Grades
In teaching composition, certainly where I was trained in and first did the job, it was common procedure to employ "shadow grades" on certain writing assignments. What this meant was that students would write papers, and the instructors would evaluate them and provide feedback, with the "shadow grade" being the grade the paper had earned so far--the understanding being that revision was assigned, and the grade improvable. That is to say: here was their grade, and in the written notes, here was what they had to do to turn whatever they got into the grade they actually wanted.
All of this seems reasonable enough, easy enough to understand, and even comfortable for the student, who is given a chance to fix what may have needed fixing. But it did not always seem that way to the student.
To be quite honest, some did not seem to fully grasp the system. But even where they did, what was real to them was the grade they had now, not the hypothetical better grade that was by no means guaranteed--the more so in that many found, or said they found, anything but an "A" a rude shock. ("Every English teacher I've ever had has always given me an 'A!'" I heard many a time--while I can say that I did see weeping right there in class, and even tantrums, over Cs, in at least one case on the part of a student who was clearly way, way, way past eighteen and any claim to forgiveness for callowness on the score of age.) And of course, if the gap between the grade they hoped for and the grade they got was wide, and they had enough experience to know just how grinding revision can be, they may have felt despair--and indeed, I found that students commonly declined to do anything more than make simple corrections to grammar and spelling. (For instance, if the instructor tells the student that what they have is not a unified essay but a collection of disconnected paragraphs, and that they will have to establish a clear thesis and either connect or discard the material that does not fit with it, they are unlikely to go about that to any great degree.)
Do I pretend to have a tidy fix for the problem I am raising that instructors in such classes can hope to implement with the blessing of the higher-ups? Of course not. Having to revise--and even rewrite--goes with the territory. And students do want to know what grade they are likely to get in the end. It is, after all, what most of them care about--because even if their elders have the luxury of smarmily telling them "It's not what grade you get, it's what you learn!" no one will much care "what they learned" if their grade point average slips below the minimum threshold required by their scholarship, or if they fail the class and are required to take it over again, perhaps paying a stiff financial penalty on top of the setback involved in taking the class the second time around. Indeed, a poor grade may mean enough hardship that they have to drop out of school--in which case, there goes their chance to learn something.
That admitted, I do see, if not a solution, then at least a way in which we could be doing better: it would not all be so stressful were students not sent in as underprepared for college, academically and in other ways, as they have tended to be--a problem exacerbated by the credentialing crisis that people in authority seem happy to go on feeding (and giving economists Nobel Prizes for justifying); were a college education not so expensive and thus so precarious for so many; and perhaps, were our grading systems less punitive.
Alas, such things lie far, far outside the purview of anyone likely to be teaching an actual composition course.
All of this seems reasonable enough, easy enough to understand, and even comfortable for the student, who is given a chance to fix what may have needed fixing. But it did not always seem that way to the student.
To be quite honest, some did not seem to fully grasp the system. But even where they did what was real to them was the grade they had now, not the hypothetical better grade that was by no means guaranteed--the more in as many found, or said they found, anything but an "A" a rude shock. ("Every English teacher I've ever had has always given me an 'A!'" I heard many a time--while I can say that I did see weeping right there in class, and even tantrums, over Cs, in at least one case on the part of a student who was clearly way, way, way past eighteen and any claim to forgiveness for callowness on the score of age.) And of course, if the gap between the grade they hoped for and the grade they got was wide, and they had enough experience to know just how grinding revision can be, they may have felt despair--as indeed, I found that students commonly decline to do anything more than make simple corrections to grammar and spelling. (For instance, if the instructor tells the student that what they have is not a unified essay, but a collection of disconnected paragraphs, and that they will have to establish a clear thesis and either connect or discard the material that does not fit in with it, they are unlikely to go about that to any great degree.)
Do I pretend to have a tidy fix for the problem I am raising that instructors in such classes can hope to implement with the blessing of the higher-ups? Of course not. Having to revise--and even rewrite--goes with the territory. And students do want to know what grade they are likely to get in the end. It is, after all, what most of them care about--because even if their elders have the luxury of smarmily telling them "It's not what grade you get, it's what you learn!" no one will much care "what they learned" if their grade point average slips below the minimum threshold required by their scholarship, or if they fail the class and are required to take it over again, perhaps paying a stiff financial penalty on top of the setback involved in taking the class the second time around. Indeed, a poor grade may mean enough hardship that they have to drop out of school--in which case, there goes their chance to learn something.
Still, I admit that I do see, if not a solution, then at least a way in which we could be doing better. Specifically, it would not all be so stressful were students not sent in as underprepared for college, academically and in other ways, as they have tended to be--a problem exacerbated by the reality of the credentialing crisis that people in authority seem happy to go on feeding (and giving economists Nobel Prizes for justifying); were a college education not so expensive and thus so precarious for so many; and, perhaps, were our grading systems less punitive.
Alas, such things lie far, far outside the purview of anyone likely to be teaching an actual composition course.
School as Hoop-Jumping, and the Failure of Education
I have from time to time discussed here my experience of college teaching--and where college teaching often goes wrong, certainly in those areas where I am most familiar with it (the teaching of college English). Where all this is concerned one can point to failures in the design of curricula and of methods (as with composition teachers who mistake themselves for Zen Masters and run their classes in line with their delusions).
However, that is far from the only source of the problem, with what the student brings with them often an issue as well. Most of the criticism we hear about this has to do with their K-12 preparation, which is a matter not just of what schools fail to instill in their students, but what they do instill in them instead--a cynicism toward education.
It has long been understood that, besides any narrowly academic content, our schools (designed to prepare workers for employment in nineteenth century mills; education for two centuries ago, today!) go to great lengths to inculcate habits of punctuality, self-restraint, tolerance of boredom, and above all, deference to authority figures. To many it may seem that they do not succeed in all this so well as they ought. However, they succeed in this to a rather greater degree than they do in inculcating a respect for knowledge and an understanding that students may get something useful to them here. The result is that the so-called "hidden curriculum" of habit inculcation can seem the real curriculum, the only curriculum, with the rest of what they are supposed to learn obscured, and everything they do really seeming to be about placating the authority figures messing with their lives.
At higher levels of education, where more is demanded of the student intellectually--where unless they can take a genuine interest and make a genuine effort nothing much can be expected--this gets to be a real problem, perhaps especially in classes in the humanities that get so little respect. It seems to me that it should be fairly obvious that, for instance, a composition class intended to aid a student in developing the skills of reading closely, thinking critically, and communicating clearly, can have value to everyone--but no matter what anyone tells those ruined for such training by prior experience, they believe that it is just more hoop-jumping, to be treated accordingly.
What Ever Happened to the Debate About Euthanasia?
The recent "assisted suicide" of former Dutch Prime Minister Dries van Agt and his wife Eugenie put euthanasia back into the news--for a little while, at least.
Back in the 1990s, and their overhang into the early 2000s, the topic of euthanasia and all associated with it--like the philosophical, moral and legal question of whether human beings have a "right to die"--seems to have been much more commonly discussed than in the years since. Indeed, it seemed a "hot button" culture war topic comparable to abortion, sufficiently so that South Park creators Trey Parker and Matt Stone devoted an episode of their short-lived sitcom That's My Bush! to it.
Little of it has been heard since--but I suspect it is not because the matter was somehow "settled" in some consensus. Quite the contrary. The evidence indicates that a majority, perhaps a very significant majority, of Americans support the right to die, yet taking the fifty states as a whole very little has changed legally here.
Why a gap so vast, and at the same time so unremarked?
One possibility is that the issue was crowded out by a crisis-packed century, and the explosion of conflicts that were previously suppressed rather than settled. (Remember how a certain sort of commentator was prone to claim that the country had become "post-racial" not so many years ago? They don't say that now.)
In that it may have helped that the existing situation aligned with the preferences of the right, especially given its success in driving the agenda in American politics in recent decades. There was no equivalent to Roe v. Wade here for them to protest against--and so, as far as the media was concerned, no argument, making it that much easier for the issue to fall by the wayside.
The "'80s Jerk," Again
The last time that I raised the issue of the "'80s jerk" my explanation of the phenomenon was that in that earlier era of film-making we still had lots of grounded, smaller-scale stories where the stakes are personal rather than galactic and petty meanness or ambition could be significant plot points in movies viewed by large numbers of people.
Yet I can also see a possibility of there being more to the matter than that--reflecting the rightward turn of politics and popular culture in that time. One factor is a strengthening of the tendency (admittedly, probably always stronger in America than elsewhere in the Western world) to think less in terms of society and the way it is structured and more in terms of individuals, and to treat the obstacle to the hero's realization of their goals as a matter of entirely individual villains. ("It's not the apple barrel that's rotten, just a few bad apples," the conventional always say. Like Sigourney Weaver being all that's really wrong with Wall Street, so that Carly Simon sings "Let the river run!" over the end credits of Working Girl.)
Another may be that the "'80s jerk" was a vehicle for the selective bashing of authority--the kind of authority that those skewing rightward did not much like, and which tends to be the equal and opposite of the ultraconformist "maverick" the conventional so love. (Thus in libertarian Ivan Reitman's Ghostbusters we see the Environmental Protection Agency presented in a very bad light indeed--in contrast with the protagonists, who departed academia for private entrepreneurship and save the day.)
So far as I can tell the essential outlook endures, even as the stories that big movies tend to tell have not.