Tuesday, December 3, 2024

A Remembrance of Mike Tyson's Punch-Out!! After the Tyson-Paul Bout

As I remarked not long ago, in past decades, and certainly the late '80s, sports games loomed a lot larger within the world of video gaming, just as the world of sports, and its stars, loomed larger within pop culture generally. This all went for boxing, with the most famous boxing game of all time, Nintendo's Mike Tyson's Punch-Out!!, like a lot of sports games of the day trading on the cachet of a then-reigning superstar of the sport to sell a game that really didn't have all that much to do with the celebrity in question, or even the sport. (As others have observed many a time, the game was less a simulation of boxing--not so easy to provide in any meaningful sense to a player using a controller with a mere four buttons and a crosspad--than a puzzle game/rhythm game in which winning the bouts was a matter of figuring out, or more likely being told or reading in a strategy guide, some trick to defeating your opponents, like taking an opponent's opening of their mouth as a signal to punch them in the gut, then mastering the particular combination of button pushes needed to do the trick.)

Over the course of that game the player fought a succession of matches with a string of entirely imaginary figures on the way up to their bout with "Iron Mike," up at the absolute pinnacle of the boxing world--with the notorious difficulty of the game, and of that final match, reflecting, and for many reinforcing, Tyson's image as a giant of the sport.

Of course, Tyson's star was not long in falling after he achieved his "undisputed heavyweight champion of the world" status, and the game's release. His defeat by knockout by James "Buster" Douglas, his conviction and prison sentence, his effort to regain his standing in the boxing world that is mainly remembered for his biting off a piece of then-heavyweight champion Evander Holyfield's ear in the ring, and, just like so many post-fifteen-minutes superstars before him, his acquiescence in a career of self-parody to pay his bills (exemplified by his voicing himself in seventy episodes of Adult Swim's The Mike Tyson Mysteries), are all notorious. Still, in the wake of Tyson's ridiculous and revolting "bout" with the similarly ridiculous and revolting Jake Paul (eight two-minute rounds of "elder abuse" by an ex-Disney Channel sitcom star pretending to be a big brute barbarian), Tyson has never seemed further from the days when Nintendo so flattered him in that edition of one of its most classic games--while I find myself reminded yet again that just as in the '90s Tyson was already a scandal and a disaster, so did the country already seem to be going through a nervous breakdown, one which has just gone on from decade to decade, bringing us to the point where few even possess the ability to see any significance in this all too symptomatic episode anymore.

The Online Media Bias Check Web Sites

As we have found ourselves deluged by information and opinion online from an unprecedented number and diversity of sources, it has been appropriate that we have also seen a number of web sites emerge which have as their purpose helping web users evaluate the information with which others are presenting them. Thus do sites like Snopes provide a fact-checking service. And thus do sites like Media Bias/Fact Check rate sites according to their usually undeclared political tendencies.

Again, I think this an essentially positive development--but it seems to me that the sites are less useful than they might be because of the narrow and muddled way in which we speak of "right" and "left," "conservative" and "liberal," and so on, which neglects political philosophy and "social vision," and such issues as the economy or foreign policy, to place the stress on one's positions in the culture war, and inclination toward one rather than the other of the two major parties. (Pro-life and Republican-leaning=conservative/right, pro-choice and Democratic Party-leaning=liberal/left, they say, as if there were no more to political life than that!)

I imagine that the editors of many of these sites do not question the received usages, but even those who do find themselves facing a dilemma. Do they go by the standards commonly used in the media and in political discourse generally, and which the public generally has in mind when they bother to evaluate what they are being told? Or do they challenge that usage with a more grounded and rigorous way of making judgments about web site biases even at the risk of leaving the conventional and hasty users of their sites confused and frustrated and perhaps inclined to look elsewhere for clarification?

As it happens every one of these sites of which I know goes with the first choice, accepting the conventional political labels and their uses. This seems to me understandable but also unfortunate, as they could have been more genuinely helpful by promoting other ways to understand the political scene--with, I think, the problem getting worse rather than better with time, and the sites in question becoming less useful as a result, even by their narrowed standards. Consider, for example, the New York Times, as analyzed in the Columbia Journalism Review. One study discussed in that publication showed that the Times greatly favored the concerns of the right in its selection of stories, even as judged in the flawed, conventional ways--considerably more than did the Washington Post, for example. Yet Media Bias/Fact Check, on the basis of "wording and story selection" (emphasis added), judged the paper to be "moderately favor[ing] the left," warranting its classification as "Left-Center biased." Meanwhile, while Media Bias/Fact Check admits that the op-eds in the paper have failed fact checks, it treats this as not interfering with the "High" credibility it accords the paper. This, of course, ignores the extent to which opinion pieces are subject to editorial approval, and many a reader is likely not to distinguish too carefully between supposed reportage and declared opinion pieces--as well as the way in which, in the words of Amber A'Lee Frost, the Times has treated those opinion pieces as a kind of "clickbait" producing "a vast pool of pseudopolitical content, wide as an ocean, shallow as a puddle" to a degree not only "bloating" the paper, but "positioned very prominently, at the very top of the website."
(By contrast, she notes, the Financial Times "sticks them at the bottom.") One may add to this the extent to which those opinion pieces have both been increasingly right-wing, and increasingly suspect in their never close to perfect respect for fact (why bother with those when made-up conversations with nonexistent cab drivers will do?), exemplified by the Times' controversial hiring of climate denialist Bret Stephens.

The result is that abiding by the old standards gives us less and less potential for clarity on what really matters here--and it seems to me that those really serious about judging sources are going to need to do a little more work than just quickly checking the handier evaluations. (Like reading a whole article every now and then in a periodical--if not the CJR, then something at least.)

Has the Red State-Blue State Narrative Changed Over the Years?

The simple-minded, obfuscating, indeed obscurantist discourse treating American politics as a contest between the irreconcilable national cultures of "Red" and "Blue" states has just gone on and on, but rereading Thomas Frank's old "American Psyche" essay I find myself reminded that the narrative has not been perfectly constant.

One aspect of it that has changed is the way that talk of the division once played into the hype about a "New," information age economy that was still very strong about the turn of the century. Those more favorably disposed toward the "Blue" states of the Northeast, Great Lakes and West Coast saw them as at the forefront of that new economy, with their cultures playing an important part in that. Their disproportionate share of the more prestigious institutions of higher learning, the more open and cosmopolitan and accepting culture of their cities and their workplaces, were supposed to be key to attracting the superlative "knowledge" workers and technological and entrepreneurial talent that made such an economy possible--all as they criticized the Red states as looking backward nostalgically to an idealized industrial age past and taking comfort in bigotry rather than accommodating themselves to the rules of the hard but rewarding new game.

The narrative was, of course, nonsense. While there certainly are high-technology industries, the broader vision of a profoundly transformative information economy, however interesting it may have been as a theory, was, in the form in which most came to know and discuss it, ultimately cover for neoliberals deflecting attention from deindustrialization and the other problems the country had so that they could press ahead with their policies--as we are reminded when we consider just how much we still live in that insanely unsustainable fossil fuel-guzzling world dominated by "the brute force of things" that figures like George Gilder talked as if we had transcended thirty years ago, and whose rules--those of a still very physical, industrial age world--we ignore at our cost.

It is a sign of just how much less salable the nonsense became that we now hear so much less of it than we did before.

However, if that has changed, much else in the narrative abides with the commentariat--and not to our benefit.

The "Moral" Teaching That "Those Who Have None of the Power Have All of the Responsibility."

It is, of course, an ancient moral truism that with power comes responsibility--and that, as Ben Parker had it, with great power comes great responsibility.

However, many of those who would pass themselves off as moralists, in practice, tend to apply the extreme opposite principle in making their judgments. In their view it is the powerless who are at fault for everything in their own lives and with society as well.

It is a profoundly illogical position. But it is very convenient for a sniveling conformist coward ever-groveling before power--and being such a coward is pretty much a job requirement for the makers of respectable opinion.

Tuesday, November 5, 2024

The Failure of One Movie, or of a Generation of Filmmakers? David Walsh Discusses Megalopolis

Seeing that David Walsh had reviewed Megalopolis I wondered whether his judgment would challenge the generally negative judgment of the critics. However, the very subtitle of his review makes clear that where the movie's quality was concerned he did not, declaring it a "weak, terribly confused fable about modern-day America." For his part Walsh makes clear that he does not think the movie's being a "fable" is the problem--such in his view capable of "be[ing] revealing and illuminating, bringing out truths in generalized, clarifying form." The film's fault is not that it is a fable but that the fable is "crude and poorly done" in virtually every respect, "technical" ones included, Walsh specifically citing "script," "staging," "acting," "dramatic coherence," and "overall look and 'feel'" before coming to the matter of "social insight," which seems to be really the fatal thing here given the subject matter that Coppola elected to take up (the central conflict in the story between an inventor-architect's aspiration to rebuild a troubled "New Rome" as "Megalopolis" using revolutionary new materials, and the machinations of powerful enemies intent on stopping him, who whip up a reactionary mass movement in opposition). Walsh regards Coppola's evident concerns with fascism and dictatorship as "legitimate" but also thinks that in the movie Coppola "confront[s] a complex society’s immensely complex dilemmas" with "lazy, self-indulgent banalities worthy of the 1970s' 'counterculture'" and indeed a social vision readable as comprised wholly of residues of it, namely "an unhealthy combination of bohemian self-indulgence, quasi-mysticism and extreme . . . individualism." 
To Walsh this seems especially evident in the tale's centering on "a persecuted, tortured intellectual 'genius'" far above a populace presented here only "as easily manipulated fodder for right-wing demagogues," with that genius' "retaining his prominence on the world stage and directing its future evolution" being the sole hope of salvation for a world in crisis (which comes off as self-indulgent given how only too obviously and strongly Coppola seems to see himself in the film's "persecuted, tortured intellectual 'genius,'" Adam Driver's inventor-architect Cesar Catalina). Indeed, Walsh proceeds from there to argue that those few critics who have had positive words for the film--it is these, and not the far more numerous detractors, that he concerns himself with--praise exactly those elements he found unsatisfactory about it, reflecting how they, too, are captive to the same unfortunate way of looking at the world.

Considering that I think of how one of Walsh's themes as a critic has long been the way in which artists' outlook, and the work that follows from it, reflect their times--and his view of American film having suffered since the '70s from how deadly the last half century has been for any sort of critical, socially-informed perspective, with all the implications this has had for those artists whose subject is human beings. If a half century ago artists like Coppola had displayed a measure of genuine social criticism and dissent, in the years since they made their peace with the world they so miserably failed to change, and looked to their own enjoyments in it, as the weakest and least satisfying elements in their outlook came to the fore. The result was that even what passed among them for social concern was "noisy, energy-consuming thrashing about" reflecting fears for their expectations of "continu[ing] to function 'freely' (and prosperously) in a decaying and threatening world."

Of course, Walsh has repeatedly given his readers the impression over the past couple of years that, amid all that has been happening in the world, artists were beginning to really look about themselves again and think hard about what they saw. Indeed, Walsh wrote glowing reviews of two films in 2023 by filmmakers whose works he had consistently panned in the past--Yorgos Lanthimos' Poor Things, and Christopher Nolan's Oppenheimer. Meanwhile, if Walsh's annual Oscar coverage these past many decades has generally treated the ceremony as a thing to be endured rather than enjoyed by any really thinking and feeling person, he seemed to see what was very possibly the emergence of a different spirit in the ceremony earlier this year (where it seemed a good sign that the two movies by Lanthimos and Nolan, in his view deservedly, between them took home eleven statues, including Best Picture, Best Director, three of the four acting prizes, and Best Adapted Screenplay, as the makers of the generally less worthy fare competing with them generally ended up with consolation prizes). In Walsh's judgment, however, rather than Megalopolis being one of the "green shoots" portending a recovery in American cinema, the film as he describes it is instead a monument to the decadence of the past years he has so often described, in which what was least satisfying in Coppola's work even at its Godfather/Apocalypse Now best (the "murkiest and least coherent, and most self-aggrandizing, elements") is pretty much all the director has to offer now. Indeed, it can seem to say something that, where Walsh so often closes a review of a really unsatisfactory film or ceremony with an evocation of American filmmaking's healthier situation in the past, and the hints of movement toward something better today, his review of Megalopolis closes with its damning judgment of this movie, Walsh offering nothing beyond that at review's end.

David Walsh's Review of Francis Ford Coppola's Megalopolis: Some Reflections

In recent years film and culture critic David Walsh and his colleagues have attended to fewer and fewer major Hollywood releases, frequently going for months without reviewing a single such movie--while their publishing a review of a big "blockbuster" has become especially rare. This year has been no exception. There was a more than five month gap between their review of Alex Garland's Civil War back in April, and their belated publication of their only review of a major Hollywood film of the summer, Lee Isaac Chung's Twisters, at the start of October.

This may seem just as well given that there has been a very great deal else for them to write about in this era of "polycrisis," during which the arts have reflected the troubles in the larger world--with the institutions of the art world facing existential crisis (prominent museums, symphonies, schools withering and dying for lack of funding, and even megabuck film studios floundering) amid post-pandemic economic stress, government austerity and culture war; with artists and members of associated occupational groups finding it ever harder to make a living and being driven to strike action the same way so many other workers are (most recently, in the video game industry); with the horrific events of our times driving artists to take public stands, and those whom conventional wisdom flatters by calling "leaders" once more showing their colossal hypocrisy in the battles over free speech that rage in their wake; among much, much else. Indeed, if their review page covered previous Mad Max and Planet of the Apes and Deadpool films, their sequels could hardly seem worth the trouble amid all that, making their taking a pass on writing about them quite natural--the more so in that, from a critical standpoint such as their own, there is often not very much to say about them. Indeed, it seems telling of their feeling about the poverty of American filmmaking today that Walsh and his colleagues recently had time for a series of articles devoted to the movies of 1974.

Still, North American audiences do every now and then get a really big-budget, highly-publicized wide release made by people who are at least aspiring to present them with something more than another Big Dumb Blockbuster, at the very least a Big But Not So Dumb Blockbuster (whether successfully or unsuccessfully). Dealing with these, Walsh and company usually do rise to the occasion--with Josh Varlin's appraisal of both parts of Denis Villeneuve's Dune outstanding, and Jacob Crosse and Patrick Martin's coverage of Civil War one of the rather small portion of the outpouring of reviews of that film that seemed to me truly worthwhile.

Francis Ford Coppola's Neo-Roman science fiction epic Megalopolis is another such film, and after its release last month Walsh undertook its review. Alas, his assessment was not positive--but certainly more interesting than most of what I have seen of the outpouring of negative comment, not least for what the film seems to say about a whole epoch in the history of filmmaking.

Finding Interest in the Quotidian

It has been the longstanding view in modern times that really serious artistic work is "realistic," with work that conspicuously departs from reality justly regarded as marginal--better-suited to, for example, children than adults, and to people who are seen as less than totally adult in some sense (the "nerd" who likes cartoons and video games certainly regarded as less fully adult than their peers, and as the inverse of the "cool" kid who appears more autonomous and sophisticated and grown-up). There have always been exceptions, of course, but the point is that they have been exceptions, this very much the rule.

Much of this may seem a matter of the general silliness that we get when people of conventional mind try to make distinctions about what is proper and what is not--which produces such absurdities as the idea that Fantasy Football is a pastime worthy of adults, but not playing football on a video game console. But it may be that there are more substantial factors inclining the young more toward the fantastic, the not so young toward the realistic--and especially the everyday. The young lead very limited lives, and the big wide world of which they have seen so little--and the worlds beyond that--may hold an attraction for them that it does not hold for a more experienced, more world-weary adult who has already seen a bit of the world, often on terms that have not been particularly pleasant, and been disillusioned by it, so that such things as long journeys to faraway places have not the same romance for them.

That sounds negative, and I have no intention of pretending it is not, but at the same time I do not think all the factors are negative ones. Alongside what they lose there is what is probably more likely to come with age than in any other way: the ability to take an interest in the little things, and the everyday things--mastery of the handling of whose details is, after all, an area where realism has been stronger than those forms of fiction which incline to flights of fancy.

The Waning of Nostalgia for the Rat Pack

I remember that in the late '90s there was a resurgence of interest in the old "Rat Pack" and its members and their works. Thus did we see, for example, a homage of sorts to the group in Doug Liman's Swingers, the Steven Soderbergh remake of Ocean's Eleven that launched a whole cinematic franchise, and such oddities as Larry Bishop's Mad Dog Time, as Ray Liotta played the actual Frank Sinatra in HBO's film The Rat Pack. (Meanwhile remembrances of the group cropped up in a good many smaller ways--as with Brent Spiner's rendition of Dean Martin's "Sway," as seen in the film Out to Sea--itself a piece of nostalgia in its reunion of Jack Lemmon and Walter Matthau.)

The boom in Rat Pack nostalgia waned, but something of it seemed to come back a decade or so later, helped perhaps by the obsession of a culturally influential stratum with the aesthetic of the '50s as Mad Men became a hit.

However, a decade after that there was no repeat. The only really significant usage of the Rat Pack's work in pop culture in the late '10s was in Todd Phillips' Joker--a film which, significantly, was itself a throwback (to the work of Martin Scorsese in the '70s and early '80s), and in which the particular theme of "Send in the Clowns" was highly relevant (and probably the reason the film also used Sinatra's rendition of "That's Life" in the end credits).

Some might see cultural politics in that, identifying nostalgia for the Rat Pack with a kind of machismo of which the mainstream has been less and less approving, with the #MeToo era a low point for such tolerance. However, one can also see it as a matter of how in popular memory even as grand a legacy as that of Frank Sinatra and his cohorts fades--perhaps the more quickly with pop culture getting ever more fragmented, and popular memory getting shorter all the time.

The Copyright Nazi Sheds Crocodile Tears for the Struggling Artist

It is rare that anyone in this society expresses sympathy for the struggling artist. Quite frankly, with very few exceptions, no one cares about struggling artists but the struggling artists themselves--not even their more successful brethren. (As Balzac put it when writing of the publishing world in Lost Illusions, "the most brutal bookseller in the trade is not so insolent, so hard-hearted to a newcomer as" they, for where a "bookseller sees a possible loss of money" in a newcomer's manuscript, a successful author sees--"dreads"--in the newcomer "a possible rival," with the result that "the first shows you the door, the second crushes the life out of you.")

The result is that such expressions of sympathy from people who were not struggling artists have tended to get my attention in the past--until time and again the reference to the troubles of artists proved to be just a hook for another rant about the glories of copyright and an attack on anyone not taking a maximalist view of such rights as scum. That said, I will not get into the rights and wrongs of copyright here--but the plain and simple fact of the matter is that sterner enforcement of copyright laws is just not going to do much for the newcomer. Such laws defend the interests of those who possess intellectual property. They do nothing for those who have yet to produce any of commercial value. The standard copyright supporter's position is that a strong copyright regime incentivizes the creation of such property--but those struggling artists need a lot more than that if they are to do so successfully, and those professing concern for them show no interest in that whatsoever.

After all, if we grant what copyright's supporters say, and that a more stringent copyright regime does leave, for example, publishers with fatter profits, what are they likely to do with them? Give newcomers more chances? Only those who have no understanding of publishing, business or the neoliberal age can imagine that they would prefer this to a course of new mergers and acquisitions, or financial engineering, because publishers exist to make money, not produce books (and not even necessarily make money by producing books if they can get more, faster, with greater certainty in some other way)--the monuments that young authors rear with their life's blood, as Balzac put it, to them "simply a good or a bad speculation," and in the view of publishers, generally a bad one, whatever the merits of the work. Alas, a refusal to acknowledge such facts is a requirement in those given platforms from which one can reach an appreciable audience in our time, helping make the reference to the interests of struggling writers the cynical thing that it is.

What Irony is Really About: Superiority Without Responsibility

The word "irony" is much misused--so much so that, reflecting the situation, the writers of Teen Titans Go! actually had one episode in which such misuse by the other members of the team had Robin devoting much of one of their adventures to explaining the concept, to the point of giving them an eighth grade English class explanation of the differences between verbal, dramatic and situational irony (as they went through that adventure, of course).

Sound as far as it went, that lesson was also consistent with the fact that even those who use the word "irony" with impeccable correctness from the standpoint of denotation and syntax are not often sensitive to what it means to look at others and their troubles "ironically"--namely, the sense of superiority without responsibility involved, as in "I can see that this person is heading for a fall, but I can just sit back and enjoy that."

It is the outlook of the self-satisfied aristocrat without respect, sympathy or empathy for lesser beings, whose destruction he takes as entertainment--an outlook which, the history of art being what it is, is pervasive across our inheritance of higher culture, widespread today, and altogether irresistible to a middlebrow mind. This is why we see so much of it about, all as it is a rare occurrence that anyone points out that this might not be an entirely healthy state of affairs.

"But He's So Smart!"

It does not seem uncommon for a certain sort of person to defend a position on the basis of its endorsement by some public figure alleged to have a higher than normal level of intelligence.

"But he's so smart!" they will say when anyone challenges them on this.

It is a case of that very basic logical fallacy, the "appeal to authority"--one that is the more blatant because alleged high general intelligence is the source of the authority on what may be a very specialized topic, in addressing which high intelligence may simply not mean much unless it has been trained and informed and, of course, put to use on the problem in a serious way, which is far from always being the case. The intelligent may on average be better-equipped to form a rigorously thought-out opinion, but they have their areas of special concern, and there are so many issues in the world, and only so many hours in the day, while even the greatest intelligence is uneven in its performance across the full gamut of mental tasks and subject matter, and even among the intelligent few are capable of wholly setting aside prejudice and self-interest. And they often offer opinions in line with all these limitations--even the genuinely accomplished offering only banalities and worse when they speak about something outside the area of their special expertise (and, alas, not realizing it themselves).

The illogic of the approach is underlined by its selectivity--and its vulnerability. Those who appeal to the authority of intelligence when, for example, citing tech billionaires' statements in support of unrestricted capitalism cannot on this basis respond when someone cites that supreme icon of the Cult of Intelligence, Albert Einstein, in defense of the opposite--remembering that he published "Why Socialism?" in the very first issue of the Monthly Review. Indeed, taking that into account it becomes very easy to argue that tech billionaires (even if one accepts the far from unimpeachable claims for their "superior" intelligence) support capitalism not because they have reasoned out that this is best for society, but because they are billionaire capitalists who, perhaps knowing and caring nothing about anything else, are selfishly defending their positions of extreme privilege.

That Cheerful "Think Again!"

I have always found the rhetorical device of stating some misapprehension supposedly existing among one's audience, and then saying or writing "Think again!" at the end, exceedingly obnoxious.

There is not only the often unwarranted assumption that the audience believes the thing the writer presumes to show to be false--the writer, after all, knowing so much more than the idiots in the audience do (an assumption especially offensive when the misapprehension in question is something only the deeply ignorant or profoundly stupid are likely to believe)--but the cheery self-satisfaction in slapping them in the face with their presumed misapprehension with that "Think again!"

"Think that eating a tub of fried lard every day is good for your health? Think again!"

Think that's good writing?

Think again!

The Obnoxiousness of Prefacing a Statement with the Word "Look"

A great many people seem to find a person's starting their statements with the word "Look" off-putting.

A significant part of it is likely the strong association of the tendency with a tone of exasperation and condescension, which can seem at the least graceless and very easily insulting--which is inseparable from the fact that the exasperated, graceless, insulting person who has resorted to this usage is presuming to tell another person what to do. To command them. If someone can't stand being told what to do even by a person indisputably authorized to do so, how are they going to feel when someone with no grounds for bossing them about that way starts speaking to them in that manner? Especially when, as is very likely the case, that someone is dictating to them not simply what they are to do but how they are to see the world, forcing their self-serving subjectivity upon them?

It is no accident that in an interview that is other than the usual flattering promotion by a courtier, a cornered rat of a politician--lacking much in the way of self-awareness, alertness to the subtleties of the English language, or respect for the intelligence of an audience in many cases likely to be far smarter than they--so often begins an answer to their interlocutor with that word "Look," a fact which does nothing to make the unpleasant usage seem any more genial to those who have to hear it.

How Much Research Do Writers Really Do?

Considering the matter of writers' writing from other fiction--a tendency so extreme that they do so even when they are writing about the one professional activity they know personally, writing--it seems that a good explanation is, besides the demand that they produce work according to formula to get their paychecks, their essentially impressionable natures.

Of course, considering that, one comes to the question of how much research writers really do.

This has always been much on my mind because the writers I gravitated toward tended to write "information-heavy" narratives, much of whose interest lay in their showing us something of the world, which they often treated in "big picture" fashion--doing which well often entailed a very heavy burden of research. Even as I found myself reading fewer techno-thrillers and more science fiction, and then less science fiction and more of what we call "classics," this stayed with me--such that even as my personal reading consisted more than before of Capital-L Literature, it was people like Balzac and Zola and Sinclair and Dreiser who got my time and my respect.

These days I suspect that such figures, who are pretty unfashionable now (today literary critics are apt to ignore Balzac, mock Zola's science-mindedness, deride Sinclair for "message," treat Dreiser as "a dead dog"), are also unrepresentative in this way, most writers doing very little research at all. There are often practical reasons for that, especially for those who, finding it hard to make enough money to live on, are driven to work at high speed, with all that means for the opportunity to properly research their subject matter. However, that impressionability seems to have given them that other way of thinking about their subject matter even when they were not so pressed--with the crummy result that fills up our bestseller lists with dreck, as I find myself far more interested in Balzac and Zola and Sinclair and Dreiser than in the works our retailers are ceaselessly trying to foist upon us today.

Why Do Writers So Often Write From Other Fiction Instead of Life?

Where the answer to the question that is this post's title is concerned my thought had long been that it was a matter of plain and simple laziness--hacks taking the path of least resistance, and relentlessly encouraged in this by the Dauriats of the industry, whose determination to pay for what is most easily sold within the market they have created through the exercise of their power to the crassest ends leaves them open to only a painfully limited range of material.

Yet considering the view that artists are a "sensitive" lot working in unconscious and impressionistic rather than conscious and analytical ways suggests that there is something more to it. They go by their impressions--and the reality is that, in a mediated world, their impressions will be mediated ones, especially in regard to anything outside what is usually rather a narrow range of personal experience. Depicting, for example, police investigations, they are apt to have a head full of episodes of procedurals, and so this unavoidably influences what they are likely to write when they pen an episode of a procedural themselves--easily producing results like those atrocious episodes of Castle which I watched unsure of whether I was supposed to take them as parody or not.
