Saturday, April 20, 2024

Are Bloggers Wasting Their Time Tinkering with Their Blogs in the Hopes of Getting More Readers?

I suspect at this stage of things that most of those who started blogging have been less than thrilled with their results--certainly judging by the vast number of bloggers who quit the endeavor, often after just a little while. They were given the impression that what goes online is necessarily seen by vast numbers of people, that they would not be crazy to expect the things they put up to "go viral," that they could end up web celebrities.

Instead they mostly found themselves ignored--and when not ignored, often insulted. Indeed, many may have been disappointed to find that rather than persistence paying off they have gotten less and less result with time--finding the search engines less likely to index their posts than before, seeing less and less evidence of actual humans reading their blog (a comment left, a link posted), and suspecting that what page views they do get are spam, bots, and the rest.

There is no great mystery in this. The reality is that online the ratio of people looking for attention to those ready to pay attention is extremely high. The reality is that the great majority of Internet usage is passive, with any kind of engagement a rarity (very, very few of those who do find anything are likely to share it with others). The reality is that, where the blogger especially is concerned, the Internet has been moving away from full-bodied text toward shorter-form content (like microblogging) and audiovisual experience (the vlogger rather than even the microblogger). And as if all this did not suffice to mean that things were going from bad to worse, the game is entirely rigged at every level in favor of big names, big money, those with legacy media connections, and the other advantages.

Alas, these obvious realities are something few are ready to admit--instead what prevails is a stupid meritocratic-aspirationalist view that treats every outcome as the result of a tough but fair test of individual talent and effort. And even those who can see through that stupidity often do not simply shrug their shoulders, but strive instead for a better outcome. One way is by trying to get search engines to treat them more favorably (indexing more of their items, ranking them more highly in search results, etc.)--the more so as many a would-be advice-giver claims to know what will do the trick.

Should they post more frequently, or less frequently? Should they go for longer posts, or shorter ones? Are there too many links in their posts, or not enough? Is it a mistake to have so many internal links--for instance, posts that provide handy collections of links to past posts? Should they be weeding out old posts that perhaps did not get looked at as much as others? And so on and so forth. They will hear a lot of advice (typically vague yet completely contradictory advice, supported by no evidence whatsoever)--and if acting on it, spend a lot of time on things that have nothing to do with actual blogging.

My experience is that all that hassle will not accomplish much for most of them, for many reasons. My personal suspicion is that the infrequency with which search engines "crawl" your blog means that it will be a long time before you see any positive result, even if you really do achieve one--which seems to me doubtful. I suspect that the search engines are simply overwhelmed by the amount of content out there, and by the efforts to manipulate them--to the extent that they are not catering to sponsors, demands for censorship, etc., which doubtless tie things up in knots even beyond the direct effect of such corruption of search results (while half-baked experiments with artificial intelligence are likely lousing things up further). The result is that bloggers probably ought to spare themselves the trouble--and, if they really think their blogging has any meaning at all, get on with that instead.

Never Underestimate the Suckerdom of Bosses

A while back Cory Doctorow rather nicely summed up what I suspect is (contrary to the past decade's hype kickstarted by the Frey-Osborne study, and the post-OpenAI GPT-3.5 hype too) the real state of progress in applying artificial intelligence to the vast majority of tasks workers actually perform: "we're nowhere near a place where bots can steal your job," but "we're certainly at the point where your boss can be suckered into firing you and replacing you with a bot that fails at doing your job."

For the worker who gets fired, alas, that the bot they were replaced with fails to do their job is cold comfort. They remain fired, after all--and the odds of their getting a phone call asking them to come back (never mind come back and get compensated for what they were put through) seem to me very slim indeed.

Such is the reality of the relationship between power and responsibility, above all in the Market.

The Unusability of the Internet and the Taming of the Cyber-Space Frontier

Back in the 1990s there was a good deal of cyber-utopianism. I think of most of it as having been hype for the web and the tech sector, and for Silicon Valley, as well as broader "market populist" justifications for the neoliberal age by way of their fantasies about the information age--but all the same the idea was there that those who had been shut out, marginalized, denied a chance to speak might have a chance to make themselves heard here.

I suspect that most of those who acted on such premises eventually discovered the cruel lie for themselves--that the Internet was less radical in these ways than advertised, and became less and less so as time went on; that it is a medium whose usership is essentially passive, and carefully gatekept, so that Big Money, legacy media help and well-resourced supporters make all the difference between whether one's message is noticed or not, all to the advantage of the rich and powerful, and the disadvantage of the poor and weak. Now, with the Internet the mess that it is, one may imagine that more than ever before the only way to get a message out through it is to put ever more resources behind it--while those who cannot, whatever they have to say, are seen and heard by virtually no one but themselves.

The End of the Cyber-Frontier

Back in the 1990s the cyber-utopians, reflecting their ideological background, seemed inclined to portray the Internet as a new frontier for the taking. (Indeed, Thomas Friedman, in the midst of lionizing Enron for its business model that he was sure was a perfect symbol of how the twenty-first century would belong to America, exalted its activity as part of a great "cyberspace land grab.")

Today there is no sense of anything like that online, the cyber-frontier long since closed--and, just as with the frontier in American history (rather than the version of it the mythmakers continue to promulgate to this day), the result has been less a mass enfranchisement of the many than the further enrichment of an already rich few. We are reminded of this by how much, regardless of how Jim Cramer rearranges the letters in his acronyms, a handful of giant firms--the same ones for rather a long time now--have turned what (falsely) seemed an unclaimed continent ripe for the taking into their private domains, and ever more fully as such small freeholders as had made a place for themselves find their situation less and less tenable.

Nostalgia as Protest--and its Limitations

I suppose that where nostalgia is concerned we can speak of "pull" and "push." We may be pulled toward some aspect of the past by its apparent intrinsic fascination--but we may also be pushed away from the present by its repulsions, and this is exactly why some frown upon nostalgia as they do.

Preferring the past to the present has historically been the preserve of the conservative and especially the reactionary, who really do want to bring back something of the past. But in a reactionary era the liberal might find comfort in looking backward. So have many liberals, faced with the '80s and all that came after, been nostalgic for the '60s.

It is a more awkward fit in their case. Those who believe in progress, if sincere about their belief, should expect that the best days lie ahead rather than behind them. Thinking otherwise is a reflection of the profound demoralization on their part that has played a far from inconsiderable part in moving the world further and further away from what they said they wanted it to be--while this nostalgia for the past, alas, has been far from the only expression of that sad state.

Of "Optimism"

"Optimism" has always struck me as a slippery word whose usage often betrays a lightness of mind. To be optimistic, after all, always means being optimistic in relation to a particular thing--and unavoidably, being pessimistic about another thing. (I am optimistic about how "our team" will do, which means I am pessimistic about how "their team" will do when we go up against them in the competition, for instance.)

So which do we focus on, the optimism or the pessimism?

As is the case with so much else in life, Authority tends to decide what counts as "optimistic," and of course they identify it with unquestioning faith in the status quo and unswerving conformism, pessimism with the opposite attitude--even though optimism about what exists tends to go with pessimism about much else. (Thus the classical conservative is a believer in the wisdom of traditional social arrangements, and optimistic about adherence to them--but pessimistic about human nature, reason, the prospects for a more just social order. Whether they are optimistic or pessimistic, again, depends on who is doing the talking.)

Confusing matters further, in doing the above those in charge often promote what they like to call "optimism" as a broad mental attitude--and a positive character trait--while treating a lack of such optimism as equally a broad mental attitude, pessimism a negative character trait of which one ought to be ashamed.

As if the way in which optimism and pessimism tend to be bound up, and one or the other emphasized in a shabbily manipulative way, were not enough, it is worth noting another great ambiguity within the term and its usage. There is, on the one hand, the optimism of the person who looks at a problem and says "I see a problem, but I think I can fix it." At the same time there is the person whose response is, instead, "I see a problem but I'm sure it will all work itself out somehow."

The former is the optimism of the problem-solver ready to work for a solution. The latter is the optimism of the person who thinks that they do not have to solve a problem--a response that can be genuine, but often bespeaks complacency, irresponsibility, callousness, in which case optimism is nothing but a lazy or cowardly dodge, with this still more blatant in the case of those who see a problem and then deny that they see any problem at all. These days, this, too, accounts for a very large part of what is said and done under the banner of "optimism."

Naturally we ought to be a lot more careful about how we use the word--and how we respond when others use it, not being intimidated or ashamed when they chastise us for failing in "optimism" according to whatever standard they presume to apply to us for their own convenience.

The Supposed End of Work, and the Drivel Spoken About It

A few years back, when the hype about automation anticipated more than glorified autocompletes, fashionable "thinkers" proclaimed the imminence of radical changes in day-to-day living. Completely misunderstanding the 2013 Frey-Osborne study of automation, they thought we were looking at "the end of work" within a few years.

Thus did we have a little, mostly unserious, talk about how society could adapt to half of the work force being put out of a job--such as what the relation of work to income would be.

Amid it all, of course, were those who assured everyone that work would still be part of people's lives--with these usually seeming to mean alienated labor done in the market for an employer for wages, a presumably eternal arrangement whose psychological and "spiritual" value they insisted was salutary, claims the media generally treated with the utmost respect.

Of course, few had anything to say about the fact that the media was usually quoting elite commentators whose ideas about what "work" is were remote from the lives of 99 percent of the public, whose experience and thoughts were, as usual, of no account. Many of these--for many of whom "work" has never ceased to mean the hell on Earth that we see in Emile Zola's Germinal, or Upton Sinclair's The Jungle, or Takiji Kobayashi's Kanikosen, and many more of whom find in work far, far less but still considerable suffering--would disagree with those "tech" company vice-presidents who think that sipping a cappuccino on the G6 as they fly home from Davos is "hard work." They would disagree about whether work is the uplifting and necessary thing those commentators say we cannot do without, instead hopefully looking ahead to the possibility of a different world.

The Beginning of the End of the Cult of the Good School?

By "cult of the Good School" I refer to the profoundly irrational hierarchy that exists among institutions of higher learning in the minds of the conventional wisdom-abiding.

In speaking of that hierarchy as irrational I am not denying that schools differ with regard to the selectivity of their student intake, and the resources available to them for the fulfillment of their educational mission; or denying that that selectivity and those resources may make a difference in the quality of the education and/or training they impart to their graduates. Nor do I deny that these practical differences--or for that matter, the less rational reliance on institutional "reputation"--may make a difference to employers, with implications for their graduates' outcomes with regard to employment and income.

However, the truth is that very few making distinctions among those schools actually know anything about how the schools in question actually stack up against each other in even the more measurable, practical ways (for instance, the average quality of undergraduate instruction); or actually know anything about exactly what difference the school a student graduated from makes in the job market. Rather they just "know" that one school is "better" than others, with most carrying around an image of various strata in their minds--with the Ivy League "better" than others, and within that stratum Harvard better than Yale, all as private schools generally come in ahead of public schools, etcetera, etcetera, albeit with many exceptions all along the line. (Much as many desire to park their car in Harvard Yard, the science major will not be looked down upon for choosing the Massachusetts Institute of Technology instead--while Berkeley, UCLA, Chapel Hill, among others, in spite of being state institutions, are quite well respected indeed, more so than most private institutions.)

Those who think this way imagine that any young person of intelligence and ambition ought to aim as high as they can in the hierarchy--and if they do not their elders, and conforming peers, think there is something wrong with them. Should they question those elders' and those peers' presumption they will not get an intelligent answer--this being "just what people do."

The insistence on all this ultimately comes down to simple-minded snobbery, especially insofar as the data that even the more rigorous sift tend to be flawed and even corrupt. Meanwhile a good deal of longstanding statistical data, confirmed again and again by more recent studies, calls into question the idea of some great wage premium deriving from going to the elite institutions, or even any inherent advantage in employability. The "power elite" may disproportionately be associated with such colleges, but, even setting aside the fact that the advantages most of them had did not begin or end with the college they attended (which they are, after all, less likely to be attending because they were exceptionally bright than because they were exceptionally privileged), the average graduate is a different matter. Stupid writers for trash publications may claim that one can major in somersaults at Harvard and still get a "good job," but as a recent study reconfirmed, your major is probably going to be a bigger determinant of whether you get a job in your line and what it will pay than the school to which you went.

However, this foolishness is now up against an era of tighter means, diminished prospects, and greater skepticism about the value of a college education generally, and of specific avenues to getting one particularly. Weighing the actual dollars-and-cents return on their investment in a college degree, many are finding that they will very probably be better off commuting to the nearest campus of their state's university system (maybe after racking up as many credits as they can at the still closer campus of their community college) than selling themselves into lifetime debt slavery for "the college experience" of which so many college presidents pompously speak at a Brand Name School. Of course, it is worth acknowledging that the people whose ideas are the mainstream ideas are the ones most remote from the situation of tightened means and diminished prospects, and have absolutely zero respect for the views of the majority. Still, the public's changing attitude seems likely to increasingly expose the crass stupidities of the Cult of the Good School for exactly what they are over time.

Does it Make Sense to Self-Publish a Book in 2024 (if Ever it Did)?

Considering the question that is the title of this post I think it worth starting with a bit of background (covered before in some degree, but worth revisiting). Specifically, circa 2010, while self-publishing was not new, the proliferation of the e-book reader and print-on-demand publishing, significantly by way of Amazon, made part of self-publishing much easier. Writing, editing, copyediting were as costly as ever. But the cost of physically reproducing and distributing the book fell dramatically. For nothing up front anyone could take a manuscript, upload it, and in days have it on sale at retailers all around the world. Meanwhile, if there was still the problem of promotion, there was optimism that the Internet could facilitate that at similarly low cost--and that things generally were looking up. It seemed easier to get people to take chances on 99 cent e-books than on more expensive print works, and e-books were supposed to be moving toward dominance of the market; while it was to be hoped that the self-published would develop ways and means to publicize themselves, and win their way toward acceptance. There were, for instance, notions of book blogs providing a promotional ecosystem for the self-published writer.

Alas, the traditional publishers, faced with the disruption of their business by the e-book, did what companies usually do in the face of such a challenge--fight back against disruption, not by becoming "more competitive" but rather by any means they could, fair or unfair. As is usually the case when established firms fight back from their position of advantage with every means at their disposal, they succeeded. The containment of the e-book (I suspect fewer people have e-readers now than did in 2015) meant the containment of the self-published to a significant degree. Meanwhile the Internet, the utility of which for low-cost promotion was probably always far lower than the cyber-utopians would have had the public believe, became a lot less conducive to such efforts--the web increasingly crowded, and search engines, social media and the rest increasingly "enshittified," fragmenting online life so that only big, well-funded operations had much chance of reaching a wider public. All this worked to the disadvantage of the self-published, including not only book writers but bloggers, who found their audiences disappearing.

Likely significant in that has been the evolution of the Internet in the broadband era, away from the written word, and toward audiovisual content, often taken in over small screens carried by people on the go. The vlog replaced the blog, while people gravitated away from Facebook to TikTok--as reading generally collapsed. Consider, for instance, how the young adult book boom went bust back in 2015--not coincidentally, about the time that a majority of young people had acquired smart phones. Consider, too, how even as the e-book was contained, the sale of the mass market paperback collapsed (which is why, very likely, you have noticed the disappearance of the paperback racks at the supermarkets, convenience stores and other retail outlets you frequent).

The result is that I suspect that, all other things being equal, getting any self-published work to an audience requires doing more for less than was the case a decade ago--maybe much more for much less.

Of course, all things are not equal. And therein lie the questions that you probably should ask when considering the endeavor: "Is there anything that will give this book a chance in the face of all that?" And if the answer to that question is "No," "Is what I have to get out there so important that it is worth my while to do it anyway?"

I admit that it can be very unsatisfying to ask a question and get another question in return. However, it is an honest opinion--which it seems to me is far more than you are likely to find in most places on the web, where all we hear from are the elitists of traditional publishing who think the self-published author has no right to be in the same market as they, and the services looking to bully and frighten the self-published author into handing over their money, so that they profit whether the author has any chance of achieving their own goals or not.

Politics, Art, Hypocrisy

One of the great hypocrisies of art criticism is the sneer at art for being "political."

After all, in treating any subject--or refusing to treat that subject--one makes choices in which there is a politics, even if only at an implicit or unconscious level.

The result is that the condemnation is not of political art but of art the critic does not like--with, given the tendency toward Establishment politics of any critic likely to have a major platform, what they do not like being dissent.

Of course, they are free to not like something for taking a view different from their own. Everybody likes having their view of the world validated, and dislikes having it challenged, and one or the other may be part of their experience of an art work. The problem is that rather than forthrightly owning up to their prejudices they promulgate that double standard--because they lack sufficient self-awareness to recognize it (as they will not if they are people of conventional outlook, as most people are), because they are too cowardly to admit what they do recognize in the work and in themselves (specifically that they are not broad-minded enough to stand the disagreement), or simply because the admission would make more difficult the hatchet job they so delight in doing on a work that displeases them (or which, their feelings apart, may simply be required of them by peers, editors or anyone else).

As all this indicates, looking at a work with which one disagrees, appraising its good qualities and bad, and then discussing that openly and freely, is much, much more difficult than most realize--and frankly beyond many of those who strut about calling themselves professional critics.

Tom Clancy: Another View

Some time ago I came across Keith Pille's analysis of Tom Clancy's novels.

The approach he takes differs greatly from mine. Where in considering the sensibility of the books what was uppermost in my mind were the events of the day, and the larger tendency of America's political life (the ascent of "post-Vietnam" right-wing backlash, the neoliberal and neoconservative ascendancies that benefited so much from them, the "Second Cold War"), Pille's stress is more cultural and generational--examining Clancy's world view as that of a "boomer dad," for whom football and lawn-mowing are hugely important signifiers. Some of it works well, some less so. (Pille is entirely right when he remarks how wrong Clancy was about the effects that foreign attacks on U.S. soil would have on American political life, but has little to say about why Clancy was so wrong--and I think the cultural politics-minded use of the "boomer dad" world-view as a lens is simply less conducive to that.) Still, it is on the whole an interesting and worthwhile read for anyone up for a critical take on the books.

On Irony

Recently writing about "the mood of the 1990s" I found, without any irony, the matter of irony coming up in my research and my writing--and so too the question of why so many were drawn to it then and after.

The plain and simple fact of the matter is that in being ironic toward something or someone people get to feel superior, without that superiority conferring on them any responsibility whatsoever.

They get to stand back and watch as other people go over a cliff and be very pleased with themselves about how much smarter they are in not going over cliffs, and how they have no obligation to care about those going over the cliff.

Naturally the self-absorbed, the vain, the selfish, the mean, and of course the snobbish, find it irresistible--and in the process reveal themselves for what they really are.

I suppose there is irony in that too, albeit one they would vehemently deny even if it did not go over their heads.

"Is This Blog Written by an AI?"

In answer to the question that is the title of this post, "No."

It may well be that others are handing over the job of generating "content" to glorified autocompletes, but I have not done so. What you see here is all mine, and that is how you can expect it to stay, not because I have anything against Artificial Intelligence (quite the contrary), but because the whole reason for this blog's existence is to present my thoughts, my ideas, my posts, not someone or something else's. And so will it remain for the foreseeable future.

"Are You a Robot?"

Recently revisiting the theorizing about the "information age" of Alvin Toffler and others, I have mainly focused on what has not happened. Our economic life did not "dematerialize" through the radically intensive substitution of information for material inputs in the way he described, producing a different form of civilization combining sustainability with abundance. This is exemplified by how, where he thought the whole world would be running on renewables by 2025, well into 2024 we remain a long, long way from any such future by any conventional reckoning.

Rather than an information age civilization we have at best scraps of one--just as, contrary to the stupid technological hype ever-prevailing, we have only scraps of the imaginings of the past about how far technology was to have progressed by the early twenty-first century. Thus is it the case that, even though we are a long way from Blade Runner-style androids, we find ourselves constantly subjected to a dumbed-down Voight-Kampff test as we roam the web, asked "Are You a Robot?" everywhere we go--precisely because a far, far more basic type of "bot" has proven enough to cause a very great deal of confusion within the very modest and ever more decrepit version of an Internet this civilization has managed to build.

The End of Fan Fiction?

The question of whether "fan fiction" is in decline has at this stage of things been debated for decades.

It has long seemed to me quite logical that it should be in decline.

One reason is that fan fiction thrives on fans being able to sink their teeth into an exciting world--and for many, many years now we have seen the media, instead of offering new worlds, serve up the same old thing again and again, to diminishing returns. (Consider how at FanFiction.net Harry Potter is far and away the biggest fandom. When was the last time we had a really new thing become a hit on that scale?)

There has also been the extreme fragmentation of pop culture--which means that fewer and fewer people are likely to be looking at any one thing at the same time, and of those only a very limited percentage is likely to be sufficiently moved to write stories (which, again, works against any Harry Potter-like phenomenon ever occurring), with all that means for the proportions a fandom is likely to achieve.

Let us also acknowledge that the base of fan fiction has always been the young--and that almost two decades into the age of the smart phone the young are probably less likely to read and write recreationally than their predecessors, with all that implies for their writing fan fiction. (It probably matters that the Internet was just taking off when Harry Potter arrived, and that through many of the early years of the fandom, when people did experience the Internet, it was through a desktop--which left more time for books, for example, as compared with the life of someone young enough not to be able to remember not having had a smart phone in their hand.)

Apart from affecting the inclination to write fan fiction, all this also affects the inclination to read it--translating to the lack of an audience for those who do get something out there, which is no encouragement to keep writing and sharing such stories. Indeed, considering all this I find myself thinking of the reality that so many of those who write fan fiction are responding not to books, but to movies and TV shows. Even when employing the written word they have audiovisual media on their minds--and I suspect that for them a fan fiction story is a poor and distant second to what they would really like to produce, fan movies and shows of the kind only a few can make for lack of the financial and technical resources, which may well dampen their enthusiasm. In fact, it may well be that, just as the vlogger probably does their part in drawing attention away from the old-fashioned blog, the fan works that those who can produce them do generate are likely claiming the attention of the audience that would once have whiled away its time in the old fan fiction repositories.
