Thursday, February 27, 2025

Putting the Computing Revolution into Economic Perspective

For decades it has been common for economists to argue that the contributions of computing to productivity, value creation and growth must be far vaster than the figures suggest, to the point of rendering figures on manufacturing value added and even Gross Domestic Product meaningless as a basis for analysis of economic trends because of how grossly they understate the progress they are sure we must be making--the more so, I suppose, because they are mesmerized (or want to be mesmerized) by images of wonders ceaselessly pouring forth from an ever-innovating Silicon Valley.

Alas, that image is at odds with the reality that the pace of change in information technology is profoundly unrepresentative of the rest of the economy, as we see looking at the way our houses are built, our food is produced, our clothes are made--while even in areas where one can make a case for a revolution really being underway, like energy (where the rising productivity and plummeting cost of renewables have created enormous potential), Big Oil and the rest have, with the help of their allies across the business world, government and the media, fought back ferociously and very successfully against that "disruption" that "business intellectuals" so love to talk about.

At the same time, even in tech the progress is a far cry from the hype. Aside from the fact that really big developments have been few in recent years (just remember--the iPhone that is the favorite example of this innovation will be old enough to vote this year), there has been a very great deal to offset and limit the value delivered by these machines, whether the issue is their contribution to the productivity of business or the goods they bring to the consumer, sufficiently so as to dispel any illusion of vast improvements in products and in life simply going unreported in the numbers. Below are ten of the more obvious:

1. Computing devices remain incapable of autonomously interacting with the physical world (comprehending it, navigating it, manipulating it with human dexterity, you know, like with a car that actually drives itself), with all that means when we are "living in a material world"--or even of coping with situations of a less physical kind requiring great versatility (as you see every time you confront the vileness that companies serve up as customer service). The result is to severely limit the jobs they can do, as the pandemic reminded even the most obtuse--while the great wave of automation of physical and even mental work the pandemic was supposed to bring on has, as of 2025, yet to materialize, looking now like just another piece of techno-hype flogged by the hucksters in Silicon Valley and their courtiers in the media.

2. In most hands, at least, a computer all by itself is little more than a "brick." That is to say that computers' usefulness depends on (besides electricity, of course) computer software, Internet connections and, as a practical matter, outlays for a lot of the things you will hope to do online--all of which, of course, do register in the value added figures. (If you want online grocery delivery, for example, you are expected to pay a fee for the delivery, without which the computer renders no such service.) For comparison purposes, consider an analog-era television set--which, so long as it was plugged into the wall, could pick up a signal without the extras. Indeed, it seems plausible that the list of things one can do without paying extra charges is shrinking all the time in a period of ever more pervasive paywalls, decreasing site functionality, consumer exploitation-driven "platform decay," and, of course, the general inflationary wave from which all but the super-rich are now suffering.

3. Many of the services that computing devices supply their users are imperfect substitutes for non-digital alternatives--with the disutility involved rarely remarked upon by the enthusiasts, and often not registering in the statistics. (Ordering your groceries online, you are likely to find your selection more limited and the method of ordering more cumbersome and less flexible, while many complain of getting lower-quality items than they would were they to shop in person.)

4. Where those things that computers can do are concerned, there has been a tendency toward diminishing returns that makes the "on paper" improvements in technological performance remote from the actual value delivered. Yes, your computer today is hundreds of times faster than those we had in the 1990s. But has it made you hundreds of times more productive as a user of e-mail? I doubt it. And the same could be said of a very great deal else, with all that means for the increased value delivered by the devices, and the productivity they yield--all as the list of really fundamental tasks they enable us to perform has not grown anywhere near so rapidly as the speeds and sizes in the performance specifications. (Indeed, many computer activities are less efficient than before, most obviously because of an increasing burden of computer security, more on which later in this list.)

5. Even when perfectly functional, computers (and the software and Internet connections that go with them) remain far from user-friendly, and highly unreliable, a fact exemplified by a recent Danish study showing that some 20 percent of users' time is absorbed in dealing with computer problems, with, again, obvious implications for any growth in value added, and for the productivity gains yielded.

6. Computers' performance gains have been substantially absorbed by things that their users would regard as disutilities--specifically software devoted to the surveillance of the user, and the subjection of the user to advertising and other forms of marketing (the computer user does not want ads more "relevant" to their inferred needs; they want not to be spied on or marketed to at all)--and, where not actual disutilities in the aforementioned manner, at best irrelevancies (as with the many features of which almost all users know and care nothing, but which absorb copious amounts of memory and at times interfere with the functioning of features they do care about, as happened when an update produced problems with BitLocker for many last year).

7. The benefits of computer use go along with the high cost of computer crime--and of the efforts required to combat such crime, reflected in the existence of a cybersecurity industry worth over $270 billion and, one might add, the amounts of users' computing power absorbed by programs such as antivirus software, and the periodic software updates needed to patch holes in the systems sold to them. This extends to extreme inconvenience for computer users, who as a matter of course are expected to display increasing sophistication in matters of "cybersecurity," whether coping with multiple strong passwords, understanding and using multi-factor authentication and passkeys, operating highly imperfect antivirus programs, assessing the risks in the unsolicited e-mail, QR codes and other material with which they are barraged, and so on. All of this, too, has significant costs from the standpoint of not just money and time (which happens to be money, as the adage has it), but well-being, with "cybersecurity fatigue" already a known problem a decade ago, inducing many to make even less use of the Internet than they might otherwise do (avoiding this or that site just to not have to open another account with information the company will inevitably sell or leak, just to not have to deal with yet another wretched password).

8. The advent of the smartphone, which has admittedly been an important innovation, has entailed important trade-offs. Portability has come at the price of usability, with small touch screens no substitute for full-sized keyboards, while the mobility of the devices (with all the risk of damage or loss associated with carrying a little piece of plastic everywhere), and the usage of wireless communications in public places that goes with them, have exacerbated the security problem.

9. Computer hardware and software of all kinds tend to have short product cycles, translating to rapid depreciation--with this, again, detracting from the value delivered, as users are required to expensively replace the worn-out or simply outdated item. This depreciation has been acknowledged as contributing to the difference between "Gross Domestic Product" and the less well-known but in many ways more illuminating "Net Domestic Product" (the simple accounting relationship is sketched just after this list), while it is also the case that replacement entails costs beyond simply buying and turning on the new equipment. One has to integrate that piece of equipment into an existing network (indeed, buy a personal computer and you are likely to find this part of the default set-up process), transfer old files into it, and on top of that spend time learning to cope with what is likely to be a different model--and all that, again, all the more often for the rapidity of depreciation.

10. Those enthusing over the benefits of computing for the public tend to overlook the still-extant "digital divide"--the very large part of the population left out of the benefits of computer use because of poverty or old age (in societies which are both increasingly poor and increasingly elderly), a situation highlighted by the David Cameron government's cynical decision to make the "Universal Credit" system with which it replaced much of Britain's social safety net "online only" (rendering access more difficult for exactly that part of the population which needs the system most, as it is least equipped to deal with such a system).
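As a back-of-the-envelope illustration of the depreciation point in item #9 (a minimal sketch using made-up round numbers, not figures from any actual set of national accounts), the relationship between the two measures is simply

\[
\text{NDP} = \text{GDP} - \text{depreciation (consumption of fixed capital)}
\]

so that an economy producing $20 trillion of output while $3 trillion of its capital stock wears out or becomes obsolete over the year is left with a net product of only $17 trillion--and the faster hardware and software go obsolete, the wider that gap becomes.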

Would I say that these hard facts of the computer age mean computers have not, on the whole, brought net benefits? Even very significant net benefits? I would not. However, when we remember the limits as well as the wonders of the digital revolution, the costs of computing as well as the gains--and the fact that, in line with the calculation of economic growth using methods that make the proverbial divorced cancer patient an economic hero, many of the costs, as with fixing the problems of unwieldy and unreliable machines or the cost of computer security, are registered as gains--the view that computers are somehow yielding radical gains far beyond what is discernible in the economic statistics seems much more questionable.

Of course, in saying that, I think it fair to admit that one could imagine things having been otherwise in a different situation--one where, for example, governments took steps to protect computer users' privacy rather than leaving information traffickers free to monetize their personal information, or businesses bravely pursued such potentially productivity-enhancing measures as telecommuting rather than fighting against them every step of the way (while the free rein of anticompetitive practice in an age in which antitrust is a dead letter, the locust-like short-termism that is the great legacy of "shareholder activism," and an intellectual property regime that has gone utterly out of control, were not all doing their parts to stifle any sort of meaningful technical progress in this sector as in so many others). However, intriguing as that possibility may be, and valid and indeed useful as its consideration is, we can only speculate about it--and such speculation is a whole other discussion from this one, an appraisal of the place of "tech" in the world that we have actually got.
