Popular expectations regarding technological progress in areas from artificial intelligence to fusion, from cloning to space, have commonly been overblown. The response on the part of many is simply to regard futurists' predictions with irony, but this has never seemed a useful position to me. The reality is that we can hardly avoid making guesses about what's to come, given that we are so often obliged to think in the long term, and to plan, especially where science and technology are concerned. Guessing poorly can have serious implications for public policy (some of which I recently touched on in my Space Review article, "Space War and Futurehype Revisited"). Consequently, instead of irresponsibly dismissing the whole enterprise of prediction, the appropriate response is to try to get better at making predictions, and at responding to them. (Even if the best we can hope for is to get things somewhat closer to right somewhat more often, or simply to better position ourselves to cope with the inevitable surprises, this seems to me well worth our while.)
That said, the number of pitfalls in the way of those studying such issues is staggering. Today it is almost impossible to point to any discussion of public issues that is not carried on in a fog of misinformation and disinformation spread by vested interests and ideologues. The situation is even worse when expert knowledge is crucial (as in virtually any discussion into which science enters), and the uncertainties are large (as when one makes guesses about things that haven't happened yet, exactly the issue here), because of how much more difficult it becomes for even informed observers to make their own judgments. (Think, for instance, of how Creationists succeeded in manufacturing a "debate" over "intelligent design.")
However, this falls far short of exhausting the list, and I would argue that one big stumbling block has been a deeply flawed "folk perception" of science and technology shaped, in large part, by media produced for (and often by) people without expertise in this area. Consider, for instance, what we get in so much pop science journalism. News items about developments in science and technology, just like news items about anything else, are typically intended to be brief, punchy and accessible (read: attention-grabbing and entertaining). Breathless stories about exciting new possibilities fit these criteria far better than critical, nuanced pieces attentive to the obstacles in the way of realizing those possibilities. (Indeed, many of them make it sound as if the possibility is already a reality, as does the questionable title of this otherwise useful piece: "Quantum Keys Let Submarines Talk Securely.")
Such stories skew perceptions, making it seem as if a great many ideas are much more developed than they really are, and the press's lack of memory worsens matters. After readers and viewers are told about the thrilling innovation that just might change all our lives (with the "might" sometimes in the small print), there is generally no proper follow-up; the story just fades away even as the impression it created remains. The failures lapse into obscurity, while the successes are likely to get so loudly trumpeted that it can seem as if they are all that exist.
These attitudes may be reinforced by the endless repetition of the claim that we live in an era of unprecedentedly rapid technological progress, a claim which makes historical precedents seem irrelevant and implies the imminent bursting of all the old boundaries. They are reinforced, too, by a tendency to identify "technology" with "information technology" (a tendency which goes as far as Google's news aggregator compiling, under the heading of technology, numerous items that are not about technology as such, but rather the financial fortunes of IT firms). This has the effect of making the state of the art in consumer electronics the yardstick of technological progress, a perception which can be very misleading given that other areas (like food and energy production, medicine and transport) have seen much slower change. Raymond Kurzweil's "Law of Accelerating Returns," and the technological Singularity with which it is associated, are a particularly extreme version of this kind of thinking, but this futurehyped, IT-centric outlook is indisputably mainstream, so much so that the correctives offered by observers like Robert J. Gordon, Jonathan Huebner, Bob Seidensticker and Michael Lind have had little traction with Apple brand-worshipping consumers who believe that their new cell phone is the telos of human history.
Still, it has to be admitted that, for all its flaws, pop science journalism is far less influential than fiction, and especially television and movies, which appear to play far and away the biggest role in shaping the popular image of science. Last year a poll taken in Australia found that TV and movies were the primary source of information about science for three-quarters of those surveyed, a pattern likely typical of other developed countries. One likely consequence of this is our habituation to the thought of technologies that are only imaginary, with "hard," technologically oriented, extrapolative science fiction set in the near future likely having the strongest impact on our thinking as "rumors of the future" (as suggested by scholar Charles E. Gannon's study Rumors of War and Infernal Machines).
Additionally, science fiction routinely dramatizes the processes of science and engineering, and in doing so propagates a particular view of them. The realities of science and engineering as specialized, collaborative, often slow-moving activities, whose participants often cope with multiple knotty problems at once, some of which may be theoretical in nature; as dependent on massive amounts of organization, equipment and money controlled by people who are not scientists (and indeed are often scientific illiterates), and so susceptible to the vagaries of economics, business, politics and even personal whim; and as, even under the best circumstances, subject to fits and starts, to dead ends and flashes in the pan, with projects to develop new technologies often concluding in nearly useless prototypes, on the drawing board, or even at the concept stage – all of this is given short shrift.
As an example of what we are much more likely to see, consider the Syfy Channel show Eureka, which centers on an imaginary town in the Pacific Northwest inhabited by a large portion of the United States' scientific elite. Eureka makes some concessions to reality in its depiction of the Global Dynamics corporation running the town, and the involvement of Pentagon functionaries in the affairs of its researchers. However, this is superficial stuff, the episodes mostly playing like twenty-first-century "Edisonades," with geniuses equally adept in fields as diverse as astrophysics and molecular biology flitting from one project to the next, many of them undertaken singlehandedly in their garages; casually tinkering with toys as exotic as androids and faster-than-light drives; and, during the inevitable crisis, inventing on-the-spot solutions to staggeringly complex problems that somehow always work.
Taken together, all this leaves most people imagining science as nerd-magic, and picturing R&D as a matter of omnicompetent nerd-magicians pulling solutions out of thin air – so that if a problem does need solving, all we have to do is get the nerds on it and, voila, it's solved. (And if the nerd-magicians don't work quickly enough, we just have to hector them until they do, the way Sheriff Jack Carter, like many another rugged sci-fi hero, hectors the science types for a fix when there's trouble.) It also leaves audiences believing that, to use William Gibson's phrasing, "The future is already here – it's just not very evenly distributed," and in the most literal sense: every gadget one has ever heard of must be out there, somewhere.
In short, it has them believing in the "Eureka Paradigm" of scientific and technological R&D.
Of course, there are lots of reasons why fiction still depicts science and technology in this way, long after such depictions have lost any credibility they may once have had. A fairly good one is that this simplistic approach is a better fit with dramatic requirements, easier to turn into a compact, intelligible, accessible story in which we get to focus on a few characters as interesting things happen – and to be fair, the point is to entertain, not inform. Yet it is an appalling basis for any serious consideration of how research actually proceeds, and we take such images seriously at our cost.