Wednesday, May 30, 2012

Looking Back: Ray Kurzweil’s 2009

Originally Published In THE FIX, January 1, 2009

Ray Kurzweil’s 1999 classic, The Age of Spiritual Machines, devoted a full chapter to his predictions for the year 2009. Given that this year is already upon us, it only seems appropriate to take a look back at the predictions made for it by a thinker whose influence on science fiction has been so vast.

I will not attempt here to engage with every one of his prognostications, some of which were vague, did not lend themselves to easy verification, or were simply commonplaces that had little to do with the book's area of concentration. Instead I will focus on the relatively unambiguous claims, particularly those concerning specific developments he assumed to have materialized not as clunky, cranky, buggy, unmarketable prototypes forever "in development," or as novelties or luxuries for people with too much money and too little sense, but as products refined enough and affordable enough to proliferate widely. (Readers of that book will remember that there are many of these, several of which he reasserted in his discussion of the "2010 scenario" in his more recent The Singularity is Near, which appeared in 2005.) Additionally, given that changes in information and communications technology are the linchpin of everything else here, it seems only fair to focus on those.

It must be noted that some of his anticipations proved at least partially correct, such as his predictions that

• Local area networks and wireless technology would become widespread.

• The downloading of content from the web would increasingly replace the transfer of physical objects.

• There would be increasing convergence between various media via the web.

• The transition from 2-D to 3-D chip technology would be clearly underway.

• A $1,000 PC ($1,300 in 2008 dollars) would be capable of 1 trillion calculations per second.

However, this was greatly outweighed by the things he clearly got wrong.

• There would be a significant reduction in the size and weight of personal computers below that of the notebooks of 1999--a banality, except that "significant" in this case means that PCs could now be embedded in clothes and jewelry.

• Rotating memories (like hard drives, CD-ROMs and DVDs) would be on their way out, in favor of "purely electronic" memory.

• Speech-recognition software would make text primarily voice-created, so that keyboards would be increasingly vestigial.

• Personal computers would be equipped with facial recognition hardware and software adequate to let them recognize their owners on sight.

• Print-to-speech and speech-to-text readers would become so capable, compact and affordable as to be standard equipment taken everywhere (and indeed, eliminate many of the handicaps of blindness and deafness in the process).

• The blind would make their way using comparable navigational devices, with which they could communicate verbally.

• Paraplegics would walk using computer-controlled orthotic devices.

• Long-distance driving would be automated by "intelligent roads."

• Translation software (letting you speak in English and be heard in Japanese, and vice-versa) would be standard in phones.

• "Smart paper"-quality visual displays would be standard in our electronic devices.

• Eyeglasses-based displays would routinely be used.

• Most reading (as in school) would be done off electronic displays, with e-book reader-style devices becoming the most common way of reading print media.

• Speakers would be replaced by miniaturized, chip-based devices generating 3-D sound.

• Not only would "responsive and precise" Language User Interfaces be commonly available, but "intelligent assistants" with "natural-language understanding, problem solving, and animated personalities [would] routinely assist with finding information, answering questions, and conducting transactions."

• Virtual reality, while still primitive, would be a standard consumer item (replacing, for instance, 1990s-style chat rooms), lacking only "surround" tactile environments.

• Autonomous nano-machines would have been demonstrated (though they would not yet be commercially useful).

Inevitably, his guesses about the direct consequences of technological change were off--for instance, his expectation that computing technology would be so powerful and widely available that teachers would effectively become "human resources" workers concentrating on the motivation, psychological well-being and socialization of their students rather than the imparting of knowledge or intellectual training as such.

Naturally, this carried over to larger, more significant mistakes at the big-picture level. Notably, Kurzweil argued that, at least in the U.S., the "ten years leading up to 2009 [will] have seen continuous economic expansion and prosperity," including a booming stock market and price deflation, with the continuing tech boom (despite occasional "corrections") leading the way. He also argued that the underclass would be stable in size, and politically neutralized by public assistance and broad general affluence.

This prediction proved to be such utter nonsense that I will not bother to tear it apart here, virtually every word I have ever published about economics attesting to how differently things have turned out. The same goes for his guesses about the future of warfare, when he said that computer and communications security would be the main mission of the U.S. Defense Department; and that when there is violence of a more conventional kind,
Humans are generally far removed from the scene of battle. Warfare is dominated by unmanned intelligent airborne devices. Many of the flying weapons are the size of birds, or smaller.
For the moment, never mind the implicit dehumanization of the enemy, or the civilians from whom they cannot easily be extricated, and consider the sense in which he meant this: that U.S. troops would not be there at the scene of battle. Given the attention lavished on aircraft like the Predator, it may seem that he had something there--but one need only look at the massive and costly U.S. commitment of ground troops to Iraq to see that this claim is total nonsense.

Admittedly, Kurzweil's errors will not surprise those who revel in such errors for their own sake. However, we simply cannot get away from the problem of predicting the future, as pressing issues like climate change and resource depletion remind us. For that reason, it is far more useful to think about why he was wrong rather than wallow in the "futility" of ever trying to think ahead, gratifying as some find this to be.

My assessment is that Kurzweil's predictions had two fundamental weaknesses. One is that he succumbed to "futurehype" about a few key technologies (in particular neural net-based pattern recognition), expecting that they would be much more advanced than they really were. The other is his facile view of the "softer" side of his subject, in particular politics, macroeconomics and, I daresay, ecology, given the role of oil supplies in our current troubles. Both are inextricable from the prevailing "wisdom" of the late '90s tech boom that was just about peaking when he was writing.

Of course, I should acknowledge that Richard Dooling, in his recent book Rapture for the Geeks: When AI Outsmarts IQ, was more forgiving in his assessment of some of these same points. Additionally, one can argue that this is only the beginning of 2009, and that a truly fair assessment will have to wait for New Year's Day 2010; or failing this, that he might prove to have been just slightly off where some of his technological predictions are concerned, perhaps pointing to the recent proliferation of e-book readers.

Nonetheless, other predictions clearly remain much further off, like the claims for telephone call translation and virtual reality. And the fact that so much of this change has come along much more slowly than he suggested is itself significant; the acceleration of technological evolution is the very foundation of the technological Singularity to which he has devoted so much ink, and with which he has become so closely identified.

Tuesday, May 22, 2012

Reading Bulldog Drummond

I was recently surprised to find that the Internet Movie Database lists a Bulldog Drummond movie as in development for 2013. There is so little information available about the project that it may just be a "myth of IMDB" (like The Brazilian Job). Even if that is the case, that there should be a myth at all is a reminder of the utter unwillingness of movie-land to completely let go of any franchise.

Until recently reading H.C. "Sapper" McNeile's original novel, Bulldog Drummond (1920), I knew the character only secondhand, from mentions of him in studies of spy fiction I'd read, and from a couple of movies that treated the material rather loosely – 1969's Some Girls Do (the second and last in a series of films that updated the character for the '60s), and 1983's Bullshot Crummond (a parody of the series I found very entertaining despite my limited familiarity with the butt of its jokes, and which seems to me underappreciated).

In that first book, Bulldog is the nickname of Captain Hugh Drummond, a decorated British Army officer of World War I who has since returned to private life as a "sportsman and a gentleman." Quickly getting bored with the leisure his wealth affords him, he decides, initially as something of a lark, to put an ad in the newspaper in which he indicates a hunger for "diversion. Legitimate, if possible; but crime, if of a comparatively humorous description, no objection. Excitement essential."

The ad is soon answered by Phyllis Benton, the tale's damsel in distress and romantic interest, whose father has been drawn into the machinations of one Carl Peterson, and his henchman Henry Lakington. Drummond finds plenty to interest him here, and soon enough, with the help of some of his ex-Army friends, is taking on a plot for a takeover of Britain by left-wing revolutionaries financed by a pack of greedy and vengeful foreigners.

It has often been said that Drummond can be thought of as a proto-James Bond. There is the protagonist's privileged social background, his wartime military experience, his penchant for sport and juvenile behavior (along with his buddies, he seems very much the upper-class hooligan), his knack for getting into trouble and then back out of it again. Like Bond he lives in a comfortable apartment in an upscale London district, tended by loyal personal servants (James Denny, and his wife), and is very particular about his cigarettes (with Virginia tobacco on one side of his case, and Turkish on the other).

There is, too, the feel of his adventures: the international criminal conspiracy headed by an evil genius (Carl Peterson) with an affinity for exotic and deadly pets, weapons and cronies (a cobra, a killer gorilla, knock-out gas), and a gadget-packed lair; his encounters with alluring girls good and bad (besides Phyllis, there is Peterson's daughter Irma); the combination of gentility and murderousness in his confrontations with the arch-villain (fully on display in such scenes as Peterson's visit to Drummond's apartment early in the story); the bad guys who in their arrogance or sadism prefer not to dispose of Drummond immediately, making bouts of unconsciousness and rounds of capture and escape a surprisingly regular feature of his life.1 The course of Drummond's investigations also anticipates Bond's, particularly his habit of baiting the bad guys into revealing bits of information, and his coincidentally eavesdropping on villains at the right time to catch crucial bits of their plans (often, during those periods when they think he's safely locked up). He even picks up an American detective as a helper in Jerome K. Green of the New York Police Department (a proto-Felix Leiter to Drummond's proto-Bond). And when it is all over, Peterson, like Ernst Stavro Blofeld in On Her Majesty's Secret Service (1963), manages to escape from justice at the end of the adventure, ensuring that there will be a next time.

That is not to say that there are not significant differences. Bond is, after all, a professional operative working as part of a massive government organization rather than a restless amateur, and something of a cynic, world-weary, in a way that Drummond shows no sign of being. This attitude carries over to the treatments of politics in their respective adventures. While Fleming's plots were typically right-wing and nationalistic in their conception (so much so that Umberto Eco took them ironically), and the author was not at all shy about expressing his opinions about politics (or anything else), he never comes close to the long-winded, heavy-handed and contempt-filled denunciation of the left, the working class, and intellectuals in general to which Drummond and company devote most of the last tenth of the novel. (One might also add that Bulldog is much less debonair than 007, described by the author as possessed of a "cheerful ugliness," and at any rate is, despite Irma's charms, a one-woman man, marrying Phyllis and ending his first adventure on his honeymoon with her.) Nonetheless, Fleming's debt to Sapper is such a vast one that Julian Symons was substantially right when he described the James Bond novels as "the 'Sapper' pipe-dream converted to the mood of the fifties," and Bond as a "more sophisticated version of Bulldog Drummond" (270) in his classic study of the thriller genre, Mortal Consequences.

Sunday, May 20, 2012

Dirk Pitt Returns?

Recently reviewing the stats for this blog, I was struck by the number of web surfers who found their way to it looking for information about the possibility of Hollywood making more Dirk Pitt movies, whether sequels to Sahara (2005), or a new, "rebooted" series which starts with a clean slate (to judge by the frequency with which search keywords like "dirk pitt movie 2" and "any more dirk pitt movies" turn up in the section titled "traffic sources"). This seems to be mainly because of my earlier post, "Returning to Sahara: The Dirk Pitt Novels on Screen."

That post considered the pitfalls involved in translating Clive Cussler's novel Sahara from the page to the screen, rather than any plans for more Dirk Pitt films, about which I have heard, seen or read absolutely nothing – which is exactly what I would expect. The fact that Sahara was a commercial disappointment ordinarily would not rule out the studios taking another shot. After all, Hollywood has proven nothing short of fanatical about trying to squeeze every penny it can out of any and every established IP, and as fans can attest, the Dirk Pitt novels sure look like obvious source material for big-budget action-adventure films – especially as this franchise remains very much alive, new entries continuing to regularly appear on bestseller lists. Of course, as I pointed out in my previous post, adapting Pitt's adventures to the big screen is trickier than it looks, given how dated some of the plots have become (in light of changes in world politics, and industry sensibilities), but Hollywood has never been averse to seizing on a brand name and a few other select trappings (a character, a gimmick), and dispensing with everything else, and in the process launching a commercially viable film sequence, even as it leaves the purists unsatisfied. (They did it with James Bond, after all. And Jason Bourne. And Star Trek. And others far too numerous to list here.)

However, it is worth remembering that Sahara wasn't just a disappointment, but could be described without any hyperbole whatsoever as one of the biggest flops in movie history – and that this came on top of an earlier production of a Dirk Pitt film (1980's Raise the Titanic) which could be described in exactly the same terms. Indeed, one might say Cussler was lucky to see his series get a second chance, even if it took twenty-five years for this to happen. The way that extraordinary opportunity went has left the material seeming almost cursed – and silly as that may sound, one should not overestimate the rationality of this business. (In fact, when so much money is at stake, the factors to be weighed in the decisions regarding its use are so intangible, and every facet of commerce so steeped in a casino mentality, that one should bet on Hollywood's irrationality every time.) Perhaps even more frightening than any such curse, Cussler himself did not manage to have a good relationship with the producers, the result being a protracted, grinding legal battle in the aftermath of the film's financially ruinous shooting and release – and that is something that will frighten the least superstitious of Suits.

All that being the case, it may be a long time before anyone takes another crack at this series, if ever. However, for what it's worth, you can be sure that I will have something to say about any new developments in that story here.

Review: The Origin of Consciousness in the Breakdown of the Bicameral Mind, by Julian Jaynes

Boston: Houghton Mifflin, 1976, 467 pp.

I first encountered Julian Jaynes' ideas about bicameralism through Jack Dann's classic The Man Who Melted five years ago. I enjoyed the novel greatly, but was skeptical of the theory underlying it. Jaynes' theory, after all, holds that consciousness is not something intrinsic to life or humanity, but a cultural creation, which did not emerge until the second millennium B.C. It seemed to me impossible that humanity could ever have been without consciousness, let alone reached an Iron Age level of civilization without it. How, for instance, could the "bicameral" mind have organized such a feat as the construction of the Great Pyramid of Cheops? How, too, could such a theory possibly be reconciled with the indications that even some higher animals (apes, dolphins) possess self-awareness?

Jaynes' book did not fully satisfy me with its answer to the first question, and did not address the second question at all, but it did make the theory considerably harder to dismiss. In discussing the book, it seems appropriate to start with the author's definition of consciousness--which is not a capacity for sensation and observation, memory, or the ability to learn, think or reason. (Even these last, he argues, are predominantly unconscious processes.) Rather, consciousness is a model of the world around us that we carry in our heads, enabling us to introspect, visualize and make decisions in a self-aware manner. Built upon the capacity of language to create metaphors, it consists of selected ("excerpted") details, arranged according to a sense of space and "spatialized" time, and organized in narrative form, with dissonant information reconciled around an image of the self (as seen by oneself and by others)--and on the whole, a relatively small part of our mental life for all its complexity.

Prior to the arrival of consciousness, Jaynes posits the "bicameral" mind as the dominant thought mode. In bicameral persons, the left and right hemispheres of the brain worked rather more separately than is the norm today, with an important aspect of the separation being the generation of auditory hallucinations, especially in moments when unconsciously acquired habit was inadequate and stress ran high. Their mental life, in fact, is compared by Jaynes to contemporary observations of schizophrenia. A key difference, however, is that this was the norm rather than an exception. Indeed, he suggests society was actually structured hierarchically around such hallucinations, with priest-kings hallucinating the voices of gods (in many cases originating in predecessors whose voices they continued to "hear" after their passing), and subordinates at each level hallucinating the voices of the authority above them at such times, a control reinforced by such hallucinatory aids as centrally located houses of worship and ubiquitous depictions of deities--themselves organized as hierarchically as humans were. (This kept the conflicting voices that torment so many schizophrenics in line, and as Jaynes points out, this kind of sorting is not unknown among those suffering from this condition today.)

Jaynes suggests that bicameralism may have enabled the beginnings of civilization by making human beings subject to such a method of control. However, it did not endure. The intrinsic frailty of this system apart, it grew less and less effective as polities expanded--as with the empire of Hammurabi. An increased use of writing was one way of compensating, but that activity probably weakened bicameralism as well. The Bronze Age collapse (theorized by Jaynes as having been a result of the Theran catastrophe) knocked over these vulnerable structures, flinging together uprooted peoples forced to reckon with the inappropriateness of earlier modes of conduct, and with each others' differences (which may have led them to think that difference was internal for the first time), in harsh circumstances where deceit (thinking one thing and appearing to do another) had a high survival value. (Jaynes speculates also that there may have been an aspect of natural selection in the process, and that the development of narrative epics like Homer's Iliad contributed to the process as well.)

Over the millennium or so that followed, the conception of our behavior's driving forces was internalized, intellectualized (recognized as mental processes) and synthesized into consciousness as we now know it, which had clearly arrived by the time of such figures as Solon and Pythagoras, whose notions of mind, volition and responsibility are recognizable as similar to those we possess today. Still, a nostalgia for the certainties of divine voices speaking to us personally remained, in the laments of our myths of a "fall" which has distanced us from the gods, in methods of divination which attempt to reach those increasingly distant deities (like auguries, oracles, prophets and possession), and even in "scientism" (the quasi-religious treatment of scientific inquiry).1

In examining the argument over three decades after Jaynes' initial presentation of the theory, it seemed to me that, my initial objections aside, many of the smaller claims on which he has built his larger argument (like the nature of consciousness) remain a subject of dispute. His focus on the eastern Mediterranean region--he looks principally at Mesopotamia, Egypt, Greece and the Biblical Israelites--appears rather narrow. And the evidence available for his examination even of this history was necessarily limited, and full of ambiguities and gaps, from the uncertainty regarding the dating of particular works, to the meanings of key terms, to the question of how representative the works in question really were of how people lived and thought at the time (all of these evident in, for example, his reading of ancient Greek literature). Still, Jaynes is quite aware of the scantiness of some of the material, and the speculative nature of many of his claims, frequently acknowledging that what he presents is but a starting point for a fuller investigation.

Of course, the proof that would really convince a skeptic--proof that a society can actually function this way, let alone proof that this really was the way the species lived--may be unobtainable. However, what Jaynes does with the material available to him is genuinely impressive, the theory at the least fitting a great many facts. He makes a contribution to the indisputable view that our recognition and expression of abstract thought and subjectivity have grown enormously over these millennia (a view going back at least to G.W.F. Hegel), and a good case that schizophrenia and hallucination played a larger role in antiquity than we appreciate. Above all, he points the way to a significant part of the human story--as is so often the case with those who dare to offer Big Ideas, even when they do not persuade us completely of their version of the larger picture.

1. As Jaynes notes, in the second millennium B.C., human beings heard the voices directly; in the first millennium B.C., they were only able to access those voices through exceptional individuals, sites and efforts, like prophets and oracles; in the first millennium A.D., even these fell increasingly silent, leaving behind the texts based on them; and in the past thousand years, those texts have seen their authority eroded.

Friday, May 18, 2012

On the Word "Lifestyle": A Postscript

Half a year ago I discussed on this blog the contemporary use – and, much more often, misuse – of the word "lifestyle." To recap the distinctions:

Your standard of living is a matter of how much money you make.

Your quality of life is an evaluation of your general well-being – of which income is only one part, as matters like physical health, leisure and social life enter into it.

A way of life is something you're born into, or adopted into, or adopt yourself. A culture has a way of life, and people share that way of life when they belong to that culture.

A lifestyle is something an individual picks out of a catalog. It is something few people actually have, and it is not to be confused with any of the others.

Saturday, May 5, 2012

The Anti-Humanism of Battlestar Galactica

Humanity, the reimagined Battlestar Galactica tells us, is Fallen, progress an illusion that dies when the creations of that "flawed creation," Man, finally turn on it – and do so successfully because those flawed human beings failed to understand that peace is just a period of preparation for the next round of war. (The first scene of the show, in fact, depicts the Cylons' strike on the Armistice Station that represents the Colonies' peace overture to them – adding insult to injury.)

Humanity also failed to appreciate that not only could the Enemy not be reasoned or bargained with, but that it had penetrated within, its ruthless agents and hapless dupes undermining us to the point that we had little defense against the external assault that finished civilization off in hours.

Trying to examine the history leading up to it, to actually understand the origins of this situation, is pointless, and even traitorous. ("How this could've happened, why it happened – none of that matters right now. All that does matter is that as of this moment we are at war," Commander William Adama says in his call to arms during the pilot.) And that's not the only way in which too much truth is an inconvenience. In the wake of the disaster, the man who claims the mantle of leadership bolsters his legitimacy with a "useful" lie promulgating false hope (regarding Adama's knowledge of the location of Earth), and the forging of a consensus through appeals to the irrational, and the pressure to conform ("So say we all!").

Those who do not conform, like the dissenter and the radical (e.g. Tom Zarek), repeatedly justify the suspicion and contempt with which they are treated, as does the intellectual (e.g. Gaius Baltar), who is not only the very source of our predicament in his misuse of his technical expertise, but a self-seeking traitor at every turn. The rise of such men to a position of leadership is, of course, ruinous, not least because the masses are so easily led astray. Proving once more that they cannot be trusted to take care of themselves, that they are weak and soft and unwilling to face the Hard Facts of Life, the People grab the first chance to fly from the struggle – foolishly electing Baltar President (and foolishly permitted to do so by Adama and sitting President Laura Roslin when they decide not to falsify the election's results to head off this outcome), and leaving their ships for the soil of New Caprica, with the result that Democracy and Excessive Respect for the Law have led to the exchange of the bitter struggle for the even more bitter subjugation by the Enemy. (The second season, in fact, ends with a shot of Cylons marching down the settlement's single muddy street like the Wehrmacht parading through Paris after its capture in 1940 – an image all the more loaded in the American imagination by Anglo-Saxon perceptions of French arms during World War II – and the next begins with Baltar playing quisling to the Cylon occupiers.)

Naturally, it falls to the soldiers (who despite being the show's principal repositories of virtue are no exceptions to the spectacle of human degradation that is the cast of characters, from the pathetic Saul Tigh to the broken Kara Thrace to the tragic Felix Gaeta) to save the civilians from themselves yet again. However, even their heroics accomplish only so much. The Apocalypse runs its course, after which those who had indulged in the luxuries of the wicked cities finally return to the Land. Nonetheless, it is to be only a matter of time before they begin the same stupid cycle of rebuilding and destruction all over again – while the rather smug "angels" manipulate us toward some unknown end.

This was all taken for "gritty" – which is to say, "realistic." Yet one could more precisely, substantively and usefully describe it all as authoritarian, militarist, xenophobic, obscurantist, anti-rational and anti-modern, and just plain misanthropic, which is to say that it conforms to "reality" as seen only from a very specific part of the political spectrum.1 Such a reading of the show is, if anything, encouraged by the presentation of so much of what happened on the show as relevant to our contemporary politics (through its constant, shamelessly sensationalist evocations of the War on Terror).2

That such comments have not been far more common says a great deal about the political moment in which the show appeared, and what it takes to be "realistic."3 It also says a great deal about our moment that this is the show which has supplanted Star Trek as the template for science fiction television, its considerable influence evident in recent efforts like Stargate Universe and the reimagined V (whose reimagining is itself a striking story of political inversion). Ken MacLeod has described this condition as well as anyone I can think of: "humanity as a rational and political animal died in 1979, and went to hell." The new Galactica, representative of the anti-humanist polar opposite to what Star Trek (once) stood for, is a story we told ourselves there.4

1. The original Galactica has also been noted for its right-wing politics. Still, many of the changes took it further in that direction, like the emphasis on humanity's nature as fallen, and the treatment of particular characters, like the transformation of Gaius Baltar from a power-hungry politician into a horny, status-seeking scientist with an accent the show's primarily North American audience would regard as foreign. There is, too, the addition of the civilian-military power struggle, which had Adama in a tug of war with President Laura Roslin, presented here as out of her depth, as she had been "only" a teacher earlier – not just a member of a profession held in low regard among Americans (and a lightning rod for the same anti-intellectualism we see in the treatment of Baltar), but one identified with the "liberal establishment" conservatives frequently demonize.
2. The show's first episode "33" had the heroes shooting down a hijacked passenger craft, while later in that same season Cylons were seen blowing themselves up like suicide bombers, captured Cylons got waterboarded, etc. The result was to equate the September 11 terror attacks with the annihilation of a twelve-planet civilization, al-Qaida with the military might of the Cylons, and the context of a handful of refugees in a flotilla of life boats to that of the United States today – comparisons which entail so much exaggeration as to render any analogy useless (while the tone was usually that of FOX News in Space).
3. Admittedly, the show did not adhere to the view described with perfect consistency, at least after the first two seasons. Indeed, some argued that the politics were quite the opposite of those described here in season three. In particular, much was made of the show's protagonists becoming resistance fighters against the occupying Cylons in that season's first episodes, with many reviewers suggesting an equation of the human "insurgents" with present-day Iraqis, while subsequent episodes like "Dirty Hands" and "The Woman King" entered new territory by raising questions of class, labor and bigotry. (Indeed, Jonah Goldberg adored the first two seasons, and hated the later ones, which thoughts he shared in a post pointedly titled "How Politics Destroyed a Great TV Show" – which is to say, how the merest whiff of liberal politics made intolerable to him a show he'd once enjoyed because of its agreement with his world-view.) However, the extent and depth of the changes should not be overstated. They did not alter the basic premises of the series, and in any case, were quickly drowned in the pseudo-religious muddle already moving to the fore.
4. Ironically, the new show's creator, Ronald D. Moore, worked as a producer and writer on the Star Trek franchise, scripting nearly sixty episodes of Star Trek: The Next Generation, Star Trek: Deep Space Nine and Star Trek: Voyager, as well as the films Star Trek: Generations and Star Trek: First Contact.
