The Guardian's Steven Poole has just reviewed Anthony Horowitz's Forever and a Day--the next Bond novel, which is notable for two features. One is that it is the first time since Raymond Benson that an author has had a chance to pen a second of the Bond continuation books. The other is that it represents yet another wrinkle in that series by offering a prequel to Casino Royale.
Poole remarks that Horowitz's novel serves up "disappointing" bits of exposition (like the origin of Bond's preference for his dry martinis shaken, not stirred, and even "the name is Bond, James Bond"), and that the "prose throughout is more verbose and cliched than the brutal efficiencies of Fleming." However, he also praises the choice of villain, remarks that Horowitz is "good at the action scenes," and declares the book on the whole "still an enjoyably compact thriller, with an absolutely killer last line . . . [with] some pleasingly echt Bond moments."
It seems a rather plausible assessment as far as it goes, given my impressions of Horowitz's prior effort, Trigger Mortis (which you can read about here). Still, Poole, perhaps predictably, skirts the issue of whether there was a point to making the book a prequel (could the adventure have been just as satisfying as another '50s-era entry in Bond's adventures?), and whether there is any point to writing more novels in a series where so much is modified (Poole acknowledges, among other things, the updated gender relations, again predictably, and Horowitz's apologies for the extent to which he carried forward Fleming's attitudes, even as, Poole's observation suggests, he exaggerated that extent). The book hits the market next week in the UK, but arrives in the States in November (according to Amazon, at least).
I expect to get in my two cents then.
Monday, May 21, 2018
John le Carré and the Bestseller List
Recently going through the New York Times and Publishers Weekly bestseller lists with an eye to the performance of spy fiction over the past several decades for a recent paper, I must admit that the hard data mostly confirmed my more casual impressions—that the genre had done very well commercially in the '60s, the '70s and the '80s, and then became much less conspicuous in the '90s. (Predictable given not just the inevitability of changes in commercial fashions and the damper the end of the Cold War put on the genre, but notable all the same.)
Still, it was something of a surprise just how well one of the bigger names sold—namely, John le Carré. As the tables appended to my paper show, he was commercially on a level with such titans of the "airport novel" as Ian Fleming, Frederick Forsyth, Robert Ludlum and Tom Clancy. Going by the PW lists (the information from the year-end editions of which is conveniently gathered together by Wikipedia), from The Spy Who Came in from the Cold (the top-selling novel of 1964 in the United States) to The Russia House, just about all his spy novels were among their year's top ten sellers.
This adds up to a quarter of a century at the very top, an extraordinary run, especially back in those days when the uppermost ranks of the publishing world saw rather more flux than they do now.
That his sales were quite so strong is all the more remarkable given that, reflecting his comparatively greater acclaim among critics, he was anything but a producer of the kind of crowd-pleasers that made those other authors such big names. Not only were his stories slow-paced and lacking in action, decidedly unglamorous and preoccupied with moral ambiguity, but they were so obliquely told that I suspect anyone who picks up anything by him from The Looking Glass War on and does not feel bewildered by the goings-on can count themselves a highly accomplished reader.
Indeed, I find myself wondering—is this a case of audiences having become less tolerant of such writing, or is it the case that people were buying his novels and just pretending to understand them, or even just pretending to read them, because it seemed fashionable to do so? All those copies of the volumes detailing the adventures of Smiley and company, merely purchased to make the buyer look sophisticated by sitting on their coffee table or their bookshelf?
Any thoughts?
Review: Seeing is Believing: How Hollywood Taught Us to Stop Worrying and Love the Fifties, by Peter Biskind
Today Peter Biskind seems better known as a journalist than an academic, on the strength of his history of the "New Hollywood" of the '70s, Easy Riders and Raging Bulls, and his more recent history of the post-'70s independent film movement, Down and Dirty Pictures. However, Biskind's study of Hollywood film in the '50s, Seeing is Believing, is not a chronicle of Tinseltown in that period, but a more conventional piece of film criticism, close-reading a selection of the period's cinematic classics (from The Blackboard Jungle to 12 Angry Men, from My Darling Clementine to High Noon, from The Day the Earth Stood Still to On the Waterfront), and a sprinkling of less celebrated films (like Hell and High Water, or Underworld U.S.A.).
Biskind's examination of film in this decade (the '50s here defined as extending from the end of World War II to the flowering of the 1960s as a recognizably different era) is deeply informed by his reading of the ideological balance of that period. Key to understanding his book, it is also one of the most incisive discussions of that topic I have ever run across, which makes it seem doubly worth discussing here at some length.
Central to Biskind's analysis is what he reads as the mainstream, centrist position, which he terms "corporate-liberalism." This was formed by the recognition of the increasingly large scale and complex character of modern life (the liberal recognizing the importance of the national and international above the local, the age of Big Organizations and of atom age, jet age, space age technology); by the dark view of human nature suggested by psychologists like Sigmund Freud, and the theory of totalitarianism propounded by figures like Hannah Arendt; and by the prestige that New Deal reformism and psychoanalysis/psychotherapy enjoyed then as never before or since. Its response to this mix of influences, experiences and challenges was a view of the country as consisting of an array of diverse interests, all equally valid, and which it thought just had to get along with one another on terms of (to quote Daniel Bell) "pragmatic give-and-take rather than . . . wars to the death." It also held that the problem of doing so had been substantially solved already by the advent of the Affluent Society.
However, making the arrangement work required not amateurs (identifiable with the ignorant, panicky, demagogue-following mob) but technocratic professionals able to understand the complexities of the situation, and devise pragmatic solutions, implementing which required consensus among the group. Consequently, from this "pluralist" view many of the changes so often discussed at the time—the replacement of the rugged individualist by the "organization man," the "inner-directed" by the "outer-directed" personality, the self-interested, combative "Protestant Ethic" William Whyte described by the "Social Ethic," the classic tough guy by the Sensitive Man—were positive changes in keeping with the new reality.
In line with its dark view of human beings and suspicion of the prospects of radical change, the valorization of the status quo as having already largely solved the "social question," and the stress on everyone getting along (which made it at best uneasy with even discussing social class, class conflict, social power, social alternatives; with even fights over principle inclining the participants to "war to the death" rather than "pragmatic give and take"), the corporate-liberal viewed "ideology" as such with suspicion.1 Of course, one might ask how they could take this position when corporate-liberalism was itself very obviously an ideology. The surprisingly simple answer is that they viewed their own position not as an ideology at all but as simply "facing the facts" in a value-free, pragmatic, "realistic" fashion, in contrast with the value-minded, principled "extremist." And corporate-liberalism's approach to building and maintaining that highly valued consensus reflected this "anti-ideological," technocratic outlook.
In line with its affinity for psychology (and its rejection of more radical views) it tended toward the treatment of dysfunction, and even simply disaffection with the status quo, as something akin to mental illness, requiring "therapy" (or, in Biskind's blunter language, manipulation). This entailed bringing others around to its understanding of the world by subtly inculcating its outlook in them, such that it would seem to them their own outlook (an "internalization of social control"), a practice further facilitated through a mobilization of peer pressure. They expected that the delinquent and even the "extremist" could be brought around in this manner, much more often than not. Where the real incorrigible had to be confronted, however, the corporate-liberal recognized that there might be no alternative to that use of force with which they were uncomfortable, tended to view as damaging to the user of force as well as its victim, and regarded as necessarily tragic, but went along with anyway on the presumption of TINA (There Is No Alternative)—reflected in its propensity for militant anti-Communism at home and abroad.
The other lines of thinking can be defined in relation to this (rather conservative) liberalism, with which they tended to compromise to one degree or another.2 Those to the right of center were militantly anti-Communist not because it was ideological or "extremist," but because it was contrary to their own ideals. They favored the rugged (and "Average") individual over the expert, over the Organization and its imperative of consensus, and trusted to laissez-faire-style private interest as not only the essence of freedom, but the surest way of maximizing the benefit of all. This did not mean their altogether dispensing with notions of community, but they preferred the small scale to the large, the small town to the big city, the local to the national (let alone the international). The dysfunctional and the disaffected were not seen as sick and in need of therapy, but as simply bad, and conservatives had far fewer qualms about using force against them. Indeed, the liberal's hesitation to use force was in itself suspect to them, their readiness to try to persuade or negotiate rather than fight faintly traitorous.
Nonetheless, there were profound differences not just with the liberal center, but within the right as well—the more moderate and "sophisticated" conservatives ready to align themselves with the corporate-liberals in a pinch, while those to the right of them took the harder line. (Think of the college-educated Fortune 500 executive graciously heeding the advice of an academic consultant, cordially negotiating with a union boss and accepting the New Deal as a necessary evil, as compared with a George Babbitt-style small-town small businessman for whom such a thing would be anathema.) Biskind terms the former "conservative," the latter "right-wing."3
Biskind has less occasion to discuss the left, simply because one could not make an overtly left-wing film in the Hollywood of not just the Hays Code, but McCarthy, the House Un-American Activities Committee, and the blacklist. Still, his clearing up the confusion about just what, exactly, "liberal" meant does much to clarify this as well. The embrace of ideology rather than its denial, the preference for Marx over Freud, the position that there are such things as class and class conflict and power elites, the search for social alternatives at home and reserve or opposition toward anti-Communism abroad, the suspicion of Organization Man and Organization and the therapeutic society on these grounds rather than those of the right, and all with which these things are associated (the ideas of, for example, a Mills), clearly situate a filmmaker or a work on the left—and peering through the cover lent by science fiction trappings and historical periods, films like The Day the Earth Stood Still and High Noon give him occasion to analyze leftist works.
Impressively, Biskind manages to convey this sophisticated analysis in a highly accessible manner, succinct and happily jargon-free. Of course, Biskind's work is, ultimately, a piece of film criticism rather than social history or social science in the narrower sense of those terms, and so his doing even an excellent job with this subject matter would mean only so much within a book that is not only about film, but overwhelmingly devoted to close reading of its collection of films—not only in its conception, but in its use of the available space. At least where his selection of films is concerned—by no means small or unrepresentative—his discussion of '50s politics proves an excellent lens for understanding these movies. As he shows, while the hard-driving individualist, the tough guy, remained the darling of the radical right—Gary Cooper in The Fountainhead or The Court-Martial of Billy Mitchell—elsewhere he was giving way to the more social, sensitive man better adapted to the world of organizations than the old tough guys (neurotics or monsters or just out of date), like Montgomery Clift in Red River, heir to a John Wayne who built up his empire under old rules that no longer applied. Likewise Wayne's hard-driving commander in the conservative Flying Leathernecks had a corporate-liberal counterpart in Gregory Peck in Twelve O'Clock High, while that actor later went on to become an icon of the change as the executive who chooses family over career in The Man in the Gray Flannel Suit.
On occasion Biskind may seem to overreach. For example, I am not entirely sure that 12 Angry Men is more interested in consensus than it is in justice. Still, even here I have found his argument hard to dismiss, so that a long time after first reading it I still find myself thinking about it—not only as an exploration of these films, but of the era that produced them. As Biskind makes very clear, that era did not last, could not have lasted, given the contradictions of its characteristic outlook, which by the 1960s could no longer be ignored or papered over the way they had been before. Both the right and the left were to make themselves more strongly felt in the years that followed—and the center was no longer able to hold under that pressure, with film quickly reflecting this, not least in Anthony Perkins' career. One of the archetypal Sensitive Man leads of the '50s, Perkins was shortly presented as really a Psycho behind the Nice Guy facade, in a role that quickly blotted out his earlier screen image—while Hollywood followed up its celebration of Billy Mitchell with the satire Dr. Strangelove. All the same, there has been continuity as well as rupture, enough that Biskind's analysis seems helpful not just in understanding that formative, earlier period, but the present as well, more so than the oceans of drivel we are apt to get about pop culture, politics and the link between the two today.
1. Where C. Wright Mills wrote of "the power elite," the pluralists denied the existence of any such thing, democracy and the market diffusing power so much that the concept had no salience.
2. In its dark view of human nature, qualified confidence in reason, and aversion to social change, it seems philosophically more conservative than liberal, and rather conservative in many of its objectives as well. Indeed, it can be taken for a conservatism that, in line with the conservative intellectual tradition, was not about freezing change for all time, but making concessions to unavoidable change as life went on its way (one reason why it managed to get along with the modern big business conservatism described here, in contrast with the traditionalist kind).
3. Others have used different terminologies. In Mills' New Men of Power, the first category was dubbed "sophisticated conservative," the second "practical conservative."
Review: The Sociological Imagination: 40th Anniversary Edition, by C. Wright Mills
New York: Oxford University Press, 2000. 256 pp.
Given how long ago I became acquainted with Mills' classics, The Power Elite and White Collar, I was slow to come to his book The Sociological Imagination. Less often mentioned, it simply seemed inessential. After having actually read it, though, I think I was wrong about that. Of course, his other books are quite self-contained, perfectly intelligible without reference to his thoughts on how sociologists can, do or should work. Still, the book is sufficiently rich to be worth a read for its own sake (while, I think, my appreciation of it was strengthened by my having seen so much of his actual sociological work).
"The sociological imagination" to which the book's title refers entails the ability to relate the personal life to the collective life; the individual biography to human history. And it is underlain by the recognition that not only is there such a thing as society, but that without considering it we cannot understand the individual lives of the people in it; that we cannot usefully think about, let alone resolve, our problems without the benefit of such insight, recognizing in personal "troubles" the manifestation of bigger societal "issues." (That one does not have a job is a "trouble"; but it has to be understood in terms of the unemployment that is an "issue.") Indeed, this bestowed on social studies (he preferred the term to social science because of the latter's physics envy-evoking baggage) a great responsibility in relation to society—a key role as judges of the power-holding elite, and educators of any would-be democratic public, so that even if social scientists could not save the world, their trying to do so was an honorable and necessary endeavor, the more so given the confusion and dangers of the times as he saw them.
However, while Mills does explain this idea at length, most of the book is actually devoted to a critique of what sociologists were actually doing in his time—or as was often the case, not doing—and why, with several chapters devoted to what he saw as the problems in the way of their fulfilling their role. One of them, certainly, was the obsession with specialization—those jealously guarded disciplinary boundaries being obstacles that any researcher honestly trying to answer a meaningful question had to overstep, and, at least as problematically, the seed of a stifling bureaucratization and politicization of research work, as ridiculous as it was unfortunate.
Mills was also a sharp critic of the prominence of methodological controversy. In particular he criticized the extreme abstraction of what he called "Grand Theory," epitomized by the unreadable "concept"-mongering and term-coining of Talcott Parsons; and the theoretically blinkered "abstracted empiricism," consisting of micro-studies without much sense of the relation or relevance of their questions and findings to the larger social scene, which had come to be so characteristic of corporate-government administrative research. Their other failings aside, the one was preoccupied with the question of methodology for its own sake, disconnected from any concrete work; while the other let a received methodology determine the problems to be studied (typically, not the really worthwhile ones), when in his view it ought to have been the other way around, methodological progress deriving from our grappling with the problems that leave us uncertain about how to deal with them. (Indeed, as Mills noted, Parsons abandoned his elaborate concepts when he actually endeavored to study a genuine sociological issue.)
And as might be guessed from a reading of Mills' other work, he was a critic not just of such academy-specific tendencies as overspecialization and the obsession with methodology, but the larger condition of society, and especially the prevailing, mainstream view—what he termed the "practical liberalism" or "pluralism," which would seem to have had connections with those academic trends he decried. Taking the facts of society for granted (especially as seen from a small-town, rural, Midwestern perspective), afraid to draw bold conclusions about the causes of social phenomena and hiding this behind an attribution of everything to innumerable minute causes, easily overwhelmed by the big questions, it inclined to a "psychologism" fixed on "the make-up of individuals"—and "rests upon an explicit metaphysical denial of the reality of social structure." The social world was so much a given as not to be considered at all; the troubled individual simply "maladjusted" and in need of being brought into "harmonious balance" with it.
If there was a positive recommendation to be derived from this line of thought, it was (as conveniently summed up in the appended essay "On Intellectual Craftsmanship") a call for "unpretentious intellectual craftsmen," with "every man . . . his own methodologist . . . his own theorist," working on significant problems without inhibition by specialization, conscious of human history and people as "historical and social actors" (as human beings in a particular time and social environment) and communicating it all clearly and simply, over "rigidity in procedure and fetishism of method and technique," the bureaucratization of research and unintelligible writing. Moreover, Mills had some optimism that the course was onward and upward. It seemed to him that the sociological imagination was becoming common currency, and that the social sciences were drawing together—with economics, for instance, becoming "political economy" again.
I cannot claim sufficient background in the state of the intellectual art in '50s sociology to confidently judge how prevalent "grand theory" and "abstracted empiricism" really were in his day. I must admit that all my knowledge of Talcott Parsons is secondhand (though that is also because those contacts with his ideas never left me with the impression that it was necessary to read his work the way it seemed necessary to read Mills, for example). Still, Mills' appraisal of the weaknesses of those approaches is compelling, and his argument for what sociology—and social science—ought to look like strikes me as axiomatic.
Still, it is also clear that he was overoptimistic about the trend in his field. As Todd Gitlin acknowledges in his afterword to this edition of the book, sociology "has slipped deeper into the troughs Mills described." To the extent that there is very much in the way of sociological imagination evident today, it is of the cheapened pop sociology kind—"sociological imagination lite, a fast-food version of nutriment, a sprinkling of holy water on the commercial trend of the moment, and a trivialization of insight" as Gitlin puts it, like the stultifyingly silly zeitgeist film criticism passing for profundity on so many review pages. Unintelligible concept-mongering, micro-studies—and of course, physics-envy—seem ever more widespread in an age of ultra-specialization. Economics has been a particularly troubling scene, with reputable and prominent economic thinkers time and time again over the years (you can see Lester Thurow do it here in 1983, Robert Heilbroner do it here in 1995) calling out that profession to no effect whatsoever, with even the crisis of 2008 having no lasting consequences—to the shock of anyone who thinks of mainstream economics as an actual would-be science rather than theology and apologetic.
I would go still further than Gitlin where sociology is concerned. While we have an abundance of cheap pop sociology, it can seem that the sociological imagination is as scarce as ever it was, and this unsurprising given the turn of economic life, politics, culture—our era of "no such thing as society" neoliberalism, "personal responsibility," "self-help," anti-intellectualism that may make what came before look like nothing, and consequent disdain for what social science has to teach. None of that has been for the better in a time that seems as much in need of the "sociological imagination" as it has ever been.
Review: Principles of Economics: An Introductory Volume, 8th edition, by Alfred Marshall
I recently noticed on Amazon that the negative reviews of Alfred Marshall's classic Principles of Economics tended to focus on the writing, charging Marshall with being an inefficient, overly wordy writer, and the book a tough read in comparison with other classic treatises like Adam Smith's The Wealth of Nations, or even John Stuart Mill's Principles of Political Economy (themselves not exactly light reads, with Mill's book, like Marshall's, a textbook, with all that implies for someone trying to read it straight through). I must admit that I found the book something of a slog myself--enough so that I gave some thought as to why this was the case.
As it happens, the essential concepts Marshall covers--cost and utility, marginal and otherwise; elasticity and substitution; equilibrium; diminishing and increasing returns--and their application to the production, consumption and distribution of wealth and income by way of supply and demand--are in themselves simple, intuitive and apt to be familiar to anyone who would pick up Marshall today. And Marshall can write with great lucidity and conciseness, not only when recounting history (as he does with great polish in the appendices), but when dealing with more intricately theoretical questions (as in his admirable summing up of scientific method in the early part of Book I, Chapter III). Additionally the gloss and summary chapters offer ample assistance for anyone who loses the thread of the discussion along the way.
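To give a sense of just how compact these ideas are, two of them can be stated in a line apiece--in standard modern textbook notation, it should be said, which is my shorthand here rather than Marshall's own symbols:

```latex
% Illustrative shorthand (modern textbook notation, not Marshall's own).
% Price elasticity of demand: the proportional response of quantity to price.
\[
  \varepsilon = \frac{dQ/Q}{dP/P} = \frac{dQ}{dP} \cdot \frac{P}{Q}
\]
% Diminishing marginal utility: total utility U(x) rises with consumption x,
% but each successive unit adds less than the one before.
\[
  \frac{dU}{dx} > 0, \qquad \frac{d^2U}{dx^2} < 0
\]
```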
However, the material being covered--an economic theory based on utility--is still rather more abstract and less "common sense" or empirical than Smith or Mill's earlier work. Its presentation--likewise heavily reliant on long verbal descriptions of microeconomic models accompanied by a great abundance of algebraic equations and graphical demonstrations of the principles and examples discussed--is also more difficult than Stanley Jevons' earlier, founding exposition of marginal utility (he called it "final utility") in The Theory of Political Economy. The reason is that where Jevons merely presented the principal line of thought, and with relative brevity, Marshall devotes many hundreds of pages to exhaustively working out the implications, in the process creating (in Marshall's own words) not a "few long chains of reasoning," but a mass of "many short chains and single connecting links" that sacrifice clarity for completeness as they bury the reader in hypothetical yet gory details. (In fact, in Book V, Marshall himself tells the majority of his readers to skip over the section's second half--altogether some 77 pages, equal to an eighth of the main text!)
Perhaps especially given the theory's "elegance," this exhaustive working out of the implications may be less essential, less rewarding than with less compact, less tidy theories. And of course, the superabundance of mathematical content accompanying the exhaustive verbal explanation conveys the same information as the verbal descriptions, just in a different manner. Bluntly put--if one gets the verbal descriptions, one finds that working through the equations, curves and schedules doesn't really add anything to what one already knew. This is of course the norm for economics, but still, the meticulous reader instinctively works through them anyway, and here is bound to do so frequently and at length.
Accordingly, after plodding through these hundreds of very densely written pages, a reader already well-acquainted with these ideas (so much more widely diffused now than in Marshall's time) is apt to feel that they have not got much return on their effort. Knowing this does not make what may feel like an obligatory task for a student of economic theory or history any easier, but it is better to acknowledge why a book is "difficult" than just complain about its "being hard"--and if one can make allowance for all that, there is a good deal to admire here. Dated and problematic as the "Marshallian" view of economics is in many ways for readers from across the ideological spectrum (he was very much a Victorian strongly inclined to orthodoxy), Marshall, like most thinkers whose names get thrown around this way, was a rather more rigorous, nuanced thinker than his adherents, conscious of at least some of the limits of his analysis. His legacy doubtless includes some harmful misperceptions, the term he chose for the very title of the book--Principles of Economics (in place of the earlier term "Political Economy")--reflecting the tendency to treat economics as if it were physics, making the economic process seem "depoliticized" and above social constructs, rather than an arrangement grounded "in the contingent historical and political requirements of the prevailing social order."
Still, Marshall recognized the unavoidable imprecision of a "science of human nature" relative to classical mechanics (he compared it with the tides, not gravitation); the reality that economics derives its interest from the practical questions it addresses, and must be based on the hard facts of real life (no economic "Scholasticism" or anti-empiricism for him); the fact that people live in society, which is more than the sum of individuals, and that collective action is a thing that has to be analyzed as more than individual action writ large; that the role of time must be considered, and not simply the maximization of values but the cultivation of productive power must be taken into account in fostering national well-being; that in actual life labor is not simply another commodity, however much it may suit some to treat it that way, and that even those most narrowly committed to growth along orthodox lines cannot avoid facing the Social Question--he acknowledges all these early on, and endeavors to adhere to them subsequently. Of course, the conclusions to which these realizations lead will not satisfy those whose economic views tend in other directions, but it does show up any claim that a doctrinaire model-monger adequately lives up to Marshall's precedent, or the best in his work.
Friday, November 24, 2017
David Walsh on the Oscar Contenders
Earlier this month I mentioned David Walsh's review of Django Unchained, and his remarks on Zero Dark Thirty. Interestingly, just two days before the Academy Awards ceremony, he has produced a follow-up to these reviews in which he extends his analysis of the weaknesses of those films, and of why they have attracted such enthusiasm among the "pseudo-left" (in comparison with, for instance, Steven Spielberg's Lincoln), noting that this
well-heeled social layer, conditioned by decades of academic anti-Marxism, identity politics and self-absorption, rejects the notion of progress, the appeal of reason, the ability to learn anything from history, the impact of ideas on the population, mass mobilizations and centralized force. It responds strongly to irrationality, mythologizing, the "carnivalesque," petty bourgeois individualism, racialism, gender politics, vulgarity and social backwardness.

This is as succinct and forceful a description of the politics of postmodernism, and the way in which it manifests itself in the taste for a certain kind of "dark and gritty" storytelling, as I have seen in some time.
You can read Walsh's full remarks here.
Also of interest to those who follow this line of cultural criticism: David Walsh's review of the recent film Not Fade Away, in which he debunks the romanticizing of the 1960s as a "golden age in which everything seemed possible and revolution was in the air."
Monday, September 18, 2017
Reassessing Kurzweil's 2009 Predictions
Raymond Kurzweil's 1999 book The Age of Spiritual Machines remains a touchstone for futurologists and their critics, not only because of Kurzweil's continuing presence within the information technology and futurology worlds, but because the book offered multiple, lengthy, even comprehensive lists of forecasts. Many of his forecasts concerned subjects far outside his realm of expertise (macroeconomics, geopolitics, military science, etc.), and unsurprisingly were commonplaces, but a good many were precise technological forecasts lending themselves to testing against the day's technological state of the art--and thereby to checking the progress of the technologies with which the book was concerned, and perhaps also our progress in the direction he claimed we were going, toward a technological Singularity.
Naturally a good many observers (myself included) reviewed Kurzweil's 1999 predictions for 2009 when that year arrived. Different writers, depending on which forecasts they chose to focus on, and how they judged them, came to different conclusions about his accuracy. I emphasized those precise, easily tested technological forecasts, and saw that many had not come to pass. I noted, too, that while not accounting for each and every error, there was a recognizable pattern in a good many of these, namely that Kurzweil assumed advances in neural networks enabling the kind of "bottom-up" machine learning that made for better and improving pattern recognition, as with the speech recognition supposed to make for the Language User Interfaces we never really got.
Looking back on those predictions from almost a decade on, however, it seems worth remarking that the improvement in neural nets was indeed disappointing in the first decade of the twenty-first century--but got a good deal faster in the second decade. And right along with it there has been improvement in most of the areas he seemed to have been overoptimistic about, like translation technology. Indeed, as one of those who looked at his 1999 predictions for 2009 and was underwhelmed, I find myself increasingly suspecting that Kurzweil was accurate enough in guessing what would happen, and how, but, due to overoptimism about the rate of improvement in the underlying technology, was off the mark in regard to when by a rough decade.
Moreover, this does not seem to have been the only area where this was the case. The same may go for the use of carbon nanotubes, another area where, after earlier disappointments, technologists this decade achieved successes that may plausibly open the way to the new designs of 3-D chips he described, permitting vast improvements in computer performance. (As it happens, Kurzweil guessed that by 2019 3-D chips based on a nanomaterial substrate would be the norm. If IBM is right, he will have been off by rather less than a decade in this case.) It may also go for virtual reality--which likewise seems to be running a rough decade behind his guesses.
The advance in all these areas has, along with a burst of progress in other areas in which Kurzweil took much less interest (like self-driving cars of a different type than he conceived, and electricity production from renewable sources), contributed to a greater bullishness about these technologies--a techno-optimism such as I do not think we have had since the heady years of the late 1990s when access to personal computing, the Internet and mobile telephony began the proliferation and convergence that has people watching movies off Netflix on their "smart" phones while sitting on the bus.1
Of course, it should still be remembered that much of what has been talked about here is not yet an accomplished fact. The advances in AI have yet to make for proven improvements in consumer goods, while those 3-D nanotube-based chips have yet to enter production. At best, the first true self-driving car will not hit the market until 2020, while renewable energy (hydroelectric excepted) is just beginning to play a major role in the world's energy mix. Moreover, that the recent past has seen things pick up does not mean that the new pace of progress will remain the new norm. We might, in line with Kurzweil's expectation of accelerating returns, see it get faster still--or slow down once more.
1. Kurzweil's self-driving cars were based on "intelligent roads," whereas the cars getting the attention today are autonomous.
Review: The Wages of Destruction: The Making and Breaking of the Nazi Economy, by Adam Tooze
New York: Penguin, 2006. 799 pp.
I remember that, reading Paul Kennedy's The Rise and Fall of the Great Powers back in college, I pored over the figures for Germany's World War II-era income and production and felt that something didn't quite fit. German arms output seemed a good deal less impressive than I expected given the country's image as an ultra-efficient manufacturing colossus, and the fact of its control over nearly the whole of the European continent.
The common explanation for this was political--that an irrational Nazi regime, for various reasons (its slowness to move to a total war footing out of fear of a repeat of the Revolution of 1918, a self-defeating brutality in its exploitation of occupied territories, Hitler's self-interestedly playing various interests off against each other at the expense of the war effort, etc.) failed to properly utilize its potentially overwhelming industrial capacity (at least, prior to the arrival of Albert Speer, who came along too late to accomplish very much).
By and large, I accepted this explanation.
More recently Adam Tooze's Wolfson prize winning study The Wages of Destruction: The Making and Breaking of the Nazi Economy offered a different view--that the question was not a matter of political failure of this type, but the material limitations of the German economic base. Germany was simply not big enough or rich enough to hold its own against an alliance of the United States, the British Empire and the Soviet Union--the coalescence of which Tooze contends was virtually inevitable, so that Hitler's attempt to create a continental empire in the east in the face of their opposition was simply not explicable in any rational terms. (Indeed, he holds it to be comprehensible only in terms of Hitler's bizarre, conspiratorial, apocalyptic racial theories.)
To the end of making that case Tooze devotes his book to a study of the German economy through the crucial period (1933-1945), less on the basis of new data than of old and well-known data (more or less like what I saw reading Kennedy) that has simply been marginalized in a great deal of analysis of the war. As he notes, Germany was the world's second-largest industrial power in the early twentieth century, well ahead of its European neighbors in manufacturing productivity and output, and "world-class" in several key heavy industrial and high-tech lines like steel, chemicals, machine tools and electrical equipment.
However, German manufacturing appears less impressive when one looks at the sector overall. While Germany still excelled its European neighbors in productivity when taken this way, its lead was slighter than might be imagined, so much so that the difference between Germany and, for example, Britain was negligible in comparison with the difference between Germany and the much more efficient (because much more "Fordist") United States (America getting twice as much per-worker output in many lines). There were even key heavy/high-tech lines in which Germany remained a relatively small producer--like motor vehicles, the volume of output of Germany's automotive industry falling far behind that of the U.S.
Moreover, manufacturing was just one sector of the economy--with the other sectors even worse off. The country's comparatively small, densely populated and less than resource-rich territory was problematic from the standpoint of its extractive sectors, which were also less productive than they might have been. The country's agricultural sector, notably, lagged not just the U.S. but even Britain in productivity, suffering from having too many of its workers on too little land divided into too many farms. All of this contributed not only to the country's per capita and national income being less than its manufacturing successes implied, but to the harsh reality that those manufacturing successes depended heavily on imports of food, energy and raw materials, to the point that the country's trade was scarcely balanced by its manufacturing exports.
The result was that not only was Germany poorer than is usually appreciated, but its resource-financial situation, differing in this manner from that of the U.S. (resource-rich) or Britain (resource-poor but financially powerful, with the trading privileges empire brings), left it with less room than they had to export less and import more, as a major military build-up demanded. Moreover, the attempts to alleviate this constraint by developing domestic sources of imported materials like iron, and inventing synthetic substitutes for the oil and rubber Germany could not produce locally, on top of entailing additional inefficiencies, fell far short of eliminating the problem, leaving it very vulnerable to the cut-off of essential imports, and meaning that something would have to give, fairly quickly.
This reading of the history is borne out by what actually did happen, the German government actually making ruthlessly efficient use of its domestic resources almost immediately on taking power--the facts about this refuting such clichés of the historiography as the unwillingness of the German government to sacrifice its people's living standards or put women to work for the sake of its military effort. (Indeed, already in the '30s the German government's system of central control of foreign exchange and key materials was gutting the production of consumer items like clothing in favor of armaments, while Germany had more of its women in the work force than Britain did.) However, for all that, Germany's merely pre-war preparations--its drive for maximum short-term output in the key military-industrial lines--translated to severe, unsustainable economic strain before the fighting even began, in such ways as the neglect of key infrastructure like the railroads, the price distortions caused by intricate central control, and the emergence of crises of cash flow and foreign exchange in spite of all the managerial feats and painful sacrifices. The inescapable recognition that the German economy by 1938 was a house of cards forced the government to curtail its rearmament programs at a point at which it was still nowhere near to being a match for Britain at sea, and disadvantaged in the face of combined Anglo-French air and ground forces in key respects (like numbers of up-to-date tanks), with the balance already swinging away from it to its opponents.1 And then even the cutbacks still left it in an economically precarious position.
Of course, the fighting did begin in 1939, and Germany did make successful conquests during the first year that did enlarge its resource base. However, in hindsight the conquests added less than might be imagined because of the limited size, resource bases and development of the countries it overran, not just the nations of Eastern Europe, but in Western Europe (Norway and Denmark, the Low Countries, and even France) as well.2 The result was that even the conquests fell far short of putting Germany on an even footing with the combined Big Three, the disparity still so wide as to moot many of the claims about how this or that more careful use of Axis resources by the Nazi leadership might have turned the tide of the war.3 Tooze reinforces this contention by showing that, for the most part, the Germans generally did what could be done with what was at their disposal (puncturing the "myth" of Albert Speer).4
A few questionable analogies apart (as when Tooze compares Germany's economic development in the 1930s to that of mid-income countries like South Africa today) Tooze offers a comprehensive and convincing image of the German economy, and especially its critical weaknesses. Moreover, while he pays less detailed attention to the other side of the balance sheet (it seemed to me that he was insufficiently attentive not just to France's demographic and industrial vulnerabilities but to Britain's financial and industrial weaknesses and problems of military overstretch in the late '30s), he is quite correct to contend that Germany, and even a Germany in control of much of continental Europe, was at a grave disadvantage in a prolonged war with a U.S.-U.K.-Soviet alliance.
Nonetheless, this emphasis on the inability of Germany to hold its own against the combined Allied Big Three obscures a weakness much more important than the deficiencies of the economic analysis--namely his depiction of the anti-Nazi alliance's coming together the way it did as a virtual certainty, his case for which is superficial and conventional compared with his economic analysis.5 He slights the leeriness of the U.K. and France about allying themselves with the Soviet Union that actually did prevent their coming together in 1939, with disastrous consequences. (Even after Poland, French conservatives were much more interested in fighting the Soviets than fighting Hitler. Indeed, the response of many of them to French defeat appears to have been a George Costanza-like "restrained jubilation" at the prospect of dispensing with democracy and setting up a fascist state in William Shirer's account of the Third Republic's collapse.)
Continuing along these lines, Tooze then goes on to slight the reality that, after the shock of the Battle of France (which brought Italy into the war on Germany's side), the hard-pressed, cash-strapped U.K. might have sought a negotiated peace, or that isolationism might have won the day in the United States, depriving Britain of American support. (In its absence the country would have had to settle with Germany--and the U.S. would have had few plausible options against Germany afterward, in spite of Germany's failures in the Battle of Britain and its naval limitations.) These developments would not have changed Germany's asset base--but they would have taken Britain out of the war, and likely kept Germany from having to fight the United States as well, leaving it to focus on just the Soviet Union, which would have changed the balance between them dramatically. Equally, Tooze does not acknowledge, let alone refute, the possibility that, in spite of the balance of power being as heavily against it as it was, different German military decisions (most obviously "the Mediterranean strategy") could have produced a different outcome.
Additionally, these oft-studied "What ifs" aside, it is worth remembering the way the war did go. After Britain decided to stick it out against Germany, American support was exceedingly slow in coming, Britain effectively bankrupt by March 1941, before the aid spigot was turned on fully. Moreover, even with Britain still against it, even after a delay of one crucial month due to the German invasion of the Balkans, mid-course changes to the plan of attack that cost additional time, and famously inimical weather, German troops managed to come within sight of Moscow before the invasion's first really significant repulse. And even afterward a depleted Soviet Union remained vulnerable (while any Anglo-American "second front" remained a long way off), permitting Germany to continue taking the offensive through the following year, up to and even after the turning point of Stalingrad--following which it was another two-and-a-half years of fighting before V-E Day.
One should also remember the cost of this protracted effort. The Soviet Union lost some twenty-five million lives. Britain lost what remained of the foundations of its status as a world empire, a great power, and even an independent force in world affairs (in such a weak position that it had to take a $4 billion U.S. loan immediately after the war). The U.S. did not suffer anything comparable, but it is worth remembering that its extraordinary industrial effort went far beyond what even the optimists thought possible at the time--and was a thing the Allies could not have taken for granted in the earlier part of the war, even after the American entry.
Consequently, while in hindsight it appears clear that Germany's chances of attaining anything resembling its maximum war aims were never more than low (indeed, they would probably be deemed delusional but for the astonishing Allied timidity and incompetence of the key September 1939-May 1940 period), and that the attainment of those aims became increasingly implausible again after the failures of German forces against the Soviet Union in late 1941, and virtually inconceivable after 1942 came to a close, a victory so long in coming, so horrifically costly and so deeply exhausting cannot be regarded with the complacency Tooze shows. Indeed, in his failure to critically examine this side of the issue, it appears that while Tooze takes on, and debunks, one myth of World War II (of German economic might), he promotes others that have themselves already been debunked--most importantly, that of the political establishments of the future members of the wartime United Nations having been steadfast opponents of Nazism from the very beginning.6
1. The initial plans had called for an air force of 21,000 planes when Germany went to war. The Luftwaffe never had more than 5,000, however.
2. Paul Bairoch's oft-cited 1982 data set on world manufacturing output through history gives Germany 13 percent of world industrial capacity. All the rest of non-Soviet, continental Europe together was another 13 percent, widely dispersed among its various nations, not all of which Germany controlled (Sweden and Switzerland remained independent), while the productivity of Germany and occupied Europe was hampered by the cut-off from imported inputs by the Allies' control of the oceans. By contrast, Bairoch's data gives the U.S. alone 31 percent of world manufacturing output in 1938 (more than all non-Soviet Europe combined).
3. To return to Bairoch's figures, the U.S. in coalition with the U.K. (another 11 percent) and the Soviet Union (another 9 percent) possessed 51 percent of world manufacturing capacity--a 4-to-1 advantage over Germany, and a 2-to-1 advantage over all of non-Soviet, continental Europe combined (independent states that merely traded with Germany included). Moreover, even these figures understate the margin of Allied advantage, given how the U.S. economy boomed during the war years (growing by about three-quarters), and that Britain brought its broader Empire, with its additional resources, into the alliance. Considering access to the food, energy and raw materials needed to keep all that capacity humming would likely widen the disparity rather than diminish it. (The arithmetic behind these ratios is spelled out in a short sketch after these notes.)
In fairness, this was somewhat offset by two factors: the German conquest of much of the western Soviet Union, and the absorption of a portion of Allied resources by the simultaneous war in the Asia-Pacific region. However, much of the Soviet industrial base in the conquered areas was hurriedly relocated eastward ahead of the German advance, and contributed to the Soviet productive effort (with more still denied the Germans). At the same time the diversion of Allied resources by the Pacific War was limited by Japan's more modest military-industrial capability (it had just 5 percent of world manufacturing in 1938), China's weight on the Allied side (much of the Japanese army and air force being tied up fighting the Chinese), and the Allies' prioritization of the war in Europe (the "Europe First" or "Germany First" policy).
4. Tooze contends that the rise of German output in the last part of the war was largely the result of decisions taken earlier in it, with Speer simply happening to be on hand when those decisions paid off in the form of production gains. Tooze also offers a critical view of those initiatives Speer did undertake, like the effort to mass-produce U-boats late in the war.
5. While the critique of German strategy is solid, Tooze treats Allied strategy much less critically--a particular weakness when it comes to his assessment of the Combined Bomber Offensive.
6. The body of work regarding Britain in this respect can be found distilled in Clive Ponting's impressive 1940: Myth and Reality.
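For those curious about the arithmetic behind the ratios in footnote 3, here is a minimal back-of-the-envelope sketch in Python (the shares are Bairoch's 1938 figures as cited above; the "Allied" total simply sums the American, British and Soviet shares, leaving aside the Empire and wartime growth):

# Back-of-the-envelope check of the ratios cited in footnote 3.
# Shares of 1938 world manufacturing output (percent), per Bairoch.
shares = {
    "US": 31,
    "UK": 11,
    "USSR": 9,
    "Germany": 13,
    "rest of non-Soviet continental Europe": 13,
}
allies = shares["US"] + shares["UK"] + shares["USSR"]  # 51 percent
germany = shares["Germany"]  # 13 percent
continental_europe = germany + shares["rest of non-Soviet continental Europe"]  # 26 percent
print(round(allies / germany, 1))             # ~3.9, the rough "4-to-1" advantage
print(round(allies / continental_europe, 1))  # ~2.0, the "2-to-1" advantage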
Reading Revisionism
It seems to me that I have been running across a great deal of revisionist World War II-era economic history as of late, focused primarily on how the balance stood between two of the war's major participants--Britain and Germany.
The old view was that the British Empire was stodgy and enfeebled, and looked it when it went up against sleek, ultra-modern, ultra-efficient Germany.
Much recent historiography has taken a different view--that Germany was not really so tough, nor Britain's performance so shabby. That it was really Germany that was the economic underdog against the trading and financial colossus--and, for all its dings, the techno-industrial colossus--Britain happened to be.
Of course, there is nothing wrong with revisionism as such. New information, and new ways of looking at old information, should prompt rethinking of old perceptions and assumptions. However, revisionism also has a way of pandering to the fashion of the times--and this seems to be one of those cases.
This line of argument seems to say that Britain didn't do so badly out of free trade. That the country's weaknesses in manufacturing just didn't matter that much--that the emphasis on finance instead was just fine. That, by implication, the conservative establishment did just fine overseeing the economy, managing the country's foreign affairs, and then fighting the war--not needing any help from a bunch of lefties and Laborites, thank you very much, and certainly not disgracing itself the way its critics claimed it so clearly had done post-Dunkirk.
It seems more than coincidence that this is all very congenial to a conservative outlook--in particular the one that today says free trade, financialization, deindustrialization, the balance of payments (and all the rest of the consequences the neoliberal turn has had for the economies of the developed world) simply do not matter, that our future can be trusted to economic orthodoxy, that critical and dissenting views have nothing to offer. And also the one that says Britain can do just fine on its own, without any allies, continental or otherwise.
It is no more convincing in regard to the past than it is in regard to the present.
Still, even if one doubts the more extreme claims (there is simply no refuting Britain's failures as a manufacturing and trading nation), there is a considerable body of hard fact making it clear that Germany had some grave weaknesses--and Britain, some strengths that have to be acknowledged in any proper appraisal of the way the war was fought. So does it go in two books I have just reviewed here--Adam Tooze's study of the German economy in the Nazi period, The Wages of Destruction; and David Edgerton's more self-explanatory title, Britain's War Machine.
Friday, August 18, 2017
Animated Films at the U.S. Box Office, 1980-2016: The Stats
Considering the stats on the action film (such films now typically accounting for six of the ten top-grossing films in a given year), I naturally found myself wondering--what about the other four? Naturally I thought of animated films--and again noticed a trend that exploded a little later. In the '80s there was not a single animated feature in the top ten of any year, apart from the partial exception of Who Framed Roger Rabbit? (1988). (Even the celebrated The Little Mermaid only made the #13 spot.) However, starting with 1991's Beauty and the Beast it became common in the '90s for there to be at least one such film in the top ten (six years had at least one, for an average of 0.8 a year), much more standard in the 2000s (1.7 films a year, with 2008 having four such movies), and more standard still in the 2010s. Every single year from 2010 on has had at least one, every year but 2011 at least two, and 2010 an amazing five of the top ten--clearly at the expense of the action movies, which had an especially poor showing that year.
In the 2010s, that means the action movies and the animated films, together accounting for eight to nine of the top ten movies in any given year, have the uppermost reaches of the box office pretty much monopolized in a way that no two genres have ever managed before, so far as I can tell. The remaining one or two spots? One is often taken up by something similar--like a live-action version of an animated hit of yesteryear (2014's Maleficent, or 2015's Cinderella)--leaving at most one other for any and everything else, from adult-oriented comedies (like 2012's Ted) to the occasional mainstream drama (like 2014's American Sniper).
Feel free to check my data (and my math) for yourself.
The Data Set
1980-1989: 0 films.
1990-1999: 8 films (0.8 a year).
1991-1: Beauty and the Beast.
1992-1: Aladdin.
1994-1: The Lion King.
1995-2: Toy Story, Pocahontas.
1998-1: A Bug's Life.
1999-2: Toy Story 2, Tarzan.
2000-2009: 17 films (1.7 a year).
2000-0.
2001-2: Shrek, Monsters Inc.
2002-1: Ice Age.
2003-1: Finding Nemo.
2004-3: Shrek 2, The Incredibles, Polar Express.
2005-1: Madagascar.
2006-3: Cars, Happy Feet, Ice Age 2.
2007-1: Shrek 3.
2008-4: Wall-E, Kung Fu Panda, Madagascar 2, Dr. Seuss' Horton Hears a Who.
2009-1: Up.
2010-2016: 19 films (2.7 a year).
2010-5: Toy Story 3, Despicable Me, Shrek 4, How to Train Your Dragon, Tangled.
2011-1: Cars 2.
2012-2: Brave, Madagascar 3.
2013-3: Frozen, Despicable Me 2, Monsters University.
2014-2: The LEGO Movie, Big Hero 6.
2015-2: Inside Out, Minions.
2016-4: Finding Dory, The Secret Life of Pets, Zootopia, Sing.
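Since I invite readers to check the math, here is a minimal sketch in Python of the count-and-divide involved (the per-year counts are simply those listed above, with unlisted years counted as zero):

# Recompute the per-period averages for animated films in the top ten.
counts = {
    "1990-1999": [0, 1, 1, 0, 1, 2, 0, 0, 1, 2],  # 1990 through 1999
    "2000-2009": [0, 2, 1, 1, 3, 1, 3, 1, 4, 1],  # 2000 through 2009
    "2010-2016": [5, 1, 2, 3, 2, 2, 4],           # 2010 through 2016
}
for period, per_year in counts.items():
    total = sum(per_year)
    print(period, total, round(total / len(per_year), 1))
# Expected output: 8 films (0.8 a year), 17 (1.7 a year), 19 (2.7 a year).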
Wednesday, July 5, 2017
The Action Film Becomes King of the Box Office: Actual Numbers
It seems generally understood that the action movie took off in the '80s, and, while going through some changes (becoming less often R-rated and more often PG-13-rated, setting aside the cops and commandos to focus on fantasy and science fiction themes), captured a bigger and bigger share of the box office until it reached its predominant position today.
Still, this is something people seem to take as a given without generally checking it for themselves, and so I decided to do just that, looking at the ten highest-grossing films of every year from 1980 to the present, as listed by that ever-handy database, Box Office Mojo.
Of course, this was a trickier endeavor than it might sound. I had to decide, after all, just what to count as an action film. In my particular count I strove to leave aside action-comedies where the accent is on comedy (Stakeout, Kindergarten Cop), sports dramas (the Rocky series), suspense thrillers (Witness, Silence of the Lambs, Basic Instinct), war films (Saving Private Ryan, Pearl Harbor), historical epics (Titanic) and fantasy and science fiction movies (Harry Potter) that may contain action but strike me as not really being put together as action films. I also left out animated films of all types, even when they might have qualified (The Incredibles), to focus solely on live-action features. Some "subjectivity" was unavoidably involved in determining whether or not films made the cut even so. (I included Mr. and Mrs. Smith in the count for 2005. Some, in my place, probably would not.) Still, I generally let myself be guided by Box Office Mojo when in doubt, and I suspect that, especially toward the latter part of the period, I undercounted rather than overcounted, and in the process understated just how much the market had shifted in the direction of action films (which seemed to me, frankly, the safer side to err on).
Ultimately I concluded that 2.2 such films a year were the average in the '80s (1980-1989), 3.6 in the '90s (1990-1999), 4.8 in the '00s (2000-2009), and 5.4 for the current decade so far (2010-2016). Moreover, discarding the unusual year of 2010 (just 2 action movies), a half dozen action films in the top ten have been the norm for the last period (2011-2016). This amounts not just to a clear majority of the most-seen movies, but close to triple the proportion seen in the '80s.
Moreover, the upward trend was steady through the whole period. It does not seem insignificant that, while 2 a year were the average for the '80s, 1989 was the first year in which the three top spots were all claimed by action movies (Batman, Indiana Jones and Lethal Weapon at #1, #2 and #3 respectively). Nor that no year after 1988 had fewer than two such movies in the top ten, and only 3 of the 28 years that followed (1991, 1992 and 2010) had under three. Indeed, during the 2000-2009 period, not one year had fewer than four movies of the type in its top ten, and after 2010, not one year had under six.
Especially considering that I probably undercounted rather than overcounted, there is no denying the magnitude of the shift by this measure. And while at the moment I have not taken an equally comprehensive approach to the top 20 (which would provide an even fuller picture), my impression is that it would reinforce this assessment rather than undermine it. If we look at the top 20 for 1980, we still see just one action movie--Empire Strikes Back. But if we do the same for 2016 we add four more films--Doctor Strange, Jason Bourne, Star Trek Beyond and X-Men: Apocalypse--to the six in the top ten (even while leaving out such films as Fantastic Beasts and Where to Find Them, and Kung Fu Panda 3).
Anyone who cares to can check my counting below and see if they come up with anything much different.
The Data
1980-1989: 22 films (2.2 a year).
1980-1: Empire Strikes Back.
1981-2: Raiders of the Lost Ark and Superman II.
1982-2: Star Trek 2 and 48 HRS.
1983-3: Return of the Jedi, Octopussy, Sudden Impact.
1984-3: Beverly Hills Cop, Indiana Jones, Star Trek 3.
1985-2: Rambo 2, Jewel of the Nile.
1986-2: Top Gun, Aliens.
1987-3: Beverly Hills Cop 2, Lethal Weapon, The Untouchables.
1988-1: Die Hard.
1989-3: Batman, Indiana Jones, Lethal Weapon 2 (all in the top 3 slots, the first time this ever happened).
1990-1999: 36 films (3.6 a year).
1990-4: Teenage Mutant Ninja Turtles, The Hunt for Red October, Total Recall, Die Hard 2.
1991-2: Terminator 2, Robin Hood: Prince of Thieves.
1992-2: Batman 2, Lethal Weapon 3.
1993-4: Jurassic Park, The Fugitive, In the Line of Fire, Cliffhanger.
1994-3: True Lies, Clear and Present Danger, Speed.
1995-5: Batman 3, Goldeneye, Die Hard 3, Apollo 13, Jumanji.
1996-4: Independence Day, Twister, Mission: Impossible, The Rock.
1997-5: Men in Black, The Lost World, Air Force One, Star Wars (Episode IV reissue), Tomorrow Never Dies.
1998-4: Armageddon, Rush Hour, Deep Impact, Godzilla.
1999-3: Star Wars: Episode I, The Matrix, The Mummy.
2000-2009: 48 films (4.8 a year).
2000-4: Mission: Impossible 2, Gladiator, A Perfect Storm, X-Men.
2001-5: The Lord of the Rings (LOTR) 1, Rush Hour 2, Mummy, Jurassic 3, The Planet of the Apes.
2002-4: Spiderman, Star Wars: Episode II, LOTR 2, Men In Black 2.
2003-6: LOTR 3, The Pirates of the Caribbean, The Matrix 2, X-Men 2, Terminator 3, The Matrix 3.
2004-4: Spiderman 2, The Day After Tomorrow, The Bourne Supremacy, National Treasure.
2005-5: Star Wars: Episode III, The War of the Worlds, King Kong, Batman Begins, Mr. and Mrs. Smith.
2006-4: Pirates of the Caribbean 2, X-Men: The Last Stand, Superman Returns, Casino Royale.
2007-7: Spiderman 3, The Transformers, Pirates of the Caribbean 3, National Treasure 2, The Bourne Ultimatum, I Am Legend, 300.
2008-5: The Dark Knight, Indiana Jones and the Kingdom of the Crystal Skull, Iron Man, Hancock, Quantum of Solace.
2009-4: Avatar, Transformers 2, Star Trek, Sherlock Holmes.
2010-2016: 38 films (5.4 a year).
2010-2: Iron Man 2, Inception.
2011-6: Transformers 3, Pirates 4, Fast Five, MI 4, Sherlock 2, Thor.
2012-6: The Avengers, The Dark Knight Rises, Hunger Games, Skyfall, Hobbit 1, Spiderman.
2013-6: Hunger Games 2, Iron Man 3, Man of Steel, Hobbit 2, Fast 6, Oz the Great and Powerful.
2014-6: Hunger Games 3, Guardians of the Galaxy, Captain America 2, Hobbit 3, Transformers 4, X-Men: Days of Future Past.
2015-6: Star Wars, Jurassic, Avengers 2, Furious 7, Hunger Games 4, Spectre.
2016-6: Rogue One, Captain America III, Jungle Book, Deadpool, Batman vs. Superman, Suicide Squad.
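The same count-and-divide check works for the action-film tallies; a minimal sketch in Python, using the per-year counts from the list above:

# Recompute the decade averages for the action-film counts above.
action_counts = {
    "1980-1989": [1, 2, 2, 3, 3, 2, 2, 3, 1, 3],
    "1990-1999": [4, 2, 2, 4, 3, 5, 4, 5, 4, 3],
    "2000-2009": [4, 5, 4, 6, 4, 5, 4, 7, 5, 4],
    "2010-2016": [2, 6, 6, 6, 6, 6, 6],
}
for period, per_year in action_counts.items():
    total = sum(per_year)
    print(period, total, round(total / len(per_year), 1))
# Expected output: 22 films (2.2 a year), 36 (3.6), 48 (4.8), 38 (5.4).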
Still, this is something people seem to take as a given without generally checking it for themselves, and so I decided to do just that, looking at the ten highest-grossing films of every year from 1980 to the present, as listed by that ever-handy database, Box Office Mojo.
Of course, this was a trickier endeavor than it might sound. I had to decide, after all, just what to count as an action film. In my particular count I strove to leave aside action-comedies where the accent is on comedy (Stakeout, Kindergarten Cop), sports dramas (the Rocky series), and suspense thrillers (Witness, Silence of the Lambs, Basic Instinct), war films (Saving Private Ryan, Pearl Harbor), historical epics (Titanic) and fantasy and science fiction movies (Harry Potter) that may contain action but strike me as not really being put together as action films. I also left out animated films of all types even when they might have qualified (The Incredibles) to focus solely on live-action features. Some "subjectivity" was unavoidably involved in determining whether or not films made the cut even so. (I included Mr. and Mrs. Smith in the count for 2005. Some, in my place, probably would not.) Still, I generally let myself be guided by Box Office Mojo when in doubt, and I suspect that, especially toward the latter part of the period, I undercounted rather than overcounted, and in the process understated just how much the market had shifted in the direction of action films (and to be frank, thought that the safer side to err on).
Ultimately I concluded that 2.2 such films a year was the average in the '80s (1980-1989), 3.6 in the '90s (1990-1999), 4.8 in the '00s (2000-2009), and 5.4 for the current decade so far (2010-2016). Moreover, discarding the unusual year of 2010 (just 2 action movies), a half dozen action films in the top ten has been the norm for the last period (2011-2016). This amounts not just to a clear majority of the most-seen movies, but close to triple the proportion seen in the '80s.
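For anyone who cares to re-run the arithmetic, here is a minimal sketch in Python, using my own per-year tallies (the ones itemized under "The Data" below) and the same period groupings as above:

```python
# Per-year counts of action films in each year's box office top ten
# (my own tallies, itemized under "The Data" below).
counts = {
    1980: 1, 1981: 2, 1982: 2, 1983: 3, 1984: 3,
    1985: 2, 1986: 2, 1987: 3, 1988: 1, 1989: 3,
    1990: 4, 1991: 2, 1992: 2, 1993: 4, 1994: 3,
    1995: 5, 1996: 4, 1997: 5, 1998: 4, 1999: 3,
    2000: 4, 2001: 5, 2002: 4, 2003: 6, 2004: 4,
    2005: 5, 2006: 4, 2007: 7, 2008: 5, 2009: 4,
    2010: 2, 2011: 6, 2012: 6, 2013: 6, 2014: 6,
    2015: 6, 2016: 6,
}

# Group the tallies by period and print totals and per-year averages.
for start, end in [(1980, 1989), (1990, 1999), (2000, 2009), (2010, 2016)]:
    years = range(start, end + 1)
    total = sum(counts[y] for y in years)
    print(f"{start}-{end}: {total} films ({total / len(years):.1f} a year)")
```

Run as written, it reproduces the figures given above: 22 films (2.2 a year), 36 (3.6), 48 (4.8) and 38 (5.4).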
Moreover, the upward trend was steady through the whole period. It does not seem insignificant that, while two a year was the average for the '80s, 1989 was the first year in which the three top spots were all claimed by action movies (Batman, Indiana Jones and the Last Crusade and Lethal Weapon 2, at #1, #2 and #3 respectively). Nor that no year after 1988 had fewer than two such movies in the top ten, and only 3 of the 28 years that followed (1991, 1992 and 2010) had under three. Indeed, during the 2000-2009 period, not one year had fewer than four movies of the type in its top ten, and after 2010, not one year had fewer than six.
Especially considering that I probably undercounted rather than overcounted, there is no denying the magnitude of the shift by this measure. And while I have not yet taken an equally comprehensive approach to the top 20 (which would provide an even fuller picture), my impression is that it would reinforce this assessment rather than undermine it. If we look at the top 20 for 1980, we still see just one action movie--The Empire Strikes Back. But doing the same for 2016 adds four more films to the six already in the top ten--Doctor Strange, Jason Bourne, Star Trek Beyond and X-Men: Apocalypse (even while leaving out such films as Fantastic Beasts and Where to Find Them and Kung Fu Panda 3).
Anyone who cares to can check my counting below and see if they come up with anything much different.
The Data
1980-1989: 22 films (2.2 a year).
1980-1: The Empire Strikes Back.
1981-2: Raiders of the Lost Ark and Superman II.
1982-2: Star Trek 2 and 48 HRS.
1983-3: Return of the Jedi, Octopussy, Sudden Impact.
1984-3: Beverly Hills Cop, Indiana Jones, Star Trek 3.
1985-2: Rambo 2, Jewel of the Nile.
1986-2: Top Gun, Aliens.
1987-3: Beverly Hills Cop 2, Lethal Weapon, The Untouchables.
1988-1: Die Hard.
1989-3: Batman, Indiana Jones, Lethal Weapon 2 (all in the top 3 slots, the first time this ever happened).
1990-1999: 36 films (3.6 a year).
1990-4: Teenage Mutant Ninja Turtles, The Hunt for Red October, Total Recall, Die Hard 2.
1991-2: Terminator 2, Robin Hood: Prince of Thieves.
1992-2: Batman 2, Lethal Weapon 3.
1993-4: Jurassic Park, The Fugitive, In the Line of Fire, Cliffhanger.
1994-3: True Lies, Clear and Present Danger, Speed.
1995-5: Batman 3, Goldeneye, Die Hard 3, Apollo 13, Jumanji.
1996-4: Independence Day, Twister, Mission: Impossible, The Rock.
1997-5: Men in Black, The Lost World, Air Force One, Star Wars (Episode IV reissue), Tomorrow Never Dies.
1998-4: Armageddon, Rush Hour, Deep Impact, Godzilla.
1999-3: Star Wars: Episode I, The Matrix, The Mummy.
2000-2009: 48 films (4.8 a year).
2000-4: Mission: Impossible 2, Gladiator, The Perfect Storm, X-Men.
2001-5: The Lord of the Rings (LOTR) 1, Rush Hour 2, The Mummy Returns, Jurassic 3, The Planet of the Apes.
2002-4: Spiderman, Star Wars: Episode II, LOTR 2, Men In Black 2.
2003-6: LOTR 3, The Pirates of the Caribbean, The Matrix 2, X-Men 2, Terminator 3, The Matrix 3.
2004-4: Spiderman 2, The Day After Tomorrow, The Bourne Supremacy, National Treasure.
2005-5: Star Wars: Episode III, The War of the Worlds, King Kong, Batman Begins, Mr. and Mrs. Smith.
2006-4: Pirates of the Caribbean 2, X-Men: The Last Stand, Superman Returns, Casino Royale.
2007-7: Spiderman 3, The Transformers, Pirates of the Caribbean 3, National Treasure 2, The Bourne Ultimatum, I Am Legend, 300.
2008-5: The Dark Knight, Indiana Jones and the Kingdom of the Crystal Skull, Iron Man, Hancock, Quantum of Solace.
2009-4: Avatar, Transformers 2, Star Trek, Sherlock Holmes.
2010-2016: 38 films (5.4 a year).
2010-2: Iron Man 2, Inception.
2011-6: Transformers 3, Pirates 4, Fast Five, MI 4, Sherlock 2, Thor.
2012-6: The Avengers, The Dark Knight Rises, Hunger Games, Skyfall, Hobbit 1, Spiderman.
2013-6: Hunger Games 2, Iron Man 3, Man of Steel, Hobbit 2, Fast 6, Oz the Great and Powerful.
2014-6: Hunger Games 3, Guardians of the Galaxy, Captain America 2, Hobbit 3, Transformers 4, X-Men: Days of Future Past.
2015-6: Star Wars: The Force Awakens, Jurassic World, Avengers 2, Furious 7, Hunger Games 4, Spectre.
2016-6: Rogue One, Captain America 3, Jungle Book, Deadpool, Batman v Superman, Suicide Squad.
Friday, June 16, 2017
The Superhero Film Gets a Makeover
As regular readers of this blog (all two of you, unless I'm miscounting by two) know, I have been watching the superhero movie bubble for years expecting it to pop. Of course it hasn't, so far--this, seventeen summers after Bryan Singer's X-Men (2000) (which in turn came a decade after 1989's Batman became the biggest hit of the year, and its franchise the most successful of the 1989-1995 period). And the conventional wisdom seems to be that this longest-running of action movie fashions can go on indefinitely.
I'm less sure of what to think than before. The studios have slates packed with superhero films through 2020, and by now probably beyond it as well, and at this moment I wouldn't care to bet on their shelving those plans because of an untoward change in the market--the less so given three approaches that are proving commercially viable.
R-Rated Superhero Movies
Of course, there have been plenty of R-rated superhero movies before--like the Blade movies, Wanted and Watchmen. The difference was that they tended not to be made about first-string characters, or given first-rate budgets--a restraint justified by their small prospect of first-rate grosses.
So it also went with Deadpool, whose hero was not a first-string character (Deadpool reminds us of this himself, rather crudely, early in the film), and which didn't get a really big budget (Deadpool gabs endlessly about this too). Rather than original or fresh or subversive (it had nothing on Watchmen here, or even the 2015 reboot of Fantastic Four), it struck me as a mediocre second-stringer, notable only for how heavily it relied on tired independent film-type shtick. Still, the film exploded at the box office ($363 million in North America, nearly $800 million worldwide--a feat the more impressive for its not having played in the ever-more important Chinese market), and it has already had a sort of follow-up in a genuine first-stringer--Wolverine--getting an R-rated film, Logan, earlier this year. (The budget, at under $100 million, was again limited next to the full-blown X-Men movies, but the character and the budget still represented a bigger investment than Deadpool did, and with over $600 million banked, the film seems likely to encourage even bolder moves of this kind later.)
What's going on here?
One possibility is a change in the makeup of the market. It seems that R-rated films (and not just raunchy comedies), while far from their '80s and even '90s market share, have done better recently than at any time this century, and superhero films have reflected the turn. (In 2014, American Sniper topped the box office and Gone Girl also did well, while 2015 saw The Revenant and Fifty Shades of Grey become top 20 hits at the American box office, and Mad Max: Fury Road claim the #21 spot.) I might add that, reading the Box Office Guru's biweekly reports, I'm struck by how much the audience for recent films has been dominated by the over-25 crowd--younger people perhaps going to the movies less often. (Living their lives online, while having less access to cars and less money in general, they don't, I suspect, go to the theater as much as they used to.) This might make an R rating less prohibitive than it used to be for an action film producer, perhaps especially in this case. By this point anyone who is twenty-five probably has little memory of a time when the action genre was not dominated by superheroes. They grew up on superheroes and are used to superheroes, so R-rated films about people with unlikely powers dressed in colorful spandex seem less silly to them--not such a tough sell as they would have been once upon a time.
Woman-Centered Superhero Movies
Just as we have had plenty of R-rated superhero movies in the past, we have had plenty of superhero films with female protagonists. The difference was that, as I recently wrote here, while there were movies with female superhero protagonists (Elektra, for example), and first-string superhero movies prominently including female superheroes in their casts (Black Widow in the Avengers), the flops of the early 2000s left Hollywood discouraged about really first-string superhero movies centered on female protagonists. The grosses of YA dystopia films like The Hunger Games, and of Lucy and Mad Max: Fury Road, have made the studios readier to go down this road, however--and the success of Wonder Woman is reinforcing this. I suspect that at the very least the trend will endure for a while, if not prove here to stay.
Mega-Franchises
Where the decision to make big-budget films centered on female superheroes is a case of the superhero movie following trends set by others, in this case the superheroes have led the way. Marvel Studios gambled big and won big with its expansive Marvel Cinematic Universe, which regularized Avengers-style grosses to such a degree that even an Iron Man or Captain America sequel could deliver them. Inspired by this course, Warner Brothers made a comparable gamble (but has not yet won big) on a Justice League franchise. Since then Disney (Marvel's current owner) has given Star Wars the same treatment (and so far, won big again), while Universal, not to be left behind, has decided to do the same with a film franchise based on its classic movie monsters. (The first effort, the recent reboot of The Mummy, has been a commercial disappointment, but as Warner Brothers demonstrated in plowing ahead with its Justice League despite the reservations about Man of Steel and Batman v Superman, the project is too big to be shut down by a single setback.)
A credulous postmodernist might gush at the possibilities for intertextuality. But the reality is that all this is of principally commercial rather than artistic significance--as is the case with the other two trends discussed here. Indeed, by upping the commercial pressure (studios now want not a series delivering a solid hit every two or three years on the strength of its built-in audience, but a hit machine delivering record-breaking blockbusters once or twice a year), the greater synchronization and higher financial stakes would seem likely to tie artists' hands to a degree that those who still lament the decline of New Hollywood can scarcely fathom as yet. Still, superficial as most of this is (we are not talking about a reinvention of superhero films here), the success of Deadpool says something about how little it might take to keep the boom going longer than the decades it has already managed.
Thoughts on the Wonder Woman Movie Actually Happening
As is well known by now to anyone who pays much attention to films of the type, DC got the Wonder Woman film made, got it out this summer, and it has already pulled in enough money to be safely confirmed a commercial success. (Its $460 million is just over half the $873 million that the "disappointing" Batman v Superman made back in early 2016, and the final tally will probably fall well short of that figure--the Box Office Guru figures something on the order of $750 million. But Wonder Woman was not nearly so big an investment, making its gross a very healthy return.)
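To put rough numbers on that "healthy return," here is a minimal sketch in Python. The production budgets are assumptions on my part (the widely reported estimates of roughly $149 million for Wonder Woman and $250 million for Batman v Superman), and marketing costs and the exhibitors' share of the gross are left out for simplicity:

```python
# Rough gross-to-budget multiples for the two films discussed above.
# Budgets are assumed figures (widely reported estimates); marketing
# costs and the exhibitors' cut of the gross are ignored for simplicity.
films = {
    "Wonder Woman (projected final gross)": (750_000_000, 149_000_000),
    "Batman v Superman":                    (873_000_000, 250_000_000),
}

for title, (gross, budget) in films.items():
    print(f"{title}: {gross / budget:.1f}x its production budget")
```

On these admittedly crude assumptions, Wonder Woman returns about five times its budget against Batman v Superman's three and a half--which is the sense in which the smaller gross can still be the healthier performance.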
In the process DC has realized what I described a few years ago as a long shot.
Of course, quite a lot has happened since--some of it, what seemed to me prerequisites for a Wonder Woman film. Warner Brothers firmly committed itself to a Justice League mega-franchise comparable to the Marvel Cinematic Universe, and used the "backdoor" Justice League movie Batman v Superman: Dawn of Justice to introduce new characters, Wonder Woman included.
There has, too, been something of a resurgence in big-budget action movies with female protagonists. Again, the female action hero didn't go away in the preceding years. There were still plenty of really high-profile, big-budget movies featuring action heroines--if as part of an ensemble, like the Black Widow (featured in five movies to date). There were still plenty of second-string action movies with female leads getting by on lower budgets and lower grosses--like the films of the Underworld and Resident Evil franchises (which have continued up to the present).
What there wasn't much of between the early 2000s and the middle of this decade were movies combining a first-string production with a female lead--the underperformance of the Charlie's Angels and Tomb Raider sequels in the summer of 2003, Catwoman winding up a flop in 2004, and Aeon Flux proving another disappointment in 2005 put the studios off. Then such films began to crop up again: The Hunger Games exploded (2012) and was followed up by its mega-budgeted sequels (2013, 2014, 2015), with the initially cautious but later bigger-budgeted Divergent films (2014, 2015, 2016), the $150 million Mad Max: Fury Road (2015) and the low-budget but high-grossing Lucy (2014) reinforcing the trend. Still, even if all this makes clear what an exaggeration the claims for Wonder Woman as something unprecedented are (claims made by the studios and the sycophantic entertainment press alike), the film has some claim to looking like a watershed moment.
That said, the question would seem to be whether such films will now be a commonplace of the cinematic landscape, or prone to the kind of boom and bust seen in the early 2000s. What do you think?
Saturday, June 3, 2017
On the Historiography of Science Fiction: Info-Dumping and Incluing
When it comes to "info-dumping" and "incluing," a considerable current of thought about science fiction hews to the standards of mainstream literature, down to the misconceptions and prejudices summed up in the truism "Show, don't tell." Simply put, info-dumps (telling) are regarded as bad, incluing (showing) as good; and the fact that we are more likely to do the latter than before is taken as a case of "Now We Know Better," proof that we are better than our more enthusiastically info-dumping forebears. Additionally, John Campbell and his colleagues (like Robert Heinlein), while getting less respect from the more "literary" than they otherwise might, are still credited, at times lavishly, with "showing us the light" on this point.
However, Campbell was merely a dedicated and influential promoter of incluing, not its inventor. Others had done it before he and his writers came onto the scene. E.M. Forster did so in his short story "The Machine Stops" (1909), and, perhaps inspired by that example, so did one of the genre's founding fathers in one of its foundational works--Hugo Gernsback in Ralph 124C 41+ (1911), at least in his account of Ralph's use of the Telephot, which seems to echo the opening of Forster's own story. Indeed, the first chapter of Gernsback's oft-mentioned but rarely read classic uses the technique so heavily--and, to my surprise, so artfully--that it can be treated as a model for the practice, presenting a barrage of obliquely treated technological novelties in the first meeting between the titular hero and his love interest, Alice, in a passage that manages to be packed with action and novelty while remaining lucidly conceived and smoothly written.
Yet Gernsback shifted to info-dumping for the rest of the text. One reason, obviously, might be that info-dumping--telling--is simply the easier mode from a technical standpoint, setting aside the iron cage of what we call "style," and thus saving an author from having to spend five days on the same page searching for "le mot juste." However, this does not seem to have been the sole concern. There were particular reasons to go in for incluing in that first chapter--above all, to keep cumbersome explanations from getting in the way of the action. These concerns were less operative later in the text, especially as a big part of the book's interest was that mainstay of science fiction from at least Francis Bacon's The New Atlantis on, the Tour of the World of Tomorrow--on which it is appropriate to have a Tour Guide along.
Indeed, Gernsback was a strong believer in the value of a good info-dump; as an editor at Amazing Stories he strongly encouraged his writers to produce them. Simply put, there are things that cannot be shown intelligibly or concisely, only told, but which are worth conveying anyway--things worth info-dumping when one cannot inclue them--and while one may take issue with the use made of them in some of the fiction he edited, the principle is a valid one, in both science fiction and fiction generally (as H.G. Wells, appropriately, noticed and argued eloquently). Naturally, just as the author who always shows and never tells is a lot less common than the truism would have it (even Flaubert had to tell us some things straight out), so is it very rare to find a really substantial work of science fiction totally bereft of info-dumps.
Tell, Don't Show--Again
In his book How Fiction Works, James Wood early on sings the praises of Gustave Flaubert as the founder of "modern realist narration"--what is often glibly summed up as "show, don't tell" done right. This, as Wood remarks,
favors the telling and brilliant detail . . . privileges a high degree of visual noticing . . . maintains an unsentimental composure and knows how to withdraw, like a good valet, from superfluous commentary . . . and that the author's fingerprints on all this are, paradoxically, traceable but not visible.

Indeed, Flaubert was a genuine and highly influential master of the technique, frequently managing to convey quite difficult content with ease and precision in many a scene in classics like Madame Bovary. However, it is worth remarking that even he used a good deal of telling--as you find if you actually pick up the book. To fill in his picture, he did not rely only on the "visual noticing" of such things as the characters' facial expressions and casual remarks, but time and again delved into his characters' heads and pasts, generalized and grew abstract, in relating such things as Emma Bovary's school days, and the romantic side of her that developed but was never to find fulfillment in the workaday world into which she was born and in which she had her life.
"This nature, positive in the midst of its enthusiasms, that had loved the church for the sake of the flowers, and music for the words of the songs, and literature for its passional stimulus, rebelled against the mysteries of faith as it grew irritated by discipline, a thing antipathetic to her constitution . . ."1It is elegant telling, but telling all the same.
1. I cite here the Eleanor Marx-Aveling translation.