How College Became a Commodity
Market-based thinking is at the heart of how academe thinks of itself. That’s a travesty.
This past summer, Alaska’s Republican governor, Mike Dunleavy, announced a draconian plan to slash appropriations for the university system by 41 percent. Defending the decision, he repeated a phrase that increasingly accompanies budget cuts: that the university couldn’t continue being “all things for all people.” Dunleavy, who insisted that the state’s deficit be closed without raising taxes, argued that Alaska must “turn the university into a smaller, leaner, but still very positive, productive university in the Northern Hemisphere.”
Pete Buttigieg has made a similar notion the center of his opposition to universal free college in the 2020 Democratic primary. “Americans who have a college degree earn more than Americans who don’t,” Buttigieg said. “As a progressive, I have a hard time getting my head around the idea of a majority who earn less because they didn’t go to college subsidizing a minority who earn more because they did.” Buttigieg has continued to hammer the point that universality equals upward redistribution. Lis Smith, a senior adviser for his campaign, tweeted, “If you think that a worker who didn’t go to college should pay for college for a CEO’s kid, then @PeteButtigieg isn’t your candidate.”
These statements capture a bipartisan sea change in the way Americans think about higher education. Universities can’t be “all things to all people,” hence they should focus on what politicians determine to be their most “productive” activities. Governments not only cannot but should not provide higher education to everyone: People who can afford to invest in their own future should pay for themselves, and only those who really need it should receive help. We shouldn’t force “poor” Americans to pay for “rich” college students — even though broader-based funding of public higher education overwhelmingly and disproportionately helps the poor.
This line of argument has been dominant for decades, but it is not how politicians — especially progressive ones — always thought about higher education. The story of how the language of scarcity and individual investment became bipartisan orthodoxy begins with the marginal ideas of neoliberal economists in the years after World War II. Those ideas received a shot of polemical adrenaline and political influence from the student-protest movement of the late 1960s.
The economic literature on education would undoubtedly cause many professors to recoil.
The campus upheavals of the 1960s brought a wave of responses from the professoriate, but one in particular stood out. Written by two economists, James M. Buchanan and Nicos E. Devletoglou, Academia in Anarchy (Basic Books, 1970) opened with a law-and-order quote from Richard Nixon and was dedicated to “the taxpayer.” The authors explained that they wrote with “indignation” after observing the bombing of the UCLA economics department, where Buchanan taught, and the “groveling of the UCLA administrative authorities” to a “handful of revolutionary terrorists.”
Buchanan and Devletoglou suggested an overhaul of higher education aimed at bringing the student movement to heel. At the time, California had proposed a master plan of universal free higher education across its system. But the authors of Academia in Anarchy argued that the proposal suffered from a lack of basic economics — meaning not simply economic calculation, but Buchanan’s conception of economics as an all-encompassing moral and behavioral philosophy. “Almost alone among social scientists,” they wrote, “the economist brings with him a model of human behavior which allows predictions about human action.”
Their recommendations cut sharply against the spirit of the times. To Buchanan and Devletoglou, the students’ bad behavior was the grim result of the overabundance of education. Treating education as a “free good” meant that those who received it had no incentive to value it, and thus spent their years at university behaving as “man-children” playing a “psychedelic game.” Buchanan and Devletoglou recommended a student-loan system. “The scarcity value of a university education would at least be brought home to the student,” they wrote. Students would be forced to take a harder look at what they studied and how, avoid protest, and develop an appreciation for “property rights” as paying consumers of education.
In a twist that would become characteristic of later libertarian arguments, with softer echoes among technocratic liberals like Buttigieg, economically disciplining students was a matter of social justice for society at large. Free tuition that was intended to provide a path of mobility for less-fortunate citizens, especially racial minorities who suffered from centuries of accumulated exclusion, was actually a “gift to the gifted” — “a transfer of wealth from the poor to the rich” — with the “poor taxpayer” cleverly but dubiously painted by Buchanan and Devletoglou as working-class citizens excluded from higher education.
In retrospect, their predictions seem prescient: “As the chaos mounts, however, and as the reformers’ feeble but sometimes dangerous responses to revolutionary pressure continue, as taxpayers rise in anger, perhaps the time will come when the economist’s pedestrian explanations will command respect.”
Within the decade, wealthier white Americans were revolting against taxes used to fund integrated public schools and fleeing to the suburbs, where they received racially preferential home loans. Anti-tax clauses were written into many state constitutions, hamstringing the public funding of higher education. Economists in the emerging world of neoclassical economics, along with a new constellation of conservative think tanks, were advocating school-voucher systems and theories of “human capital” that reframed education as a consumer choice, an investment individuals should have to make in their future earnings. The federal government retreated from universalizing higher education, and the student-loan system Buchanan and Devletoglou favored eventually became a preferred tool of even liberal policy makers.
The neoliberal economists won: education became a commodity, and a large swath of the reshaping of higher education that economists like Buchanan and Devletoglou championed took place. But to their intellectual descendants, the economic disciplining of higher ed has not gone far enough: Federal and state governments still “waste” precious taxpayer dollars on a dysfunctional and possibly even pointless enterprise structured around “bad incentives,” clinging to economically “worthless” subjects, and defined by the charade-like pretense of learning.
If such claims sound extreme, they are far closer to the mainstream of education-policy thinking today than is often acknowledged. Two recent books by scholars closely associated with libertarian economics all but rip off the mask. In The Case Against Education (Princeton University Press, 2018), Bryan Caplan, an economist at George Mason, argues that the American higher-education system is a massive misuse of federal funds. In Cracks in the Ivory Tower (Oxford University Press, 2019), Jason Brennan, a philosopher at Georgetown, and Phillip Magness, an economic historian at George Mason, apply Buchanan’s theory of incentives to various aspects of the modern research university. That these authors often write as if they were inhabitants of the 1960s, facing down a redistributionary education system stubbornly indifferent to basic economics, ironically reveals the extent to which their worldview has triumphed — and how far they still want to go.
Caplan’s book intervenes in a theoretical debate between economists over why education has economic benefits. Human-capital theory sees these benefits as the result of an individual’s investment in themselves. They become bearers of immaterial “capital,” equipped with ideas, skills, and productivity.
“Signaling,” by contrast, argues that education does not produce this kind of capital, but merely certifies it for employers: a relevant degree “signals” that an individual has the required abilities and skills to do the job. Few economists fully believe the “pure” version of either theory, and the debate tends to focus on calculating the percentage that can be attributed to each one. Caplan offers a “cautious” estimate that signaling explains 50 percent of the return on education, and a “reasonable” estimate that it is 80 percent or even higher.
The starting point of Caplan’s case for signaling is that the overwhelming majority of what students learn, from elementary school to college, is “otherworldly” academic knowledge that is “irrelevant to the modern labor market.” While plenty of economists agree that the connections between more humanistic subjects and job skills are indirect, many also think that such connections are unobservable and thus difficult to measure, especially because they are so highly varied across types of employment.
If education is about content, why doesn’t everyone sign up for Harvard MOOCs?
Caplan unconvincingly solves the problem with anecdotal evidence, and with an idiosyncratic classification of the “usefulness” of various academic disciplines. Technical fields — engineering, law, and medicine — rank as “highly” useful, while the humanities and social sciences receive the designation of “low” usefulness. Even the subjects Caplan ranks as highly useful are suspect to his mind, since they teach academic knowledge rather than directly relevant job skills, making them “less practical than they sound.”
Caplan is witheringly dismissive of the obvious rejoinder that education is about broader and more amorphous things like “critical thinking” or “learning how to think.” He turns to research in educational psychology that supposedly shows weak evidence of learning “transfers,” suggesting that students are generally bad at applying knowledge outside of the narrow bounds in which they practice it. The literature Caplan presents tests students’ abilities to solve puzzles across different domains or to apply formal statistical education to everyday problems.
It is immediately apparent, however, that such experiments are unable to test even a fraction of the types of knowledge and practices that students are taught in college, and even less to demonstrate falsifiably that such knowledge has no usefulness to their future employment. Caplan responds to such limitations with rhetoric unlikely to convince those not already determined to agree with him: “‘No one knows if this trash will come in handy’ is a terrible argument for hoarding trash.”
The economic literature on education, almost entirely focused on job skills and financial returns, would undoubtedly cause many professors to recoil. It ignores the things most teachers value, namely the personal, cultural, and civic role of education. But economic arguments can be put to a wide variety of political ends, and these humanist and civic concerns are shared even by economists, most of whom see the education system as serviceable, if imperfect. (Even some well-known libertarian economists, like Caplan’s friend Tyler Cowen, disagree with him on aspects of education.) For them, education works regardless of the precise breakdown between human capital or signaling, and its rationality can be assumed from the fact that no other way of performing these basic functions of the labor market has successfully challenged it.
Caplan diverges from this view mostly because he dislikes education as “compulsory enlightenment” and is willing to entertain a host of far-fetched alternatives. If education is about content rather than the signals sent by degrees, why doesn’t everyone audit classes for free? Why doesn’t everyone sign up for Harvard MOOCs? Why can’t we just put teenagers to work and teach them job skills that way? (An entire section of the book asks, “What’s Wrong With Child Labor?”)
Caplan pretends to be sympathetic to anti-economistic arguments about education’s value as an introduction to civic and cultural knowledge, but hastily concludes that education fails at those things just as thoroughly as it fails at job training. For colleges to plausibly claim that they enrich students’ personal lives, Caplan argues, requires “worthy content,” “skillful pedagogy,” and “eager students.” All three are lacking, he declares by fiat: “Most teachers are boring,” he grandly opines. “The students are worse.” It goes without saying that such unsupported claims could be countered with equally persuasive anecdotes.
Having reduced the purview of education to a narrow conception of job skills, Caplan argues for the total defunding of higher education and perhaps even secondary education. “I have a strong moral presumption against taxpayer support for anything. Why? Because I have a strong moral presumption in favor of leaving others alone — and consider taxation to be a prime example of failing to leave others alone.”
Signaling serves less as an economic theory than as a tool of Caplan’s opposition to state support for higher education. He fully acknowledges, even delights in, the undisputed economic return to education, because by defining it as largely signaling, he is able to argue that education’s benefits accrue entirely to the individual, not to society — “private profit, social waste.” If education merely signals to employers rather than building actually productive skills, then the vast sums that federal and state governments spend subsidizing education represent, in his view, a redistribution from taxpayers to students building their future riches.
This is a reformulation of Buchanan’s old, disingenuous argument that students are a rent-seeking special interest group building wealth on the backs of the “poor taxpayer.” Caplan echoes Buchanan on what he calls “the hidden wonder of high tuition and student debt,” but argues that the current student-loan system is insufficiently punitive. While he is correct that the bipartisan consensus in favor of the student-loan regime has led to self-defeating credential inflation, his response is almost identical to that of Academia in Anarchy: Impose draconian austerity on higher education and burden students even more heavily to force them to put more “skin in the game.”
The central tenet of Buchanan’s social theory is that individuals act according to their rational self-interest, even in institutions that are ostensibly devoted to the impartial pursuit of public goods. Through this lens, public servants, activists, and students become “rent-seekers” and “special interests” trying to commandeer the machinery of government for their own gain. Thus, the thinking goes, the impartiality of state institutions is undermined.
Brennan and Magness’ Cracks in the Ivory Tower situates itself in Buchanan’s intellectual lineage while keeping his smoldering anti-government polemic mostly submerged. Its authors argue that examining incentives and institutional structures is useful for approaching the various problems of higher education. Furthermore, they avoid attributing such problems to ideological villains like “neoliberalism” or “political correctness.”
“Big trends emerge from individual behavior without anyone running the show,” they write. “Institutions create incentives, and incentives determine behavior.” Unsurprisingly, they find that universities are a case study in bad incentives, with competing populations each responding rationally to their position within the institution, but with disastrous results.
Using the theory of incentives to examine the incoherent jumble of institutions that make up the modern university, Cracks in the Ivory Tower yields occasional insight. Brennan and Magness show how administrators’ incentives conflict with those of faculty and students. Administrators want to maximize their own budgets, grow the footprint or prestige of their particular fiefdom, and increase the percentage of nontenured faculty (in order to better control and manage them).
Perverse incentives may cause problems for faculty as well. In a chapter on Ph.D. “overproduction,” Brennan and Magness describe the incentives of senior faculty — teaching more interesting seminars, bolstering their own scholarly reputations, maintaining sources of research assistance — as explanations for why some faculty fight to defend low-ranked graduate programs that are a professional dead end for their alumni. While Brennan and Magness point out that grad students relieve professors of their academic drudgery, they blame this exploitative relationship on professors' incentives rather than on broader, more systemic issues.
More often, though, Brennan and Magness deploy incentives in a manner closer to the role the concept played in Buchanan’s thought: as a delegitimizing ideological weapon. As with many libertarian arguments, petty misanthropy is never far beneath the surface. Just as it’s easy to read Caplan’s book as an elaborate scholarly transposition of puerile distaste for centralized education, Brennan and Magness hint at their own carefully cataloged grievances.
In an opening slate of anecdotes, they fume against a diversity-conscious provost who resisted an overwhelmingly male department’s vote for a male candidate over a similarly qualified female one, arguing that the provost had bad legal incentives to favor a woman over a man with a “more original, higher-stakes research trajectory.” They note with satisfaction that the woman turned down the job and is now an untenured assistant professor at a liberal arts college. To illustrate the absurd incentives of university budgeting, Brennan brags that he spitefully wasted money on a standing desk because his research budget didn’t roll over to the next year.
Adjuncts, like Uber drivers or immigrant tomato-pickers, should apparently be happy with whatever the market deems their appropriate compensation.
Despite the studied attempt at apolitical empirics, Brennan and Magness’ applications of incentives theory to more serious cases yield political ideology thinly veiled as scientific analysis. In a particularly blinkered chapter, Brennan and Magness calculate that adjunct professors are not actually underpaid or exploited — they are simply paid for a specific task that is much more limited than those of tenured faculty. “While adjuncts at research universities do spend some time on research,” they write, “they are not expected or paid to do so. Such activities are outside the scope of adjuncts’ jobs.” The “adjuncts’ rights movement,” thus, turns out to be a form of rent-seeking in which “morality disguises self-interest.”
But while Brennan and Magness write as if they had laid bare an insidious con, they have simply noticed the obvious fact that all labor struggles demand more favorable material conditions. Universities’ moves to employ vast reserves of contingent labor are managerial decisions, not the immutable dictates of natural law. But the libertarian worldview systematically evacuates power from its field of vision, at least when it is pursued from below. Adjuncts, like Uber drivers or immigrant tomato-pickers, should apparently be happy with whatever the market deems their appropriate compensation. As Brennan infamously wrote in a series of online attacks on the adjuncts’ rights movement, “Adjuncts are victims of their own bad choices.”
Like Caplan, Brennan and Magness vaunt the demystifying, scientific virtues of economics, but are silent on the broader political-economic context that shapes so much of contemporary higher education. They consider “systemic” diagnoses of the problems facing academe, only to dismiss them on the flimsiest of pretexts.
The university cannot be suffering from “corporatization” because it is insulated from market incentives in a way corporations are not. While this is true in certain respects, Brennan and Magness have nothing to say about the strenuous efforts of politicians and administrators over the past half-century to submit them to the market, which has deeply altered the academic mission, organizational practices, and the attitudes of students toward education.
Since the 1970s, and especially since the 1990s, research universities have become major economic actors, not only in market-oriented competition for students but in the commodification of ever greater swaths of their activities, including faculty research. Tightening relationships with industry allows businesses to capture and monetize the intellectual property created by publicly funded research. These changes have been accompanied by major shifts in administrative philosophy, such as the application of metrics to scholars’ and departments’ “performance” — a direct borrowing from corporate-management fads.
Brennan and Magness dismiss the other frequent descriptor of the contemporary university — “neoliberal” — as a ghost story. There cannot be a “neoliberal university,” they argue, because it is impossible to find self-described neoliberals among the largely progressive ranks of university faculty and administrators. This almost laughably circumvents the issue.
Neoliberalism was a diverse but coherent and influential body of theory championed by neoclassical economists and politicians — including, of course, the authors’ own intellectual progenitor, James Buchanan. The reimagining of education as a commodity purchased by individuals, rather than a universal public good provided by the state, was an explicit project of neoliberal economists and politicians on both the right and left as they moved to slice and reorganize the welfare state along leaner, more punitive lines. Neoliberalism hardly explains everything about contemporary higher education, but it explains a lot.
Cracks in the Ivory Tower ends without any proposed solutions, but Brennan and Magness are concerned, ultimately, with delivering social justice for taxpayers. (Their concluding chapter is titled, “Answering Taxpayers.”) Because universities consume public resources that could be spent on other priorities, they argue, justice demands that we put a price on education. Universities are bad at education, they are populated by rent-seekers responding to perverse incentives, and they do not provide public goods, make better citizens, or preserve culture.
“Perhaps the best argument for continuing to fund higher ed at its current levels is cynical,” the authors conclude. “Federal spending priorities are so awful that, realistically speaking, if the money wasn’t spent on possibly useless education, it might be spent on something much worse.”
Although these authors tend to write as if they are crying out from a distant wilderness against an all-powerful leviathan, in the real world their ideas have already colonized the mainstream of American thinking about higher education. The current problems are, in part, the wages of those ideas put into practice.
In the work of thinkers like Friedrich Hayek, Milton Friedman, and James Buchanan, neoliberalism was always a response to calls for democratization and redistribution from below. As critics both of communism and the postwar Keynesian state, neoliberals pushed for the legal insulation of the market from democratic control, which made them the natural enemies of economic redistribution. Since it would have been impossible to discharge the commitments of the postwar welfare state entirely, the solution was to reimagine them as services provided to consumers. While such thinking was marginal in the 1950s and 1960s, it met its moment in the inflationary crisis of the 1970s and the widespread discrediting of Keynesian economics. The jargon of “incentives,” “choice,” “special interests,” and “unintended consequences” were vehicles by which universal public services with egalitarian intent were attacked as the root of the decade’s economic woes.
By the 1990s, neoliberal ideas were not only a project of the right, but were enthusiastically adopted by progressives and became the common language of public policy. The consequences of this ideological shift are all too apparent. Public funding of higher education remains dramatically lower than it was a half-century ago. The shortfall has been made up by a scramble for the highest-paying students, by student debt, and by the commodification of as many of the university’s activities as possible. Attracting student-consumers has driven an explosion of spending on noninstructional staff and facilities, something reflected in students’ sense that they are buying a credential, not receiving an education.
To all this, libertarians charge that public disinvestment from higher education has not gone far enough — that even existing levels of spending distort the market and produce perverse outcomes. All attempts to produce more egalitarian outcomes are doomed to crash into the immovable barriers of rent-seeking and bad incentives.
“Why is college so expensive?” Brennan and Magness ask. “Because politicians want to help the poor.” The anti-democratic undercurrent of these arguments barely hides its contempt for the average citizen, made all the more evident by the fact that both Brennan and Caplan have authored books expressing dim views of the masses’ ability to participate in democracy.
In a series of “chats” with skeptical interlocutors at the end of his book, Caplan acknowledges that the only kind of public schools he deems justified would be “dystopian” and that he would never send his own children to them. He simply believes that most students are happily incorrigible and unfit for anything better — at least not at any expense taxpayers should be required to bear. “I believe wholeheartedly in the life of the mind,” Caplan quips. “What I’m cynical about is people.”
That the unquestionable result would be the further economic stratification and racial segregation of higher education is apparently simply the price of obeying the incontrovertible laws of human behavior and the market. On the bright side, Americans would be freed from “compulsory enlightenment.”
Such arguments are the natural end point, the reductio ad absurdum, of the market-oriented ideas that dominate American thinking on the purpose and value of higher education. At least since the early 20th century, universities have been intimately interconnected with the capitalist economy and have reproduced many of their inherent inequalities and exclusions, not to mention their ideologies. Even the most redistributionary higher-education policy will never be a replacement for a robust welfare state. But compared to libertarian fantasies, even the radically unjust system we have looks like utopia.
David Sessions is a Ph.D. candidate in history at Boston College.
Correction (1/15/2020, 5:30 p.m.): A previous version of this essay suggested that Cracks in the Ivory Tower does not consider the cheap labor that doctoral programs provide. The text has been updated to clarify that it does.
Caplan’s book intervenes in a theoretical debate between economists over why education has economic benefits. Human-capital theory sees these benefits as the result of an individual’s investment in themselves. They become bearers of immaterial “capital,” equipped with ideas, skills, and productivity.
“Signaling,” by contrast, argues that education does not produce this kind of capital, but merely certifies it for employers: a relevant degree “signals” that an individual has the required abilities and skills to do the job. Few economists fully believe the “pure” version of either theory, and the debate tends to focus on calculating the percentage that can be attributed to each one. Caplan offers a “cautious” estimate that signaling explains 50 percent of the return on education, and a “reasonable” estimate that it is 80 percent or even higher.
The starting point of Caplan’s case for signaling is that the overwhelming majority of what students learn, from elementary school to college, is “otherworldly” academic knowledge that is “irrelevant to the modern labor market.” While plenty of economists agree that the connection between more humanistic subjects and job skills is indirect, many also think that such connections are real but unobservable and thus difficult to measure, especially because they are so highly varied across types of employment.
Caplan unconvincingly solves the problem with anecdotal evidence, and with an idiosyncratic classification of the “usefulness” of various academic disciplines. Technical fields — engineering, law, and medicine — rank as “highly” useful, while the humanities and social sciences receive the designation of “low” usefulness. Even the subjects Caplan ranks as highly useful are suspect to his mind, since they teach academic knowledge rather than directly relevant job skills, making them “less practical than they sound.”
Caplan is witheringly dismissive of the obvious rejoinder that education is about broader and more amorphous things like “critical thinking” or “learning how to think.” He turns to research in educational psychology that supposedly shows weak evidence of learning “transfers,” suggesting that students are generally bad at applying knowledge outside of the narrow bounds in which they practice it. The literature Caplan presents tests students’ abilities to solve puzzles across different domains or to apply formal statistical education to everyday problems.
It is immediately apparent, however, that such experiments are unable to test even a fraction of the types of knowledge and practices that students are taught in college, and even less to demonstrate falsifiably that such knowledge has no usefulness to their future employment. Caplan responds to such limitations with rhetoric unlikely to convince those not already determined to agree with him: “‘No one knows if this trash will come in handy’ is a terrible argument for hoarding trash.”
The economic literature on education, almost entirely focused on job skills and financial returns, would undoubtedly cause many professors to recoil. It ignores the things most teachers value, namely the personal, cultural, and civic role of education. But economic arguments can be put to a wide variety of political ends, and these humanist and civic concerns are shared even by economists, most of whom see the education system as serviceable, if imperfect. (Even some well-known libertarian economists, like Caplan’s friend Tyler Cowen, disagree with him on aspects of education.) For them, education works regardless of the precise breakdown between human capital or signaling, and its rationality can be assumed from the fact that no other way of performing these basic functions of the labor market has successfully challenged it.
Caplan diverges from this view mostly because he dislikes education as “compulsory enlightenment” and is willing to entertain a host of far-fetched alternatives. If education is about content rather than the signals sent by degrees, why doesn’t everyone audit classes for free? Why doesn’t everyone sign up for Harvard MOOCs? Why can’t we just put teenagers to work and teach them job skills that way? (An entire section of the book asks, “What’s Wrong With Child Labor?”)
Caplan pretends to be sympathetic to anti-economistic arguments about education’s value as an introduction to civic and cultural knowledge, but hastily concludes that education fails at those things just as thoroughly as it fails at job training. For colleges to plausibly claim that they enrich students’ personal lives, Caplan argues, it requires “worthy content,” “skillful pedagogy,” and “eager students.” They don’t, he declares by fiat: “Most teachers are boring,” he grandly opines. “The students are worse.” It goes without saying that such unsupported claims could be countered with equally persuasive anecdotes.
Having reduced the purview of education to a narrow conception of job skills, Caplan argues for the total defunding of higher education and perhaps even secondary education. “I have a strong moral presumption against taxpayer support for anything. Why? Because I have a strong moral presumption in favor of leaving others alone — and consider taxation to be a prime example of failing to leave others alone.”
Signaling serves less as an economic theory than as a tool of Caplan’s opposition to state support for higher education. He fully acknowledges, even delights in, the undisputed economic return to education, because by defining it as largely signaling, he is able to argue that education’s benefits accrue entirely to the individual, not to society — “private profit, social waste.” If education merely signals to employers rather than building actually productive skills, then the vast sums that federal and state governments spend subsidizing education represent, in his view, a redistribution from taxpayers to students building their future riches.
This is a reformulation of Buchanan’s old, disingenuous argument that students are a rent-seeking special interest group building wealth on the backs of the “poor taxpayer.” Caplan echoes Buchanan on what he calls “the hidden wonder of high tuition and student debt,” but argues that the current student-loan system is insufficiently punitive. While he is correct that the bipartisan consensus in favor of the student-loan regime has led to self-defeating credential inflation, his response is almost identical to that of Academia in Anarchy: Impose draconian austerity on higher education and burden students even more heavily to force them to put more “skin in the game.”
The central tenet of Buchanan’s social theory is that individuals act according to their rational self-interest, even in institutions that are ostensibly devoted to the impartial pursuit of public goods. Through this lens, public servants, activists, and students became “rent-seekers” and “special interests” trying to commandeer the machinery of government for their own gain. Thus, the thinking goes, the impartiality of state institutions is undermined.
Brennan and Magness’ Cracks in the Ivory Tower situates itself in Buchanan’s intellectual lineage while keeping his smoldering anti-government polemic mostly submerged. Its authors argue that examining incentives and institutional structures is useful for approaching the various problems of higher education. Furthermore, they avoid attributing such problems to ideological villains like “neoliberalism” or “political correctness.”
“Big trends emerge from individual behavior without anyone running the show,” they write. “Institutions create incentives, and incentives determine behavior.” Unsurprisingly, they find that universities are a case study in bad incentives, with competing populations each responding rationally to their position within the institution, but with disastrous results.
Using the theory of incentives to examine the incoherent jumble of institutions that make up the modern university, Cracks in the Ivory Tower yields occasional insight. Brennan and Magness show how administrators’ incentives conflict with those of faculty and students. Administrators want to maximize their own budgets, grow the footprint or prestige of their particular fiefdom, and increase the percentage of nontenured faculty (in order to better control and manage them).
Perverse incentives may cause problems for faculty as well. In a chapter on Ph.D. “overproduction,” Brennan and Magness describe the incentives of senior faculty — teaching more interesting seminars, bolstering their own scholarly reputations, maintaining sources of research assistance — as explanations for why some faculty fight to defend low-ranked graduate programs that are a professional dead end for their alumni. While Brennan and Magness point out that grad students relieve professors of their academic drudgery, they blame this exploitative relationship on professors' incentives rather than on broader, more systemic issues.
More often, though, Brennan and Magness deploy incentives in a manner more analogous to the role the concept played in Buchanan’s thought: as a delegitimizing ideological weapon. As with many libertarian arguments, petty misanthropy is never far beneath the surface. Just as it’s easy to read Caplan’s book as an elaborate scholarly transposition of puerile distaste for centralized education, Brennan and Magness hint at their own carefully cataloged grievances.
In an opening slate of anecdotes, they fume against a diversity-conscious provost who resisted an overwhelmingly male department’s vote for a male candidate over a similarly qualified female one, arguing that the provost had bad legal incentives to favor a woman over a man with a “more original, higher-stakes research trajectory.” They note with satisfaction that the woman turned down the job and is now an untenured assistant professor at a liberal arts college. To illustrate the absurd incentives of university budgeting, Brennan brags that he spitefully wasted money on a standing desk because his research budget didn’t roll over to the next year.
Despite the studied attempt at apolitical empirics, Brennan and Magness’ applications of incentives theory to more serious cases yield political ideology thinly veiled as scientific analysis. In a particularly blinkered chapter, Brennan and Magness calculate that adjunct professors are not actually underpaid or exploited — they are simply paid for a specific task that is much more limited than those of tenured faculty. “While adjuncts at research universities do spend some time on research,” they write, “they are not expected or paid to do so. Such activities are outside the scope of adjuncts’ jobs.” The “adjuncts’ rights movement,” thus, turns out to be a form of rent-seeking in which “morality disguises self-interest.”
But while Brennan and Magness write as if they had laid bare an insidious con, they have simply noticed the obvious fact that all labor struggles demand more favorable material conditions. Universities’ moves to employ vast reserves of contingent labor are managerial decisions, not the immutable dictates of natural law. But the libertarian worldview systematically evacuates power from its field of vision, at least when it is pursued from below. Adjuncts, like Uber drivers or immigrant tomato-pickers, should apparently be happy with whatever the market deems their appropriate compensation. As Brennan infamously wrote in a series of online attacks on the adjuncts’ rights movement, “Adjuncts are victims of their own bad choices.”
Like Caplan, Brennan and Magness vaunt the demystifying, scientific virtues of economics, but are silent on the broader political-economic context that shapes so much of contemporary higher education. They consider “systemic” diagnoses of the problems facing academe, only to dismiss them on the flimsiest of pretexts.
The university, they contend, cannot be suffering from “corporatization” because it is insulated from market incentives in a way corporations are not. While this is true in certain respects, Brennan and Magness have nothing to say about the strenuous efforts of politicians and administrators over the past half-century to submit universities to the market — efforts that have deeply altered the academic mission, organizational practices, and the attitudes of students toward education.
Since the 1970s, and especially since the 1990s, research universities have become major economic actors, not only in market-oriented competition for students but in the commodification of ever greater swaths of their activities, including faculty research. Tightening relationships with industry allows businesses to capture and monetize the intellectual property created by publicly funded research. These changes have been accompanied by major shifts in administrative philosophy, such as the application of metrics to scholars’ and departments’ “performance” — a direct borrowing from corporate-management fads.
Brennan and Magness dismiss the other frequent descriptor of the contemporary university — “neoliberal” — as a ghost story. There cannot be a “neoliberal university,” they argue, because it is impossible to find self-described neoliberals among the largely progressive ranks of university faculty and administrators. This almost laughably circumvents the issue.
Neoliberalism was a diverse but coherent and influential body of theory championed by neoclassical economists and politicians — including, of course, the authors’ own intellectual progenitor, James Buchanan. The reimagining of education as a commodity purchased by individuals, rather than a universal public good provided by the state, was an explicit project of neoliberal economists and politicians on both the right and left as they moved to slice and reorganize the welfare state along leaner, more punitive lines. Neoliberalism hardly explains everything about contemporary higher education, but it explains a lot.
Cracks in the Ivory Tower ends without any proposed solutions, but Brennan and Magness are concerned, ultimately, with delivering social justice for taxpayers. (Their concluding chapter is titled, “Answering Taxpayers.”) Because universities consume public resources that could be spent on other priorities, they argue, justice demands that we put a price on education. Universities are bad at education, they are populated by rent-seekers responding to perverse incentives, and they do not provide public goods, make better citizens, or preserve culture.
“Perhaps the best argument for continuing to fund higher ed at its current levels is cynical,” the authors conclude. “Federal spending priorities are so awful that, realistically speaking, if the money wasn’t spent on possibly useless education, it might be spent on something much worse.”
Although these authors tend to write as if they are crying out from a distant wilderness against an all-powerful leviathan, in the real world their ideas have already colonized the mainstream of American thinking about higher education. The current problems are, in part, the wages of those ideas put into practice.
In the work of thinkers like Friedrich Hayek, Milton Friedman, and James Buchanan, neoliberalism was always a response to calls for democratization and redistribution from below. As critics of both communism and the postwar Keynesian state, neoliberals pushed for the legal insulation of the market from democratic control, which made them the natural enemies of economic redistribution. Since it would have been impossible to discharge the commitments of the postwar welfare state entirely, the solution was to reimagine them as services provided to consumers. While such thinking was marginal in the 1950s and 1960s, it met its moment in the inflationary crisis of the 1970s and the widespread discrediting of Keynesian economics. The jargon of “incentives,” “choice,” “special interests,” and “unintended consequences” was the vehicle by which universal public services with egalitarian intent were attacked as the root of the decade’s economic woes.
By the 1990s, neoliberal ideas were not only a project of the right, but were enthusiastically adopted by progressives and became the common language of public policy. The consequences of this ideological shift are all too apparent. Public funding of higher education remains dramatically lower than it was a half-century ago. The shortfall has been made up by a scramble for the highest-paying students, by student debt, and by the commodification of as much as possible of the university’s activities. Attracting student-consumers has driven an explosion of spending on noninstructional staff and facilities, something reflected in students’ sense that they are buying a credential, not receiving an education.
To all this, libertarians charge that public disinvestment from higher education has not gone far enough — that even existing levels of spending distort the market and produce perverse outcomes. All attempts to produce more egalitarian outcomes are doomed to crash into the immovable barriers of rent-seeking and bad incentives.
“Why is college so expensive?” Brennan and Magness ask. “Because politicians want to help the poor.” The anti-democratic undercurrent of these arguments barely hides its contempt for the average citizen, made all the more evident by the fact that both Brennan and Caplan have authored books expressing dim views of the masses’ ability to participate in democracy.
In a series of “chats” with skeptical interlocutors at the end of his book, Caplan acknowledges that the only kind of public schools he deems justified would be “dystopian” and that he would never send his own children to them. He simply believes that most students are happily incorrigible and unfit for anything better — at least not at any expense taxpayers should be required to bear. “I believe wholeheartedly in the life of the mind,” Caplan quips. “What I’m cynical about is people.”
That the unquestionable result would be the further economic stratification and racial segregation of higher education is apparently simply the price of obeying the incontrovertible laws of human behavior and the market. On the bright side, Americans would be freed from “compulsory enlightenment.”
Such arguments are the natural end point, the reductio ad absurdum, of the market-oriented ideas that dominate American thinking on the purpose and value of higher education. At least since the early 20th century, universities have been intimately interconnected with the capitalist economy and have reproduced many of their inherent inequalities and exclusions, not to mention their ideologies. Even the most redistributionary higher-education policy will never be a replacement for a robust welfare state. But compared to libertarian fantasies, even the radically unjust system we have looks like utopia.
David Sessions is a Ph.D. candidate in history at Boston College.
Correction (1/15/2020, 5:30 p.m.): A previous version of this essay suggested that Cracks in the Ivory Tower does not consider the cheap labor that doctoral programs provide. The text has been updated to clarify that it does.
Prime Minister of the world
Clemens von Metternich: the master diplomat of the nineteenth century
By Ferdinand Mount
That, famously, was Clemens von Metternich in 1820, conscious as he always was of the times he lived in and of his own place in them. And how intensely he lived through them. At the age of seven, in 1780, he accompanied his father Franz Georg on a diplomatic mission to Cologne. In 1853, pushing eighty, he returned from exile to Vienna to give advice to the impulsive young Emperor Franz Joseph. In between, he had been foreign minister to Francis I for twelve years and Austrian Chancellor for another twenty-seven. For his skill in driving through the Congress of Vienna, the French foreign minister Montmorency dubbed him “the coachman of Europe”. Castlereagh went a step further. After the crucial peacemaking session at Châtillon, he told him, “you are the prime minister of the world”. At least that’s what Metternich said Castlereagh had said. Like most great men, he was no stranger to self-promotion.
Yet precisely because he was so identified with the high politics of his age, he has attracted bucketfuls of venom from historians and biographers, reflecting both their own prejudices and the resentments of their epochs. Metternich is King Rat in the influential History of Germany in the Nineteenth Century (1879–94) by Heinrich von Treitschke, the father of German hypernationalism and antisemitism (“Die Juden sind unser Unglück” – the Jews are our misfortune). Treitschke denounced Metternich as “a man of calculating cunning”, “good-natured and smiling mendacity”, and totally lacking in German spirit. Treitschke’s follower, Viktor Bibl, an early convert to National Socialism, called him “the demon of Austria”. Of the thirty biographies, by authors ranging from Treitschke to Barbara Cartland, the most archivally rich and authoritative so far have been the two volumes by Heinrich von Srbik (1925, with a third volume of source material published in 1954 after his death).
Which is a pity. Because Srbik, too, became a fellow traveller of the Nazis. Originally from Vienna, he welcomed the Anschluss, took a seat in Hitler’s Reichstag, and reigned over Vienna’s Academy of Science until 1945. As late as 1951, he was still pleading for “a properly understood German racial theory”. For Srbik, Metternich was a softie, who lacked the Herrennatur, the iron and ruthless will to impress himself upon the age. He should have pushed to the East, “permeating a natural space with predominantly German culture”. As with Treitschke, Srbik’s ultimate indictment of Metternich is that he was “un-German”; “there was no place in his mind or heart open to the high values of a national German Reich”; he just didn’t get the idea of “a community of blood”, ie he wasn’t Hitler.
Postwar studies, especially in English – notably by Henry Kissinger, Paul W. Schroeder and Alan Sked – have done much to retrieve Metternich’s reputation as a master diplomat who tirelessly crisscrossed Europe to cement and re-cement the fragile alliances against Napoleon, who with Castlereagh insisted on a peace after Waterloo which was so generous to France that it baffled Napoleon on St Helena, and who kept the Concert of Europe playing from more or less the same hymn sheet for three decades after the war. But there hasn’t really been a prejudice-free biography based on the voluminous archives in Prague and Vienna which sets out to give us Metternich at full length, to explain, if not necessarily to explain away, who he was and what he was trying to do, a Metternich per se.
This is the achievement of Wolfram Siemann, emeritus professor of history at Munich and already author of monographs on several crucial episodes in Metternich’s long life: the Congress of Vienna, the beginnings of the German secret police, the 1848 revolution and Metternich’s Britain. Metternich: Strategist and visionary is thus the culmination and encapsulation of a life’s work, and despite its length, twice as long as anybody else’s except Srbik, it is a running joy, full of winking sidelights and delightful detours, many of which are not really detours at all – for example, Metternich’s endless struggles to recover or replace the princely estates which had been torn away by the war and then to restore the family finances which had been frittered away by Franz Georg.
Metternich’s father is only one of the lesser characters recast or partly rehabilitated by Siemann’s patient researches. Dismissed by Algernon Cecil in his skittish, pro-Metternich 1933 biography as “a prosy babbler, a habitual liar and a glittering spendthrift”, Franz Georg reappears here as a resourceful diplomatist who laboured long and hard for his Emperor, and whom Francis recalled in 1810 to stand in as foreign minister while his son was away in Paris negotiating with Napoleon. Again, Francis’s successor as Emperor, Ferdinand, has often been dismissed by earlier historians as “an imbecile” or “mentally retarded”. Siemann points out that though he suffered from epilepsy and hydrocephalus, Ferdinand spoke five languages and played the piano.
The translation by Daniel Steuer is fluent and vivacious, though marred by several awkward choices of words in contexts where German has one word and English has several: “Herrschaft” is more naturally “domain” rather than “dominion” when describing Metternich’s estates; the revolutionary call to “a holy and healthy Empörung” is surely to “rebellion” rather than merely “indignation”; “Recht” is several times correctly Englished as “Law”, but elsewhere wrongly I think in the context as “Right”. Steuer is also inclined to translate “bürgerlich” as “bourgeois” in places where, ever since Rousseau and Marx roughed up the bourgeoisie, we would usually say “civic”.
Siemann points out how modern-minded Metternich could be. As a reward for his success at the Congress of Vienna, the Emperor had bestowed on him the glorious domain of Johannisberg, a former monastery overlooking the Rhine with a legendary vineyard. Instead of selling the wine by the barrel in the usual way and drinking the profits, he bottled it and sold it to selected grand clients, mostly monarchs. He also devised fancy armorial labels – Original Abfüllung der Fürstlich von Metternichschen Domäne – personally signed by the cellar-master. He was a pioneer high-end marchand de vins.
Similarly, when he got hold of another former abbey, at Plass on the edge of the Sudetenland, he lost no time in exploiting the estate’s extensive iron and coal deposits, using the latest cupola furnaces from England and patenting a machine for clearing tree trunks from the hillside. Soon the Plass works had more than 300 employees and were netting 280,000 guilders a year, enough to pay off most of Franz Georg’s debts all at once. To celebrate the arrival of a steamroller in 1854, the workforce choir sang an ode in Czech in Metternich’s honour, for he was a model employer, besides being a vigorous supporter of freeing up the land market from feudal restrictions. It is a piquant thought that Metternich’s efforts to industrialize Northern Bohemia (and indeed the rest of the empire) helped to make the Sudetenland so irresistible to Hitler.
Why then did this up-to-the-minute technocrat remain so stubbornly stick-in-the-mud in domestic politics? It is not as if he was unacquainted with democratic theory and practice. As a young man, he had fallen in love with England. “If I were not what I am, I would like to be an Englishman. If I could be neither the one nor the other, I would rather be nothing at all.” In London, he had made friends with Edmund Burke and Charles Grey. When Metternich, in his late seventies, was in exile in Brighton, Disraeli came and sat at his feet. Srbik claimed that it was not possible to say anything beyond “the level of the probable” about Metternich’s view of Burke. Cecil, too, says simply that “we cannot tell” whether Metternich inhaled Burke’s “matchless rhetoric”. But actually we can. Siemann finds in the library of Königswart, another of Metternich’s castles in Bohemia, a copy of Burke’s Reflections. Metternich has underlined several key passages, one about the nature of constitutional liberty, and also the celebrated riff about the maltreatment of Marie Antoinette, whose later execution, we know from Metternich’s outburst at the time, horrified him as much as it was to horrify Burke. And there is no doubt that he loved the English system of government. Coming back to England in 1848 after an absence of thirty-four years, he feels as if he had never been away: “This great country is the way it was, strong because of its unshakeable belief in the value of the law, of order, and the kind of freedom which, if it truly wants to exist, must be based on these foundations”.
On his first visit, his knowledge of England had been sketchy. He refers to Europe’s largest coachworks as “Hatched” (in fact John Hatchett of Long Acre), to “the Flemish artist John Bacon” who sculpted Chatham’s tomb in Westminster Abbey (Bacon was born in Southwark). The author of The School for Scandal would be surprised to hear himself described as “the Whig politician and poet”, as would Burke to hear himself called an “aristocratic Whig politician”, when he was a decidedly middle-class Irish lawyer who depended on aristocratic patronage for much of his political life. But there could be no doubt about Metternich’s enthusiasm for the hubbub of commercial London and the heated debates in Parliament, not least the trial of Warren Hastings, which he watched avidly.
Yet he remained obdurate against any suggestion that any Continental nation should take even a tiny step on the road to popular participation. The much-admired and imitated Spanish Cortes Constitution of 1812 he denounced as “the work of caprice or of a wild delusion”. In 1819, he offered to help King Friedrich Wilhelm III of Prussia to deal with dissenters but only on condition that he promised “not to introduce a representation of the people in your state, which is less suited to it than any other”. Siemann makes much of the elaborate plan for reorganizing the imperial Apparat, which Metternich presented to the Emperor on October 27, 1817 – and which the Emperor left lying on his desk for years, only toying with it again on his deathbed. Metternich himself in old age told his protégé Alexander von Hübner that “if the Emperor had adopted my ideas on the reorganization of the Diets, we would perhaps be in a better position to face the tempest”. True, Francis was ineradicably hostile to change, and Metternich was ultimately only his servant. But we cannot ignore the fact that Metternich never attempted to nudge him in the direction of popular representation. For all Siemann’s pleas that we should see Metternich as a reformer by stealth, the 1817 plan looks like a mega reshuffle of the bureaucracy across the constituent parts of the Empire, with no hint of democracy about it. In fact, Metternich told Hübner explicitly that the heart of his scheme, the new council sitting in Vienna, would be a council of the provincial estates, “and not a chamber of deputies, a Volkshaus, whose members would be elected”.
In Metternich’s view, the Germans couldn’t handle political parties or deal with a free press, although when in England he himself lapped up the newspapers and even founded one himself, the Spectateur de Londres, which flopped. There’s a weird kinship here between Metternich and the later Marx, who, after his years in London, reluctantly came to the conclusion that Britain, the United States and perhaps Holland might well evolve peacefully towards socialism, but that elsewhere on the Continent violent revolution was inevitable.
Hence the succession of repressive measures over thirty years – the “Metternich system” – most notably the Carlsbad Decrees of 1819 – of which he was rather proud: “a great act, the most important act of my life”. Here Siemann is indulgent to his subject, pointing out that Metternich was responding to vociferous public outcry after the murder of the pro-government poet August von Kotzebue by the firebrand Karl Ludwig Sand. All over Europe, the respectable classes were in a state of high panic that another great revolution was looming. In England at the same time, the government of Lord Liverpool was passing the no less repressive Six Acts. Both sets of laws were time-limited, but the Six Acts expired quicker and anyway they had been watered down by the Whigs.
By contrast, the Carlsbad Decrees seemed to become more stifling with time, and Metternich’s opposition to any form of democratic representation more entrenched. While Charles Grey, the friend of Metternich’s youth, was finally shoving the Great Reform Bill through Parliament – to Metternich’s dismay – the Austrian Chancellor was setting up his own secret police, the Mainzer Zentralpolizei. The family had form in this department. In 1810, Franz Georg had set up the empire’s first formal censorship bureau to cope with the secret agents that Napoleon left behind in all the larger cities he occupied. This new “political police” defined itself as “a counter-police which moved in step with the other police” – thus a forerunner of all modern counter-terrorism departments.
But Clemens’s objection to democracy went deeper than his agonized concern for Ruhe – the word always on his lips – calm/rest/peace. An elected central parliament could only be “a composition of individual, mutually alien and hostile deputies who are far from being united for the One purpose of being a state”. Democracy was “always a dissolving, decomposing, principle; it tends to separate men, it loosens society”. Echoes here of Rousseau and the ideal of organic unity within a state: for Rousseau, to be found in the Calvinist semi- theocracy of Geneva; for Metternich, only in the non-racial, multinational kaiserlich und königlich monarchy he served.
His obsession with Ruhe is not hard to understand. He had known little of it in his early life, starting with his education. Franz Georg was not such a stuffy character as you might think. He was a Freemason and a member of the Illuminati, and he chose as his son’s first tutor a fellow Illuminatus, Johann Friedrich Simon, a young Protestant pastor with advanced ideas, so advanced in fact that as soon as the French Revolution broke out, he shot off to join it, leading the storming on the city hall in Strasbourg, going on to style himself “the Grand Inquisitor of France at Mainz”, joining forces with Saint-Just in Paris and as a member of a revolutionary tribunal passing sixty-two death sentences. The next year, 1794, Franz Georg, the Emperor’s governor-general in the Austrian Netherlands, was forced to flee from Brussels for the second time when Austrian rule collapsed there. Soon Napoleon was chasing princes out of their palaces and estates all over Europe – the Metternichs included. New statelets were formed with idealistic republican constitutions only to have Napoleon’s siblings dumped on them as pop-up monarchs. After Austerlitz, the Austrian army was almost wiped out. As Metternich coolly remarked on his first great promotion four years later, “It is no minor task to be the foreign minister of Austria in 1809”.
He conceived that task in brutally undeceived terms: “From the day when peace is signed we must confine our system to tacking, and turning, and flattering. Thus alone may we possibly preserve our existence, till the day of general deliverance … for us there remains but one expedient, to preserve our strength for better days”. Accusations of inconsistency or weakness simply bounced off him. Between 1809 and 1813, he pursued the most brilliant defensive strategy in diplomatic history. Only after the French had lost 700,000 men in the snows, could he steer Austria away from alliance with France and into the final great coalition to destroy Napoleon.
He was undeceived, but he was not cynical. His tears were not shed simply for the loss of his own family estates or the unhorsed monarchs. He admitted that he was cold-blooded and thought that he needed to be. But he was not short of human feeling. He loved his numerous children by his three wives. His first wife Eleonore was an heiress and, in Metternich’s rather chilling words, “was never pretty and she is charming only to close acquaintances”. He added – this was in a letter to one of his many high-born mistresses, the legendary Dorothea von Lieven – “Those who know her really well cannot fail to love her; the majority of people find her stiff, unpleasant, and this is just what she wants. There is nothing in the world I would not do for her”.
Except stay faithful. He took an unseemly delight in his reputation as a ladies’ man. You can almost hear his lips smacking as he wrote to Dorothea: “Many women have been mentioned in connection with me who were not even in my thoughts. By contrast, I have entertained relationships with many women that were anything but romantic, without the public ever learning anything about it”. Srbik pursed his lips at Metternich’s Weibergeschichten. Siemann is rather soppy about it all. But there is no doubt about the depth of Clemens’s grief at the deaths of his wives, all of whom he outlived, and several of his children.
Nor was he unmoved by the horrors of the battlefield. One of Siemann’s finest passages brings together Metternich’s laments for the terrible scenes he had witnessed rattling across Europe along roads littered with corpses. In 1815, following Karl Philipp Schwarzenberg’s army along the same route that Napoleon had taken after the Battle of Leipzig with the remnants of his army, Metternich wrote to his mistress Wilhelmine von Sagan:
Few men can have spent more time than Metternich in private talk with Napoleon, notably as foreign minister in Dresden when he was shuttling to and fro between the emperors. There in the Chinese room at the Palais Marcolini on June 26, 1813, they talked for eight or nine hours on the trot. By then, he had long foreseen that Napoleon’s desire for European domination would prove insatiable. Yet he always found that the Emperor’s conversation “had a charm that was difficult to define”. He had a way of stripping a subject down to its essentials, listening carefully and bringing the topic to a point, while fobbing off any attempt to pin him down to any specific commitment. As the Comte de Las Cases was to find on St Helena, you could say anything to him. Both men asked Napoleon why he told such lies in his military bulletins. To Metternich, Napoleon replied with a smile: “They are not written for you, the Parisians believe everything”. Then towards the end, he would launch into the most furious and intimidating tirade. At Dresden, Metternich pointed out to him how the nature of war had changed to become total war: “today, it is a whole people that you have called to arms”; mere children were being pressed onto the battlefield. Enraged, Napoleon retorted: “You are no soldier and you do not know what goes on in the soul of a soldier. I was brought up in military camps, and a man like me doesn’t give a fuck about the lives of a million men”.
Admirers of Napoleon, such as Andrew Roberts in Napoleon the Great, cast doubt on whether he actually made this last shocking remark, although Metternich records it in a handwritten note not for publication, “se fout de la vie”. Siemann points out that Napoleon had used very similar language in a talk with Schwarzenberg’s deputy, Count Ferdinand Bubna, a few weeks earlier:
Metternich’s accounts of his talks with Napoleon recall Sir Eric Phipps’s accounts of his conversations with Hitler as Ambassador in Berlin between 1933 and 1937 (Our Man in Berlin, edited by Gaynor Johnson, 2008). There are the same apparent candour and approachability, even charm, the same refusal to be tied down to any international commitment, and then after the fobbing off comes the great rant, a mixture of aggression, self-pity and national resentments which forms a staged finale to the encounter.
Out of that quarter century of bloodshed and destruction, Metternich’s world view emerges, never to change for the remaining forty years of his life. It is permeated not only by his great longing for Ruhe but also by an ingrained suspicion of national passions. Democracy is to be distrusted because it legitimizes and inflames those passions, exciting racial paranoia and the urge for racial separation.
Even Metternich’s modern admirers find this attitude hard to swallow. Kissinger concludes that Metternich’s virtuoso performance was ultimately futile: “Unable to adapt its domestic structure, unable to survive with it in a century of nationalism, even Austria’s most successful policies amounted to no more than a reprieve, to a desperate grasping to commit allies, not to a work of construction, but to deflect part of the inevitable holocaust”. Schroeder, too, ticks off those who swallow claims of their hero’s claims to modernity: “Metternich was basically a rigid absolutist whose political outlook was tied to a system of government and society which may once have had its grandeur and fitness, but which even by Metternich’s time was becoming outworn, and by our own is completely anachronistic”.
Sked, though, in his dazzling Metternich and Austria, denounces Kissinger’s views as “fantasy” and asserts that Schroeder “gets almost everything wrong”. The empire was not brought down by the aborted revolution of 1848 and was unhorsed only by the First World War. The Emperor continued to reign for another sixty years, as popular as ever. Railways were built, glass and chemical factories constructed. The censorship was annoying but understaffed and rather indulgent; ditto the police state. Nor was there much popular unrest; the only mass slaughter, in Galicia in 1846, was carried out by local Polish peasants in support of their emperor. Sentences of death were usually commuted, and the waltzing and winemaking continued unabated. At the very least, Sked declares, “Metternich by 1848 had had an exceptionally good run for his money”.
Ironically, Dr Sked’s own experience would have offered Metternich yet another opportunity to practise his favourite pastime of saying “I told you so”. It was Sked who in 1993 founded the United Kingdom Independence Party to take Britain out of the European Union in the interests of democracy and honest governance. In no time, he found to his horror that the new party had been invaded by out-and-out racists and hardliners from the far right. He resigned his leadership, but the party went on to get 3.8 million votes in the 2015 general election and to provide the essential impetus for Brexit, threatening not only to separate Britain from the Continent, with bitterness on all sides, but to break up her own multinational state, with potentially even more bitterness to come.
What modernity required, Metternich thought by contrast, was a drawing together of nations, a Völkerbund, to take the edge off national ambition. Europe needed not the sort of “deals” which could be torn up whenever it suited any of the parties but an abiding, law-governed structure of co-operation. This is of course exactly what Napoleon did not want, any more than Donald Trump does today. As Metternich told him in one of those frank exchanges: “your peace is never more than a truce”. The Grande Armée had to keep marching to pastures new to graze on.
For three decades after the Napoleonic wars, there were other leaders in Europe who mostly shared Metternich’s concerting instincts: Castlereagh, Aberdeen and Peel in Britain, Guizot and Molé and indeed King Louis Philippe in France, latterly Tsar Alexander and Nesselrode in Russia (who deserves to be remembered by more than a pudding in Proust). Then came Palmerston and the age of national assertion began all over again. If Metternich really had been born in 1900, I don’t think he would have enjoyed what he was destined to witness.
Ferdinand Mount is a former editor of the TLS. His most recent book, Prime Movers: From Pericles to Gandhi, was published last year
Clemens von Metternich: the master diplomat of the nineteenth century
By Ferdinand Mount
My life has fallen at a hateful time. I have come into the world either too early or too late. Now, I do not feel comfortable; earlier, I should have enjoyed the time; later, I should have helped to build it up again; today I have to give my life to prop up this mouldering edifice. I should have been born in 1900, and I should have had the twentieth century before me.
That, famously, was Clemens von Metternich in 1820, conscious as he always was of the times he lived in and of his own place in them. And how intensely he lived through them. At the age of seven, in 1780, he accompanied his father Franz Georg on a diplomatic mission to Cologne. In 1853, pushing eighty, he returned from exile to Vienna to give advice to the impulsive young Emperor Franz Joseph. In between, he had been foreign minister to Francis I for twelve years and Austrian Chancellor for another twenty-seven. For his skill in driving through the Congress of Vienna, the French foreign minister Montmorency dubbed him “the coachman of Europe”. Castlereagh went a step further. After the crucial peacemaking session at Châtillon, he told him, “you are the prime minister of the world”. At least that’s what Metternich said Castlereagh had said. Like most great men, he was no stranger to self-promotion.
Yet precisely because he was so identified with the high politics of his age, he has attracted bucketfuls of venom from historians and biographers, reflecting both their own prejudices and the resentments of their epochs. Metternich is King Rat in the influential History of Germany in the Nineteenth Century (1879–94) by Heinrich von Treitschke, the father of German hypernationalism and antisemitism (“Die Juden sind unser Unglück” – the Jews are our misfortune). Treitschke denounced Metternich as “a man of calculating cunning”, “good-natured and smiling mendacity”, and totally lacking in German spirit. Treitschke’s follower, Viktor Bibl, an early convert to National Socialism, called him “the demon of Austria”. Of the thirty biographies, by authors ranging from Treitschke to Barbara Cartland, the most archivally rich and authoritative so far have been the two volumes by Heinrich von Srbik (1925, with a third volume of source material published in 1954 after his death).
Which is a pity. Because Srbik, too, became a fellow traveller of the Nazis. Originally from Vienna, he welcomed the Anschluss, took a seat in Hitler’s Reichstag, and reigned over Vienna’s Academy of Science until 1945. As late as 1951, he was still pleading for “a properly understood German racial theory”. For Srbik, Metternich was a softie, who lacked the Herrennatur, the iron and ruthless will to impress himself upon the age. He should have pushed to the East, “permeating a natural space with predominantly German culture”. As with Treitschke, Srbik’s ultimate indictment of Metternich is that he was “un-German”; “there was no place in his mind or heart open to the high values of a national German Reich”; he just didn’t get the idea of “a community of blood”, ie he wasn’t Hitler.
Postwar studies, especially in English – notably by Henry Kissinger, Paul W. Schroeder and Alan Sked – have done much to retrieve Metternich’s reputation as a master diplomat who tirelessly crisscrossed Europe to cement and re-cement the fragile alliances against Napoleon, who with Castlereagh insisted on a peace after Waterloo which was so generous to France that it baffled Napoleon on St Helena, and who kept the Concert of Europe playing from more or less the same hymn sheet for three decades after the war. But there hasn’t really been a prejudice-free biography based on the voluminous archives in Prague and Vienna which sets out to give us Metternich at full length, to explain, if not necessarily to explain away, who he was and what he was trying to do, a Metternich per se.
This is the achievement of Wolfram Siemann, emeritus professor of history at Munich and already author of monographs on several crucial episodes in Metternich’s long life: the Congress of Vienna, the beginnings of the German secret police, the 1848 revolution and Metternich’s Britain. Metternich: Strategist and visionary is thus the culmination and encapsulation of a life’s work, and despite its length, twice as long as anybody else’s except Srbik, it is a running joy, full of winking sidelights and delightful detours, many of which are not really detours at all – for example, Metternich’s endless struggles to recover or replace the princely estates which had been torn away by the war and then to restore the family finances which had been frittered away by Franz Georg.
Metternich’s father is only one of the lesser characters who is recast or partly rehabilitated by Siemann’s patient researches. Dismissed by Algernon Cecil in his skittish, pro-Metternich 1933 biography as “a prosy babbler, a habitual liar and a glittering spendthrift”, Franz Georg reappears here as a resourceful diplomatist who laboured long and hard for his Emperor, and whom Francis recalled in 1810 to stand in as foreign minister while his son was away in Paris negotiating with Napoleon. Again, Francis’s successor as Emperor, Ferdinand, has often been dismissed by earlier historians as “an imbecile” or “mentally retarded”. Siemann points out that though he suffered from epilepsy and hydrocephalus, Ferdinand spoke five languages and played the piano.
The translation by Daniel Steuer is fluent and vivacious, though marred by several awkward choices of words in contexts where German has one word and English has several: “Herrschaft” is more naturally “domain” rather than “dominion” when describing Metternich’s estates; the revolutionary call to “a holy and healthy Empörung” is surely to “rebellion” rather than merely “indignation”; “Recht” is several times correctly Englished as “Law”, but elsewhere wrongly I think in the context as “Right”. Steuer is also inclined to translate “bürgerlich” as “bourgeois” in places where, ever since Rousseau and Marx roughed up the bourgeoisie, we would usually say “civic”.
Siemann points out how modern-minded Metternich could be. As a reward for his success at the Congress of Vienna, the Emperor had bestowed on him the glorious domain of Johannisberg, a former monastery overlooking the Rhine with a legendary vineyard. Instead of selling the wine by the barrel in the usual way and drinking the profits, he bottled it and sold it to selected grand clients, mostly monarchs. He also devised fancy armorial labels – Original Abfüllung der Fürstlich von Metternichschen Domäne – personally signed by the cellar-master. He was a pioneering high-end marchand de vins.
Similarly, when he got hold of another former abbey, at Plass on the edge of the Sudetenland, he lost no time in exploiting the estate’s extensive iron and coal deposits, using the latest cupola furnaces from England and patenting a machine for clearing tree trunks from the hillside. Soon the Plass works had more than 300 employees and were netting 280,000 guilders a year, enough to pay off most of Franz Georg’s debts all at once. To celebrate the arrival of a steamroller in 1854, the workforce choir sang an ode in Czech in Metternich’s honour, for he was a model employer, besides being a vigorous supporter of freeing up the land market from feudal restrictions. It is a piquant thought that Metternich’s efforts to industrialize Northern Bohemia (and indeed the rest of the empire) helped to make the Sudetenland so irresistible to Hitler.
Why then did this up-to-the-minute technocrat remain so stubbornly stick-in-the-mud in domestic politics? It is not as if he was unacquainted with democratic theory and practice. As a young man, he had fallen in love with England. “If I were not what I am, I would like to be an Englishman. If I could be neither the one nor the other, I would rather be nothing at all.” In London, he had made friends with Edmund Burke and Charles Grey. When Metternich, in his late seventies, was in exile in Brighton, Disraeli came and sat at his feet. Srbik claimed that it was not possible to say anything beyond “the level of the probable” about Metternich’s view of Burke. Cecil, too, says simply that “we cannot tell” whether Metternich inhaled Burke’s “matchless rhetoric”. But actually we can. Siemann finds in the library of Königswart, another of Metternich’s castles in Bohemia, a copy of Burke’s Reflections. Metternich has underlined several key passages, one about the nature of constitutional liberty, and also the celebrated riff about the maltreatment of Marie Antoinette, whose later execution, we know from Metternich’s outburst at the time, horrified him as much as it was to horrify Burke. And there is no doubt that he loved the English system of government. Coming back to England in 1848 after an absence of thirty-four years, he feels as if he had never been away: “This great country is the way it was, strong because of its unshakeable belief in the value of the law, of order, and the kind of freedom which, if it truly wants to exist, must be based on these foundations”.
On his first visit, his knowledge of England had been sketchy. He refers to Europe’s largest coachworks as “Hatched” (in fact John Hatchett of Long Acre), to “the Flemish artist John Bacon” who sculpted Chatham’s tomb in Westminster Abbey (Bacon was born in Southwark). The author of The School for Scandal would be surprised to hear himself described as “the Whig politician and poet”, as would Burke to hear himself called an “aristocratic Whig politician”, when he was a decidedly middle-class Irish lawyer who depended on aristocratic patronage for much of his political life. But there could be no doubt about Metternich’s enthusiasm for the hubbub of commercial London and the heated debates in Parliament, not least the trial of Warren Hastings, which he watched avidly.
Yet he remained obdurate against any suggestion that any Continental nation should take even a tiny step on the road to popular participation. The much-admired and imitated Spanish Cortes Constitution of 1812 he denounced as “the work of caprice or of a wild delusion”. In 1819, he offered to help King Friedrich Wilhelm III of Prussia to deal with dissenters but only on condition that he promised “not to introduce a representation of the people in your state, which is less suited to it than any other”. Siemann makes much of the elaborate plan for reorganizing the imperial Apparat, which Metternich presented to the Emperor on October 27, 1817 – and which the Emperor left lying on his desk for years, only toying with it again on his deathbed. Metternich himself in old age told his protégé Alexander von Hübner that “if the Emperor had adopted my ideas on the reorganization of the Diets, we would perhaps be in a better position to face the tempest”. True, Francis was ineradicably hostile to change, and Metternich was ultimately only his servant. But we cannot ignore the fact that Metternich never attempted to nudge him in the direction of popular representation. For all Siemann’s pleas that we should see Metternich as a reformer by stealth, the 1817 plan looks like a mega reshuffle of the bureaucracy across the constituent parts of the Empire, with no hint of democracy about it. In fact, Metternich told Hübner explicitly that the heart of his scheme, the new council sitting in Vienna, would be a council of the provincial estates, “and not a chamber of deputies, a Volkshaus, whose members would be elected”.
In Metternich’s view, the Germans couldn’t handle political parties or deal with a free press, although when in England he himself lapped up the newspapers and even founded one, the Spectateur de Londres, which flopped. There’s a weird kinship here between Metternich and the later Marx, who, after his years in London, reluctantly came to the conclusion that Britain, the United States and perhaps Holland might well evolve peacefully towards socialism, but that elsewhere on the Continent violent revolution was inevitable.
Hence the succession of repressive measures over thirty years – the “Metternich system” – most notably the Carlsbad Decrees of 1819 – of which he was rather proud: “a great act, the most important act of my life”. Here Siemann is indulgent to his subject, pointing out that Metternich was responding to vociferous public outcry after the murder of the pro-government poet August von Kotzebue by the firebrand Karl Ludwig Sand. All over Europe, the respectable classes were in a state of high panic that another great revolution was looming. In England at the same time, the government of Lord Liverpool was passing the no less repressive Six Acts. Both sets of laws were time-limited, but the Six Acts expired quicker and anyway they had been watered down by the Whigs.
By contrast, the Carlsbad Decrees seemed to become more stifling with time, and Metternich’s opposition to any form of democratic representation more entrenched. While Charles Grey, the friend of Metternich’s youth, was finally shoving the Great Reform Bill through Parliament – to Metternich’s dismay – the Austrian Chancellor was setting up his own secret police, the Mainzer Zentralpolizei. The family had form in this department. In 1810, Franz Georg had set up the empire’s first formal censorship bureau to cope with the secret agents that Napoleon left behind in all the larger cities he occupied. This new “political police” defined itself as “a counter-police which moved in step with the other police” – thus a forerunner of all modern counter-terrorism departments.
But Clemens’s objection to democracy went deeper than his agonized concern for Ruhe – the word always on his lips – calm/rest/peace. An elected central parliament could only be “a composition of individual, mutually alien and hostile deputies who are far from being united for the One purpose of being a state”. Democracy was “always a dissolving, decomposing, principle; it tends to separate men, it loosens society”. Echoes here of Rousseau and the ideal of organic unity within a state: for Rousseau, to be found in the Calvinist semi-theocracy of Geneva; for Metternich, only in the non-racial, multinational kaiserlich und königlich monarchy he served.
His obsession with Ruhe is not hard to understand. He had known little of it in his early life, starting with his education. Franz Georg was not such a stuffy character as you might think. He was a Freemason and a member of the Illuminati, and he chose as his son’s first tutor a fellow Illuminatus, Johann Friedrich Simon, a young Protestant pastor with advanced ideas, so advanced in fact that as soon as the French Revolution broke out, he shot off to join it, leading the storming of the city hall in Strasbourg, going on to style himself “the Grand Inquisitor of France at Mainz”, joining forces with Saint-Just in Paris and as a member of a revolutionary tribunal passing sixty-two death sentences. The next year, 1794, Franz Georg, the Emperor’s governor-general in the Austrian Netherlands, was forced to flee from Brussels for the second time when Austrian rule collapsed there. Soon Napoleon was chasing princes out of their palaces and estates all over Europe – the Metternichs included. New statelets were formed with idealistic republican constitutions only to have Napoleon’s siblings dumped on them as pop-up monarchs. After Austerlitz, the Austrian army was almost wiped out. As Metternich coolly remarked on his first great promotion four years later, “It is no minor task to be the foreign minister of Austria in 1809”.
He conceived that task in brutally undeceived terms: “From the day when peace is signed we must confine our system to tacking, and turning, and flattering. Thus alone may we possibly preserve our existence, till the day of general deliverance … for us there remains but one expedient, to preserve our strength for better days”. Accusations of inconsistency or weakness simply bounced off him. Between 1809 and 1813, he pursued the most brilliant defensive strategy in diplomatic history. Only after the French had lost 700,000 men in the snows could he steer Austria away from alliance with France and into the final great coalition to destroy Napoleon.
He was undeceived, but he was not cynical. His tears were not shed simply for the loss of his own family estates or the unhorsed monarchs. He admitted that he was cold-blooded and thought that he needed to be. But he was not short of human feeling. He loved his numerous children by his three wives. His first wife Eleonore was an heiress and, in Metternich’s rather chilling words, “was never pretty and she is charming only to close acquaintances”. He added – this was in a letter to one of his many high-born mistresses, the legendary Dorothea von Lieven – “Those who know her really well cannot fail to love her; the majority of people find her stiff, unpleasant, and this is just what she wants. There is nothing in the world I would not do for her”.
Except stay faithful. He took an unseemly delight in his reputation as a ladies’ man. You can almost hear his lips smacking as he wrote to Dorothea: “Many women have been mentioned in connection with me who were not even in my thoughts. By contrast, I have entertained relationships with many women that were anything but romantic, without the public ever learning anything about it”. Srbik pursed his lips at Metternich’s Weibergeschichten. Siemann is rather soppy about it all. But there is no doubt about the depth of Clemens’s grief at the deaths of his wives, all of whom he outlived, and several of his children.
Nor was he unmoved by the horrors of the battlefield. One of Siemann’s finest passages brings together Metternich’s laments for the terrible scenes he had witnessed rattling across Europe along roads littered with corpses. In 1815, following Karl Philipp Schwarzenberg’s army along the same route that Napoleon had taken after the Battle of Leipzig with the remnants of his army, Metternich wrote to his mistress Wilhelmine von Sagan:
You cannot take ten steps without coming across someone dead or dying, or a prisoner whose face looks worse than those of the dead. Dear God, this man has no right to reproach you, who sacrificed the blood of so many millions out of a vain feeling of misguided fame. How is it possible that having witnessed a scene like this, just once, he does not recoil from himself in horror! Napoleon has also strewn the road from Moscow to Frankfurt with wreckage.
Few men can have spent more time than Metternich in private talk with Napoleon, notably as foreign minister in Dresden when he was shuttling to and fro between the emperors. There in the Chinese room at the Palais Marcolini on June 26, 1813, they talked for eight or nine hours on the trot. By then, he had long foreseen that Napoleon’s desire for European domination would prove insatiable. Yet he always found that the Emperor’s conversation “had a charm that was difficult to define”. He had a way of stripping a subject down to its essentials, listening carefully and bringing the topic to a point, while fobbing off any attempt to pin him down to any specific commitment. As the Comte de Las Cases was to find on St Helena, you could say anything to him. Both men asked Napoleon why he told such lies in his military bulletins. To Metternich, Napoleon replied with a smile: “They are not written for you, the Parisians believe everything”. Then towards the end, he would launch into the most furious and intimidating tirade. At Dresden, Metternich pointed out to him how the nature of war had changed to become total war: “today, it is a whole people that you have called to arms”; mere children were being pressed onto the battlefield. Enraged, Napoleon retorted: “You are no soldier and you do not know what goes on in the soul of a soldier. I was brought up in military camps, and a man like me doesn’t give a fuck about the lives of a million men”.
Admirers of Napoleon, such as Andrew Roberts in Napoleon the Great, cast doubt on whether he actually made this last shocking remark, although Metternich records it in a handwritten note not for publication, “se fout de la vie”. Siemann points out that Napoleon had used very similar language in a talk with Schwarzenberg’s deputy, Count Ferdinand Bubna, a few weeks earlier:
A man who was a simple private person and has ascended [parvenu] to the throne, who has spent twenty years hailed with bullets, is not afraid of projectiles, he does not fear any threats at all. I do not value my life above all else, nor that of others very much. I do not waver to and fro to save my life; I do not rate it higher than that of a hundred thousand people. I sacrifice a million if necessary.
Metternich’s accounts of his talks with Napoleon recall Sir Eric Phipps’s accounts of his conversations with Hitler as Ambassador in Berlin between 1933 and 1937 (Our Man in Berlin, edited by Gaynor Johnson, 2008). There are the same apparent candour and approachability, even charm, the same refusal to be tied down to any international commitment, and then after the fobbing off comes the great rant, a mixture of aggression, self-pity and national resentments which forms a staged finale to the encounter.
Out of that quarter century of bloodshed and destruction, Metternich’s world view emerges, never to change for the remaining forty years of his life. It is permeated not only by his great longing for Ruhe but also by an ingrained suspicion of national passions. Democracy is to be distrusted because it legitimizes and inflames those passions, exciting racial paranoia and the urge for racial separation.
Even Metternich’s modern admirers find this attitude hard to swallow. Kissinger concludes that Metternich’s virtuoso performance was ultimately futile: “Unable to adapt its domestic structure, unable to survive with it in a century of nationalism, even Austria’s most successful policies amounted to no more than a reprieve, to a desperate grasping to commit allies, not to a work of construction, but to deflect part of the inevitable holocaust”. Schroeder, too, ticks off those who swallow their hero’s claims to modernity: “Metternich was basically a rigid absolutist whose political outlook was tied to a system of government and society which may once have had its grandeur and fitness, but which even by Metternich’s time was becoming outworn, and by our own is completely anachronistic”.
Sked, though, in his dazzling Metternich and Austria, denounces Kissinger’s views as “fantasy” and asserts that Schroeder “gets almost everything wrong”. The empire was not brought down by the aborted revolution of 1848 and was unhorsed only by the First World War. The Emperor continued to reign for another sixty years, as popular as ever. Railways were built, glass and chemical factories constructed. The censorship was annoying but understaffed and rather indulgent; ditto the police state. Nor was there much popular unrest; the only mass slaughter, in Galicia in 1846, was carried out by local Polish peasants in support of their emperor. Sentences of death were usually commuted, and the waltzing and winemaking continued unabated. At the very least, Sked declares, “Metternich by 1848 had had an exceptionally good run for his money”.
Ironically, Dr Sked’s own experience would have offered Metternich yet another opportunity to practise his favourite pastime of saying “I told you so”. It was Sked who in 1993 founded the United Kingdom Independence Party to take Britain out of the European Union in the interests of democracy and honest governance. In no time, he found to his horror that the new party had been invaded by out-and-out racists and hardliners from the far right. He resigned his leadership, but the party went on to get 3.8 million votes in the 2015 general election and to provide the essential impetus for Brexit, threatening not only to separate Britain from the Continent, with bitterness on all sides, but to break up her own multinational state, with potentially even more bitterness to come.
What modernity required, Metternich thought by contrast, was a drawing together of nations, a Völkerbund, to take the edge off national ambition. Europe needed not the sort of “deals” which could be torn up whenever it suited any of the parties but an abiding, law-governed structure of co-operation. This is of course exactly what Napoleon did not want, any more than Donald Trump does today. As Metternich told him in one of those frank exchanges: “your peace is never more than a truce”. The Grande Armée had to keep marching to pastures new to graze on.
For three decades after the Napoleonic wars, there were other leaders in Europe who mostly shared Metternich’s concerting instincts: Castlereagh, Aberdeen and Peel in Britain, Guizot and Molé and indeed King Louis Philippe in France, latterly Tsar Alexander and Nesselrode in Russia (who deserves to be remembered by more than a pudding in Proust). Then came Palmerston and the age of national assertion began all over again. If Metternich really had been born in 1900, I don’t think he would have enjoyed what he was destined to witness.
Ferdinand Mount is a former editor of the TLS. His most recent book, Prime Movers: From Pericles to Gandhi, was published last year
Re: 2019
"This war is not meant to be won. It is meant to be continuous." - George Orwell
"We haven't won a war since 1945 (with a lot of help), excepting the grotesque farce of Grenada (World War II was the last declared war, not coincidentally). We have been fighting in the Middle East, as we did earlier in Vietnam, to blow shit up and rebuild it (awarding many of the contracts for both to Halliburton) and to loot minerals and build up the opium trade in Afghanistan, just as we built up the heroin trade and imported it from Southeast Asia, etc. And fighting perpetual wars helps keep our authoritarian leaders in power at home restricting our civil liberties as well as looting the public treasury." - Joseph McBride
Re: Wind Timeline 1960 - 2018
https://outline.com/VNVMDM
THE OTHER SIDE OF THE WIND
Strange that the arrival of a storied, unfinished masterpiece by the notoriously talented Orson Welles—a director whose unmade films are just as legendary as the ones he did make—should feel like it came and went with barely any notice, except among critics and hardcore cinephiles. Yet that seems to be the fate of The Other Side of the Wind, which Welles filmed in bits and pieces between 1970 and 1976, and which remained incomplete upon his death in 1985. It was finished long after the fact by producer Frank Marshall (a production manager on the movie when Welles was still making it), Polish producer Filip Jan Rymsza, and editor Bob Murawski, and though there’s no doubt that the mad genius Welles might have assembled the movie differently, the raw material is strong enough—wild enough—that the resulting film is more than worth the wait.
It stars Hollywood maverick John Huston as, well, a Hollywood maverick—but one at the tail end of his career, at the tail end of his life, actually, when his reputation is flailing. His chief apprentice (played by Peter Bogdanovich) is shooting past him in success and importance; his final film project is an embattled mess. That’s not the accidental irony of history you’re sensing, by the way: the Welles of the ‘70s was very much a filmmaker in exile from Hollywood. And every gleeful turn here—from the gaudy, riotous movie-within-a-movie (a wink at the work of Michelangelo Antonioni) to the not-remotely-subtle forays into the daddy issues of it all—doubles as a send-up of Hollywood at large and as playful self-excoriation. No one knew how to lay it on thick quite as handily, or daringly, as Welles. This movie is yet more proof.
Re: Wind Timeline 1960 - 2018
The paradox of an atheist soul
Why the idea of a single self only makes sense in a theistic world.
BY JOHN GRAY
There are many arguments for theism, most of them not worth rehearsing. The ontological argument, first formulated by St Anselm in the 11th century and reframed by the 17th-century French rationalist René Descartes (1596-1650), maintains that God must exist because humans have an idea of a perfect being and existence is necessary to perfection. Since many of us have no such idea, it is a feeble gambit. The arguments of creationists are feebler, since they involve concocting a theory of intelligent design to fill gaps in science that the growth of knowledge may one day close. The idea of God is not a pseudo-scientific speculation.
A different and more interesting approach is to argue that theism is suggested by the fact that we experience ourselves as unified, conscious beings – in other words, as having a soul. Not necessarily an immaterial entity, the soul is the part of us that strives to realise what is best in our nature. We do not come to know the soul through any special revelation. We know it by considering the kind of creature we find ourselves to be – a thinking being inhabiting a life-world that seems to reflect a mind greater than our own. Once we realise we have a soul, theism becomes a credible way of thinking.
Such is the approach adopted in this lucid and illuminating book by John Cottingham, professor of the philosophy of religion at the University of Roehampton. Modestly described as an essay, Cottingham’s short study explores fundamental questions more fully than many much longer volumes. While it fails as an argument for theism, it is forceful and compelling in arguing that the idea of selfhood taken for granted in secular societies makes sense only in the context of a theistic world-view.
Cottingham presents a version of the transcendental argument deployed by the German Enlightenment philosopher Immanuel Kant (1724-1804). A transcendental argument does not appeal to anything factual. Instead, it asks what must be true if certain features of human experience are accepted as given. Kant used it to support his belief in a universal moral law and, at points in his writings, the existence of God. As used by Cottingham, its purpose is to refute the Scottish sceptic David Hume (1711-1776), whom Kant described as “having interrupted my dogmatic slumber”. In A Treatise of Human Nature (1739), Hume had written that the self is “nothing but a bundle or collection of different perceptions, which succeed one another with an inconceivable rapidity, and are in a perpetual flux and movement”. If the self is not an autonomous entity but an assemblage of sensations, Kant’s theistic faith crumbles into dust.
Cottingham spells out the connection between theism and the idea of the self:
It is a fundamental theistic belief, following the words of Genesis, that human beings are made “in the image” of God; and this is taken to be especially true in virtue of our conscious minds, in virtue of our attributes of intellect and will. Theism thus posits a source of ground of all being that is somehow mind-like: consciousness is taken to be at the heart of reality. The theistic picture tends to be discarded or ignored by the majority of contemporary philosophers, but it seems perverse to dismiss it from consideration should it turn out to fit rather well with certain aspects of reality that cannot in integrity be denied… [such as] the irreducible reality of consciousness.
Theistic religions are inherently anthropocentric. God is an infinitely enlarged projection of human personality. Yet many religions have understood God as an impersonal world-soul that may spawn souls that resemble human beings, but is itself remote from anything human. Other religions have done without any soul at all.
Older than Christianity and at least as philosophically sophisticated, Buddhism begins by rejecting the concept of the soul. The core Buddhist doctrine of anatta (no-self, no-soul) teaches that there is nothing in humans like a continuing identity. Popular Buddhism upheld older ideas of metempsychosis, according to which a soul is reincarnated after death. But in Buddhist philosophy, only a complex psychophysical process continues from death. Wherever it seems to exist, selfhood is an illusion.
The Buddhist view is similar to Hume’s: the apparent solidity of the self comes from the extreme rapidity with which one perception follows another. One of the goals of Buddhist meditation is to slow this process, so the practitioner can shed the illusion of selfhood. Some Western mystics have talked of the individual soul merging with a world-soul, but in Buddhism the idea of a world-soul is also rejected. Human salvation involves ridding oneself of any idea of the soul, human or divine. It is hard to think of a view more distant from the central traditions of theism.
When exploring the idea of the soul Cottingham says nothing of Buddhism, or any non-Western religion. He considers briefly a modern version of the denial of selfhood, which questions the idea that we should aim for narrative unity in our lives. Any such defence of the “episodic” or “happy-go-lucky” life, he tells us, “seems open to a swift and devastating rebuttal: lives of this episodic kind are possible only because others who are not leading happy-go-lucky lives are sustaining the stable relationships that make their easy-come-easy-go attitude possible”. He goes on to observe that advocates of the “episodic” life “tend to be drawn in the end to abandon the very idea of a self persisting over time… Yet the more we think about this, the more it starts to look like a fantasy of evasion.”
Cottingham’s rebuttal may be swift but it is hardly devastating. An episodic view of human life is the claim that no persisting self is revealed in the course of our lives. Certainly there are patterns of continuity in memory and behaviour, but these marks of selfhood shift and fade in lives that are long or varied.
The life of Bertrand Russell (1872-1970) was both. He writes in his autobiography that when he looked back he found not a single person but something more like a club whose members changed over time. The solitary, rationalistic and rather puritanical self of Russell’s late Victorian youth was not the self that flirted with mysticism as he fell unhappily out of love with his first wife. Nor was it the self that emerged from a spell in prison for pacifist resistance against the First World War, after which his interests shifted from mathematics and logic to politics, and he travelled to Lenin’s Russia and war-torn China. Still less was it the self that married three more times and had countless affairs. Reflecting on his life, Russell found no enduring selfhood.
Of course, Russell was exceptional in many ways. But an episodic life featuring a succession of disparate selves captures the experience of many people better than any story of the continuous unfolding of an autonomous individual. The selfhood that some find throughout their lives is a by-product of stability in society, which rarely lasts for very long. War, revolution and social breakdown regularly overwhelm the sense of being a person with a coherent life-story. A unitary self is a fantasy that can be enjoyed only in peaceful times.
***
Cottingham’s dismissal of the episodic self is of a piece with his theory of value. He points out that morality makes demands on us we are compelled to recognise, whether we like them or not. Accordingly, human values reflect “an objective, authoritative demand” imposed on us by what he describes as “strong normativity”. No doubt many people experience morality in this way, but that does not mean values are objective. Their apparent objectivity is a projection from the ways of life by which human beings are formed. There are many moralities, each experienced as compelling by their practitioners.
Even within the Western tradition, as Tom Holland showed in Dominion: the Making of the Western Mind, there are enormous moral gulfs. The Iliad knows nothing of forgiveness, nor does Aristotle’s Ethics of humility. Self-sacrifice figures nowhere in the Epicurean pursuit of tranquil pleasure, nor does concern for the downtrodden and forgotten in Stoicism. Our revulsion at the gladiatorial games of ancient Rome does not come from any inbuilt repugnance at the spectacle of human suffering and violent death. There is no sign that those who watched the games felt any such revulsion. Nor is there much evidence from that era that slavery was felt to be inherently wrong. The repugnance we feel for these practices is an inheritance from Jewish and Christian ideas of human dignity and equality.
In this and other cases, what liberal humanists believe to be universal values are relics of particular religious traditions. Here Nietzsche was right. Human values are too changeable, and too divergent, for morality to be in any meaningful sense objective.
Philosophy did not cease to be a handmaiden of theology when it abandoned theism. In his monumental study On What Matters (2011), the late Oxford philosopher Derek Parfit tried to show that some things really, truly, deeply matter; that there are moral truths that give us reason to act and live in certain ways. But in what kind of world could such truths exist and possess authority over human beings? Without theism, or some Platonic spiritual realm, these supposed objective values are left hanging in the void.
Human values need not be exclusively human. As Darwin implied in The Expression of the Emotions in Man and Animals (1872), our moral responses can be traced back to our evolutionary kin. Without having the kind of self-conscious awareness humans intermittently display, other animals inhabit life-worlds and pursue distinctive ideas of good. Cottingham comes close to denying this evident fact, speculating that without human beings the world would be “blank, silent, dark”. He writes that “no animal could formulate the Cartesian idea of ‘this me by which I am what I am’”. (It is unclear what this is meant to prove. Possibly it shows only that other animals are not as prone to delusion as human beings.) But there is nothing to suggest values subsist in a realm beyond that of living organisms.
Secular thinkers who cling to the idea of human autonomy have not shaken off theism. Cottingham writes, “The contrast between the theistically inspired and the post-Enlightenment conceptions of the role of the self could not be more marked.” Actually, the opposite is the case. As Cottingham acknowledges a page later, it was Kant – a lifelong Christian – who asserted the prototypical Enlightenment belief in “independent human rationality and autonomy”. The belief that human beings are essentially autonomous agents is the theistic myth of the soul reiterated in rationalist terms.
When philosophers deploy transcendental arguments it is in order to deduce themselves as they imagine themselves to be. Because Cottingham argues that the experience of selfhood points in the direction of theism, secular readers may decide that his essay is not for them. But if they do they will be mistaken, for the sense of self he invokes is their own.
In Search of the Soul: A Philosophical Essay
John Cottingham
Black Irish
- Wellesnet Veteran
- Posts: 317
- Joined: Thu Aug 02, 2012 10:07 pm
Re: Wind Timeline 1960 - 2018
Orson Welles - What now?
Here are 12 projects offhand that could and should be done.
1. THE FOUNTAIN OF YOUTH
2. A GOOD DON QUIXOTE DOCUMENTARY
3. A FAREWELL TO ARMS (radio broadcast)
4. THE DEEP (some kind of completion)
5. MAGIC SHOW COMPILATION (or documentary)
6. CHIMES AT MIDNIGHT IN STEREO
7. THE NIKKA WHISKEY COMMERCIALS
8. ORSON'S BAG
9. COMPILATION OF WELLES'S TV APPEARANCES
10. PUBLICATION OF MORE SCREENPLAYS
11. PORTRAIT OF GINA
12. 1961 DOCUMENTARY ON BULLFIGHTING