Cap and gown, helmet and uniform

Anthropologist David Price contributes an article (subscriber-only link, sorry) to the latest issue of Counterpunch.  Under the title “Resistance’s Half-Life: Militarization and the Growing Academic Silence,” Professor Price contrasts the widespread refusal of American anthropologists to join military-sponsored research projects during the 1960s with the far more compliant attitude of their counterparts today.  Professor Price’s narrative begins in 1965, when sociologist Johan Galtung, then director of  the Institute of Peace Research in Oslo, publicized Project Camelot, a plan under which social scientists would work under the direction of US military and intelligence officials to produce a study of insurgent movements and counterinsurgent operations in Latin America and elsewhere.  In response to Professor Galtung’s efforts, both Latin American public opinion and US academic associations demanded, and received, official assurances from the Johnson administration that Project Camelot would be canceled and that the warmaking organs of the Washington regime would not use scholarly research as a pretext for activities “which in the judgment of the Secretary of State would adversely affect United States foreign relations.”

Later attempts by the military and intelligence agencies to press social science into the service of covert operations met with equally strong resistance.  Professor Price illustrates the resistance that defense contractors were likely to encounter from social scientists with a series of highly amusing quotations from an exchange between sociologist Pierre van den Berghe and the late Hans Weigert, in which Professor van den Berghe patiently explains why he would regard it as unethical for a scholar to conduct intelligence work for the United States in the Congolese Republic, while Weigert responds with name-calling.

Professor Price reports:

Because I have written about the militarization of anthropology since the mid-1990s, after the post-9/11 recruitment renaissance began, I often received copies of recruitment emails forwarded to me along with the angry replies that scholars had sent to the unwanted solicitors. I have a file of these forwarded angry replies from 2004-2008, when these feelers from the military and contractors were seen by many as shocking. Sometimes a single recruitment emailing would be forwarded to me by a dozen concerned scholars. These were then new, previously unthinkable proposals, shocking that they were made so openly and broadly circulated. In many cases, the approached anthropologists vented spleen in ways reminiscent to Van den Berghe’s above response, giving history and ethics lessons to would-be recruiters – who I’m sure generally did not read past the first few lines of anger and deleted the replies, or perhaps deleted the sender from an e-list. Certainly no minds were changed from these responses, but the reaction measured the outrage many anthropologists felt over these disciplinary border intrusions. In some instances it is possible to deduce having obviously taken the contract.

In the last four years, these messages have ceased to come Professor Price’s way.  He draws an ominous conclusion from this silence.  US society has become thoroughly militarized; “there has been a shift in the acceptance that these military and intelligence intrusions into our daily lives are now a normal feature of our world. These military advances into academia have become regular features of our social fabric. These are the social facts of a militarized society.”  Perhaps it no longer occurs to scholars that they have an obligation to something other than the dictates of the national security apparatus.

Professor Price quotes a phrase coined by anthropologist Catherine Lutz: “the military normal.”  Professor Price describes the military normal as “the ubiquitous spread of the military into all aspects of American daily life and consciousness, advancing at such a rate that we internalize the militarization of everything from police departments, hiring practices, educational processes, discussions of healthcare, workplace regimentations, to an extent where the militarization of everything becomes a normal part of our cultural fabric in ways we hardly notice anymore.”  Professor Lutz herself described it in these terms in the abstract of the paper where she introduced the phrase:

Prevailing mainstream media discussions of the counterinsurgency wars in Iraq and Afghanistan have a deeply restricted kind of range, focusing on how the wars are being fought, or should be fought – with what tactics, for how long, and with what level of “success.” The pundits, with the populace in tow, debate whether the military is stretched too thin, well-enough resourced or not, or in need of tens of thousands more troops to do the job. They do not ask more fundamental questions about the US military, history’s most powerful and most globally expansive in its positioning. This talk considers the emergence of what can be called the military normal in World War II and its wake, the contemporary political economy of the military, as well as the cultural understandings that currently legitimate it.

Professor Price complains of a growing silence that has resulted from the rise of “the military normal.”  Both of these descriptions make it clear that the silence is a natural consequence of this process.  The heart of the process itself, however, is the reverse of silence.  Even if scholars speak out against them, the military and the intelligence agencies can carry on their operations, and the moneyed elite that controls the US political system can reap profits from those operations untroubled by public opposition, so long as the public is not in the habit of listening to critical voices.  Silence is what we experience when we listen in quietness; what our warlords wish on us is not quietness, but noise, constant, deafening noise, noise sufficient to knock all impertinent questions and inconvenient qualms out of our heads.  Cable television, talk radio, the internet, and other outlets of prefabricated opinion produce a great deal of noise, and often suffice to drown out the unfamiliar voices that present us with complex, closely reasoned, ethically challenging arguments.

Surely, however, that sort of noise is not adequate by itself to drive scholars to abandon ethical standards based on ideals of disinterested inquiry and service to a truth that exists independently of national allegiance or corporate profit, and to take up positions as functionaries of a warmaking regime.  A different kind of noise is necessary to bury those ideals so deeply that they no longer trouble the mind of the potential recruit.  Professor Price touches on this kind of noise at the end of his article.  Listing the developments that have discouraged scholars from holding to principles that would lead them to refuse war contracts and speak out against them, he includes “three decades of neoliberal programs’ impacts on student loan debt, campus austerity programs, and new promises of military funding.”  Scholars working in American universities from the 1960s through the 1990s may have had many realistic possibilities of making a living.  A scholar who would not subject his or her research project to the warmaking ambitions of the power elite might in those days have been confident that other, more peaceful opportunities would present themselves.

Today, the noise that rings through the halls of the American academy is the noise of desperation.  Every year, graduate schools produce more Ph. D.s; virtually every year, universities hire fewer faculty members.  The newly minted doctors of philosophy generally enter the glutted labor market saddled with tens or even hundreds of thousands of dollars of student loan debt.  Therefore, the alternative facing an academic today is rarely between ethically acceptable and ethically unacceptable work.  Rather, the academic must choose between, on the one hand, making himself or herself agreeable to whoever might be in a position to grant the favor of a career and, on the other hand, vanishing from the academic world and sinking into a life of poverty.   When noise like that is battering away at one’s mind, it can be difficult indeed to hear the voice of conscience.

Ancient Regime

Shortly before the stock markets closed yesterday afternoon, the US Supreme Court announced a ruling on the so-called “Affordable Care Act” (also known as ACA.)  Health care stocks generally rose on the news of the ruling, in some cases sharply, while shares in health insurers showed a mixed reaction.  Today, the trend has been slightly downward across the board.

A majority of the US Supreme Court held that the US government does have the power to compel citizens and other residents of the USA to buy health insurance.  While the court rejected the Obama administration’s argument that this power, the core of the law, was within the scope of the authority the Constitution grants the federal government to regulate interstate commerce, it concluded that, because the law is to be enforced by the Internal Revenue Service in the process of collecting taxes, it is supported by the government’s authority to levy taxes.

In effect, the law establishes a tax that will be paid directly to health insurance companies.  US residents who refuse to pay this tax will be assessed an alternative tax, one paid to the treasury.  As written, the statute did not include the word “tax,” speaking instead of “premiums” and “penalties.”  These words are euphemisms.  This is clear not only from the Supreme Court’s legal reasoning, but also from the most basic economic logic.  A law which directs people to dispose of their wealth in a particular way to advance a particular set of policy objectives is a tax, whatever label marketing-minded politicians may choose to give it.

Many opponents of the ACA have spoken out against the idea of a tax directly payable to private citizens.  For example, today on the Counterpunch website Dr Clark Newhall complains that the bipartisan Supreme Court majority represents “Corporatists United.”  Dr Newhall denounces the statute and the ruling in strong terms.  I would like to quote three passages from Dr Newhall’s piece and add my own comments to them:

In an eagerly anticipated opinion on the Patient Protection and Affordable Care Act, colloquially known as ‘Obamacare’, an unusual alignment of justices upheld the Act nearly entirely.  The crucial part of the decision found the ‘odd bedfellows’ combination of Chief Justice Roberts joining the four ‘liberal’ justices to uphold the ‘individual mandate’, the section of the law requiring all Americans to buy health insurance from private health insurance companies…

Many supporters of the ACA object to the term “Obamacare.”  The law was crafted on the model of a regime of health insurance regulations and subsidies enacted in Massachusetts in 2006.  That regime is widely known as “Romneycare,” in honor of Willard M. Romney (alias “Mitt,”) who, as Massachusetts’ governor at the time, had been its chief advocate.  So calling the federal version “Obamacare” is simply a matter of continuing to follow the Massachusetts model.  Now, of course, Mr Romney is the Republican Party’s choice to oppose Mr Obama in this year’s presidential election.  Therefore Mr Romney and his surrogates are creating much merriment for political observers by trying to attack the president’s most widely-known legislative achievement, which as it so happens is identical to Mr Romney’s most widely-known legislative achievement.

Dr Newhall goes on:

Those who make, interpret and enforce the laws no longer lie on the ‘left-right’ political continuum. Instead, they are in effect at ‘right angles’ to that continuum.  The ideology that drives the Supreme Court, the political administration and the Congress is not Conservative or Liberal but can best be described as “Corporatist.”  This is the ideology that affirms that “corporations are citizens, my friends.”  It is the ideology that drove the Roberts Court to the odious Citizens United decision.  It is the ideology behind a bailout for banks that are ‘too big to fail.’  And it is the ideology that allows Congress to pass a law like the ACA that is essentially written by a favored industry…

It seems to me very clear that what Dr Newhall means to evoke in these sentences is the spectre of fascism.  During the 1930’s, fascists in Italy, Britain, Belgium, and several other countries used the words “fascism” and “corporatism” interchangeably, and economic historians still cite Mussolini’s Italy, and to a lesser extent Hitler’s Germany, as examples of corporatist economics in practice.  The American diplomat-turned-economist-turned-journalist-turned-pariah Lawrence Dennis argued in a series of books in the 1930’s that laissez-faire capitalism was doomed, that state ownership of industry was a dead end, and that the economic future of the developed world belonged to a system in which the state coordinated and subsidized the operations of privately-owned corporations.  The most famous of the books in which Dennis endorsed this system was titled The Coming American Fascism.

Not only the word “corporatism,” but also the image of a ruling elite “at right angles” to the old left/right politics might well remind readers of fascism.  The fascists continually claimed to represent a new politics that was neither left nor right; while such anticapitalist fascist tendencies as il fascismo della sinistra (the “fascism of the left”) or Germany’s Strasserites were not markedly successful in the intra-party politics of fascist movements,* all fascist parties used anticapitalist rhetoric from time to time (think of the “National Socialist German Workers’ Party,” and of Joseph Goebbels’ definition of revolution as a process by which the right adopts the language and tactics of the left.)  Moreover, the image of “left” and “right” suggests that political opinions form a continuum that stretches from one extreme to another, with any number of points in between.  That in turn suggests that people who disagree may have enough in common with each other that their conflicts may be productive.  Fascism, on the other hand, demands a one-party state in which a single ideology is imposed on everyone.  Fascism finds nothing of value in political conflict, and strives to annihilate disagreement.  I think that’s what the late Seymour Martin Lipset was driving at in his book Political Man when he placed most fascist movements, including the Italian fascists and German Nazis, not on the far right, but in the “Radical Center.”

Counterpunch is edited by Alexander Cockburn, who recently declared that the United States of America has completed its transition to fascism.  So it would not be surprising if by these remarks Dr Newhall were insinuating that the ACA is fascist in its substance.  I would demur from such an assessment.  Before I can explain why, permit me to quote one more paragraph from Dr Newhall’s piece:

Why does Corporatism favor Obamacare?  Because Obamacare is nothing more than a huge bailout for another failing industry — the health insurance industry.  No health insurer could continue to raise premiums at the rate of two to three times inflation, as they have done for at least a decade.  No health insurer could continue to pay 200 million dollar plus bonuses to top executives, as they have done repeatedly.  No health insurer could continue to restrict Americans’ access to decent health care, in effect creating slow and silent ‘death panels.’  No health insurer could do those things and survive.  But with the Obamacare act now firmly in place, health insurers will see a HUGE multibillion dollar windfall in the form of 40 million or more new health insurance customers whose premiums are paid largely by government subsidies.  That is the explanation for the numerous expansions and mergers you have seen in the health care industry in the past couple of years.  You will see more of the same, and if you are a stock bettor, you would do well to buy stock in smaller health insurers, because they will be snapped up in a wave of consolidation that dwarfs anything yet seen in this country.

Certainly the health insurance industry was in trouble in 2009, and the ACA is an attempt to enable that industry to continue business more or less as usual.  In that sense, it is a bailout.  Indeed, the health insurance companies are extremely influential in both the Democratic and Republican parties, and there can be little doubt that whichever of those parties won the 2008 elections would have enacted similar legislation.  Had Mr Romney been successful in his 2008 presidential campaign, doubtless he would have signed the same bill that Mr Obama in fact signed.  The loyal  Democrats who today defend the ACA as a great boon to working-class Americans would then be denouncing it in terms like those Dr Newhall employs, while the loyal Republicans who today denounce the ACA as a threat to the “free-enterprise system” that they fondly imagine to characterize American economic life would then defend it on some equally fanciful basis.

In a deeper sense, however, I disagree with Dr Newhall’s assessment quite thoroughly.  A moment ago, I defined taxation as any law that requires people to dispose of their wealth in particular ways to advance particular policy objectives.  If we think about that definition for a moment, we can see that the United States’ entire health insurance industry exists to receive taxes.  In the USA, wages paid to employees are subject to a rather heavy tax called FICA.  Premiums that are paid for employees’ health insurance policies are not subject to FICA, and so employers have an incentive to put a significant fraction of their employees’ compensation packages into health insurance premiums.  Since the health insurers have been collecting taxes all along, it is quite misleading to call the ACA a bailout.  It is, rather, a tax increase.

Now, as to the question of fascism, certainly fascist regimes did blur the line between the public and private sectors.  The most extreme case of this was of course the assignment of concentration camp inmates as slave labor for I. G. Farben and other cartels organized under the supervision of the Nazi state.  So it would not have been much of a stretch for fascists to grant corporations the power to collect taxes.  Even if they had done so, however, fascists could hardly claim to have made an innovation.  Tax farming, the collection of taxes by private-sector groups in pursuit of profit, was the norm in Persia by the sixth century BC, and spread rapidly throughout the ancient world.  In ancient Rome under the later Republic, tax farming proved itself to be a highly efficient means of organizing tax collection. So the fact that tax farming is one of the principal aspects of the US economy is not evidence that the USA is a fascist or a proto-fascist regime.  Indeed, the fact that the Supreme Court seriously considered a case that would have challenged the legitimacy of tax farming is an encouraging sign, however unedifying the opinions that the court issued as a result of that consideration might be.

Of course, in the ancient world tax farmers bid competitively for the right to collect taxes, and the winners put their bids into the public treasury.  In the USA, there is no such bidding, and no such payment.  Instead, wealthy individuals and interest groups buy politicians by financing their campaigns and their retirements.  Perhaps we would be better off to adopt the ancient system.

At any rate, “fascism” seems a misnomer for our economic system, almost as misleading as “free enterprise” or as anachronistic as “capitalism.”  A more accurate term, at least as regards the components that are dominated by tax farming, would be neo-feudalist.  The US political class is increasingly an hereditary class; Mr Obama defeated the wife of a former president to win his party’s nomination to succeed the son of a former president, and now faces the son of a former presidential candidate in his campaign for a second term.  This hereditary nobility will now sit atop a system in which the non-rich are legally obligated to pay tribute or provide service to those in power in the land, who will in turn honor certain obligations to them.

*Fascism being what it was, “not markedly successful in intra-party politics” often meant “shot several times in the head and dismembered,” as happened to Gregor Strasser.

Some points to consider when deciding how to vote

This morning a story went out on the Associated Press wire that appeared in American newspapers under titles like “Undecided voters may sway presidential election.”  These two paragraphs got me thinking:

“I don’t believe in nothing they say,” says Carol Barber of Ashland, Ky., among the 27 percent of the electorate that hasn’t determined whom to back or that doesn’t have a strong preference about a candidate.

Like many uncommitted voters, Barber, 66, isn’t really paying attention to politics these days. She’s largely focused on her husband, who just had a liver transplant, and the fact that she had to refinance her home to pay much of his health bill. “I just can’t concentrate on it now,” she says before adding, “If there were somebody running who knows what it’s like to struggle, that would be different.”

It takes a bit of imagination to think of ways the U. S. political system might be reformed so that a person could go from Ms Barber’s current position to the presidential nomination of a major party.  While President Obama as a child lived for a time in a household eligible for food stamps, and as recently as 1996 both major parties nominated candidates who had begun their lives in very modest economic circumstances, by the time each of those men entered his thirties he had risen well into the upper middle class.  It isn’t to downplay the challenges that faced the poor children Bill Clinton, Bob Dole, and Barack Obama once were that I point out that none of them ever had to keep a gravely ill spouse alive by taking on substantial debt at a time when he likely believed that his working days were numbered.

I could suggest some reforms that might empower people like Ms Barber.  Among those suggestions would be the devolution of as many legislative powers as possible to neighborhoods and other localities small enough for all citizens to assemble in face-to-face meetings, and of executive powers to boards of citizens chosen by lot.  Such a system worked quite well in ancient Athens, and when systems like it are given a chance they work well in the modern world.   However,  I doubt that such reforms will be adopted any time soon.  So, granted that we are stuck with a system in which politics is conducted on a continental scale and the average citizen can signal her or his policy preferences only by voting in occasional elections, what questions should we ask as we decide how to vote?

I agree with Ms Barber that we need people in politics who can see the world from some point of view other than that of the moneyed elite among whom presidential candidates typically move.  I’d add that people like Bill Clinton, Bob Dole, and Barack Obama may be the last people we should expect to adopt such a point of view.  A man who rose from a childhood of poverty and obscurity to wealth and power is likely to have learned two lessons from the experience: first, that it’s no fun to be poor; second, that the way out of poverty is to make oneself useful to the rich.  Such politicians may be able to empathize with the non-rich, especially the very young among them, but they are the very last people we would expect to go out on a limb for the sake of people who are not in a position to advance their careers. And people who have been anything other than rich as adults are simply not going to have the resume that people expect of presidential candidates, let alone have the connections to organize a viable national campaign.

So, if the candidate’s personal experience of economic or other hardship is not a major criterion to use in deciding how to vote, what is?  I brought up the 1996 presidential campaign not only because Ms Barber’s remark reminds me of the Clinton-Dole pairing, but also because during that campaign I read a magazine article that has helped to clarify my political thinking ever since.  Written by David Samuels, it was titled “Presidential Shrimp: Bob Dole Caters the Political Hors d’Oeuvres” and appeared on pages 45 through 52 of the March 1996 issue of Harper’s Magazine (volume 292, number 1750.)  Subscribers to Harper’s can access the article online here; I stopped subscribing to it years ago, and of course I don’t keep 16-year-old magazines around the house, so when I read Ms Barber’s remark this morning I had to take a trip to the library to track the article down.

One night in December, 1995, Mr Samuels’ press credentials gained him admittance to a fundraising dinner for the Dole campaign.  The dinner, held at the Sheraton Hotel in Boston, was organized by a group of Massachusetts businessmen, among them “Mitt Romney, the Mormon banker who nearly knocked off Ted Kennedy in the Senate race here in 1994.”  It’s a bit misleading to say that Mr Romney “nearly knocked off Ted Kennedy” in that race; though an early poll or two had given Mr Romney a narrow lead, at the end of the day Mr Kennedy was reelected by a margin of 58% to 41%, hardly a squeaker.  Nor is it accurate to call Mr Romney a banker; as a private equity operator, he borrowed a great deal of money, though he neither lent money nor held it in trust in the way banks do.   Be that as it may, it was a bit of an uncanny moment to see his name in an article from so many years ago that I was looking up for insight into an election in which he is one of the leading candidates.  What they call an “Eldritch moment,” I suppose.

Mr Samuels used vignettes from that dinner to illustrate several points about how U. S. political campaigns operated in those days.  After listing many of the major donors in attendance, Mr Samuels writes: “If Bill Clinton is the candidate of high-wage, capital-intensive business (investment banking, high tech, and entertainment), Dole looks increasingly like the candidate of low-wage, labor-intensive retail, manufacturing, and small business” (pages 49-50.)  Nowadays, a candidate with a donor profile dominated by retail, manufacturing, small business, and agribusiness concerns would be unlikely to advance as far as Mr Dole did; as Tom Frank demonstrates in his recent book Pity the Billionaire, it is precisely these groups that have funded the “Tea Party.”  Despite the headlines that tendency generated, it certainly did not represent much of an inconvenience for Mr Romney’s finance-capital-backed march to this year’s Republican presidential nomination.

Mr Samuels describes Mr Dole’s public persona in a way that rings true to me: “[T]here is something appealingly adult about Dole’s performance.  As he smirks and blinks, and tramples on his applause lines, it is not hard to imagine some kind of fundamental honesty that prevents him from pulling out all the stops and putting on the expected show.  Dole’s best lines, his best moments in the Senate, have in common a weary and knowing respect for his audience.  The very depth of Dole’s cynicism can even translate as charm: ‘I’m not going to lie to you’ is one of the few lines that the senator delivers with any conviction in public, not because Dole doesn’t lie but because, unlike so many politicians, he is at least aware that he is lying” (page 51.)  As a connoisseur of world-weary cynicism, my favorite moment of the 1996 campaign came when Mr Dole, expected to repeat his campaign slogan “Bob Dole. A Better Man.  For a Better America,” said “Bob Dole.  Better man with a better plan.  Or whatever.”  The man had such contempt for the process that he couldn’t be bothered to memorize his own slogan.  That almost made me want to vote for him.

This image of Bob Dole as a man who “is at least aware that he is lying” inspires Mr Samuels to a flight of political science fiction: “In a rational political system, of course, geared to show off the strengths of the two opposing candidates for the highest office in the land, Bob Dole would be allowed to go on television and explain to the voters who is supporting him (and why,) who is supporting Bill Clinton (and why,) and encourage the voters to choose between them based on this practical knowledge.”  Mr Dole’s “weary and knowing respect for his audience” made it possible to imagine him operating under those conditions.  I can almost hear his voice saying “I represent a consortium of investors drawn from private equity, agribusiness, trucking, manufacturing, and retail.  They want a capital gains tax cut, managed trade deals like NAFTA, subsidies for exports, a rollback of workplace safety standards, and lax enforcement of securities regulations.”

I don’t disagree that our evaluation of the opposing candidates should begin with consideration of their sponsors and of what those sponsors expect in return for their investment.  But it mustn’t end there.  In a two-party system, we not only elect one party to fill an office, we also elect the other party to serve as the opposition.  So we should consider each party not only by the potential office-holders it offers us, but also by its likely effectiveness as an opposition party.  The first presidential election in which I voted was 1988.  I remember one afternoon that autumn when I read literature from the campaigns of George H. W. Bush and Michael Dukakis.  The more I read, the less appealing either of them looked.  The next day, I was walking to a class when it occurred to me that whichever of them was elected, Congress would rewrite any proposals he sent them.  That struck me like a thunderbolt.  Suddenly it was obvious to me that a President Dukakis would be in no position to enact the parts of his platform I disliked, while the President Bush we actually ended up with would have a relatively easy time enacting his very worst ideas.  So it was easy for me to vote for Mr Dukakis.

Moreover, while it is undoubtedly true that the people who provide the money for a campaign set the boundaries to the policies the candidate can espouse, that campaign must also enlist the support of groups that provide little money but many votes.  So, our parallel universe Bob Dole would tell us not only what his sponsors expected in return for their money, but also what they had authorized him to offer to constituency groups whose support he needed.  For example, none of his principal backers had a financial stake in the abortion-rights debate, yet Mr Dole adopted a rigidly anti-abortion line in preparation for the 1996 campaign.  A Republican candidate who failed to do so at that time would have lost his hold over voters without whose support he would have had no chance at all in the Midwestern states where presidential elections are usually decided.  A pro-choice Bob Dole would have been a certain loser and therefore an extremely poor investment.

So, when we elect a president, we elect three things: we elect a consortium of investors to serve as the president’s de facto Executive Council; we elect the other party as the official opposition; and we elect the most volatile constituency groups within the president’s coalition to a position in which they have a veto over executive action.  Notice, it is not the largest groups backing the president that hold this veto; it is the groups whose support the president cannot take for granted and must earn.  Therefore, when we choose a presidential candidate, we should do so because we see a way in which the economic interests of that candidate’s backers will promote the national interest as we understand it; because the other party, as the opposition party, is able to block the worst aspects of our candidate’s agenda and unable to block some of its best aspects; and because our votes, coming from us as members of particular constituencies,  are unlikely to send a signal that the candidate’s party can take our support for granted.

Mr Samuels, writing more than 16 years ago, noted that wealth was rapidly becoming more concentrated in the USA: “That the economic program of the new Democratic financiers may also imply the continuing hemorrhage of American jobs abroad is of little concern to those who pay the party’s bills today: with the rich getting richer and the poor getting poorer at unprecedented speed, the first term of the Clinton presidency bears an alarming resemblance, in its effects if not in its tone, to that of Ronald Reagan” (page 48.)  In the years since, this process of concentration has reached fantastic levels, as the financial sector’s elite has pulled away from every other group.  Mr Samuels describes scenes in which manufacturing bosses join the likes of Mr Romney and other financiers as the senior-most figures at the top table.  Today presidential candidates treat the heads of manufacturing businesses the way they treat disabled children,  seating them at the dais when they plan to introduce them as inspiring examples of what is still possible in America.  “And they are going to keep that factory and those jobs right here in the USA!,” applause, applause.

As the number of people who qualify as truly rich and the range of fields in which their fortunes are amassed shrink, the universe of moneymen who can finance national campaigns shrinks even more rapidly.  It shrinks not only in number, but also in the variety of interests it represents.  This shrinking variety has three major consequences.  First, the differences between the major parties fade into irrelevance as they come to depend not only on consortia of investors who are equally rich, but on consortia that are drawn from the same sectors and that massively overlap in membership.  Second, the likelihood grows that the moneyed elite, small as it is and detached as it is from any but a tiny handful of concerns, will become bizarre, absorbed in ideas that may come naturally to its members for some economic or other reason, but which have no relevance to the public at large.  Third, the less rapport there is between an elite and the public it governs, the more repressive its government is likely to be.

These three processes are all well advanced in the USA.  For evidence that the differences between the parties are fading into irrelevance, consider the unprecedented level of legislative and executive activity in Washington in the last twenty years.  Contrary to the weirdly fashionable complaint that national politics is mired in gridlock, the Congress has in these last decades appropriated money by the trillion, cut taxes by the trillion, condoned the printing of dollars by the trillion, deregulated entire industries,  required citizens to pay taxes directly to corporations in favored industries, established massive new agencies, started several wars of aggression, and granted the president unrestricted power to monitor, detain, torture, and kill whomever he pleases.   Granted, politicians running for reelection rarely point to any of this activity as an achievement of which they are proud, and not one item of it enjoys the support of even a plurality of voters, let alone a majority.  But it certainly constitutes extreme productivity, and every part of it was enacted with broad bipartisan support.  The unpopularity of this formidably efficient bipartisan cooperation attests to the detachment of the moneyed elite that sponsors both parties from the life of the country more generally.  The legislation that Presidents George W. Bush and Barack Obama have signed granting their office the powers of a police state shows that the donors behind both men see the nonrich public as a source of danger to their position and want to give their political agents the means to intimidate it into silence.

Even when there is a functioning avenue of communication between the elite and the rest of us, minor parties are essential to a two-party system.  Voters who decide that the party they usually support has become too different from the other party can signal their displeasure by crossing over to support the other party.  But in the absence of minor parties, voters who decide that their party has become too much like the other party have no effective way to signal their opinion.  Abstaining can send that message, but may not give the party a clear incentive to alter its behavior.  Given a choice between continuing to do what they have been doing and holding on to whatever success they have already gained or changing their approach in hopes of bringing nonvoters back to the polls, surely it would be a rare leadership cadre that would take the path of high risk.

When the ruling elite has drifted as far from the voting public as it has in the USA, the role of minor parties is crucial.  The only party that will resist the excesses of the elite, let alone embrace a program that may reverse the centralization of power in ever fewer hands, is one that faces certain defeat otherwise.  The Republican Party draws its base of support from voters who are comfortable with hierarchy; it is therefore unlikely to become the vehicle for such resistance.  The Democratic Party absorbs the votes of people who want to create a more open political system; if that is the goal, it is therefore necessary either to wrest control of the Democratic Party from its current sponsors, or to destroy it and make way for a new party that will rise to that challenge.  Therefore, I will cast my ballot for Rocky Anderson for president.

Counterpunch, 1-15 March 2012

In the latest issue of Counterpunch, JoAnn Wypijewski tells the story of Keith Jennings, a resident of Stony Ridge, Ohio.  Mr Jennings couldn’t keep up with his house payments, so the bank owns it now.  He has responded to this by enlisting a group of local youths to seal the house off, covering it in tar and cement.  Ms Wypijewski is at pains to portray Mr Jennings and his cohorts as a thoroughly unheroic bunch.  Their lack of heroism is precisely what makes their odd little story seem urgent to her.  They stand for all the forgotten eccentrics who have, over the centuries, done odd, apparently pointless things that have made life a little bit more complicated for people in power, and have thereby helped to prepare the way for the great figures whose names we do remember.

Harry Browne asks “How Toxic is the Fog of Benevolence in Foundation Journalism?”  Mr Browne points out that, while many people express concerns about possible conflicts of interest when journalistic enterprises are parts of big businesses, very few express such concerns about journalism that is funded by philanthropic institutions.  Considering that philanthropic institutions are usually endowed and overseen by the very people who have the greatest influence over big businesses, this certainly is a strange state of affairs.  It is all the stranger in view of the fact that for-profit journalism must appeal to a broad public, while charity projects need only satisfy their funders.

Self-described “adventurer, chef, yogi, and army wife” Rachel Ortiz contributes “Faith: An Atheist Perspective.”  As a Jewish teenager in Texas, Ms Ortiz fell in with a group of very outgoing Southern Baptists.  Converting to their faith, she spent three years being happy at church and miserable at home before she started asking questions that the Southern Baptists couldn’t answer.  After a period away from church, the 16-year-old Ms Ortiz went back as an observer.   She was appalled to see everyone moving at the same times and speaking in the same ways during the service.  This seemed to her a sign of “brainwashing.” She writes:

I began to notice that when children “spoke in tongues,” it sounded remarkably similar to the way their parents sounded when they spoke in tongues.  I noticed that everyone simultaneously knew when to bow their heads, when to stand, when to sit, when to clap, when to say Amen!  It was in that moment that I knew to the very core of my being that I had been, and all of them were, brainwashed.

My reaction to this was a bit complicated.  Mrs Acilius and I pay regular visits to a couple of nearby Anglican and Lutheran churches.  There, everyone simultaneously knows when to bow their heads, when to stand, when to sit, when to kneel, when to say amen.  If that’s the result of brainwashing, it’s the least subtle brainwashing imaginable. When you go in the door, they give you a paper on which a full set of instructions is printed.  It isn’t subliminal recruiting, but superliminal recruiting.  So the picture Ms Ortiz painted did not immediately strike me as sinister.

On the other hand, most Sundays we can be found in a Quaker meetinghouse.  Mrs Acilius is a member of the meeting, and I am also active in it.  In traditional Quaker meetings, shared silence is communion and an explicit agenda is a sign of the secular.  The one we attend isn’t like that.  They have a bulletin with a list of Sunday morning Protestant stuff, including hymns, a sermon from the pastor, etc etc etc.  There are some moments which are not stuffed full of planned events, what Quakers call “Open Worship.”  In these moments we usually sit silently together, but occasionally someone feels compelled to speak.  These moments are usually too brief to be a meditative experience that quiets the mind.  Frankly, that’s part of the reason why we keep going back to the neighboring liturgical churches; a well-executed service there is a single experience, and has a clarifying effect similar to that which an hour of meditation in communal silence can provide.  By contrast, the brief interludes of silence in our very churchy Quaker meeting often represent interruptions in a little series of tasks that all concerned are busily keeping up with.  Even so, the meeting fits into what is often called the “Free Church” tradition of Protestantism, in which congregations value spontaneity and individualism.  Because of these values, Mrs Acilius’ fellow members grow uneasy when we remark on the amount of busy-ness that is packed into that hour.  Thinking of their reactions when we talk about how little spontaneity there is in the meeting, it is easy to understand how a Free Church Protestant could be shocked to see a group of worshipers behaving in the highly coordinated manner Ms Ortiz describes.

Deep in the brain

The March 2012 issue of The Atlantic included an article about brain parasites that breed in cats and spread to other creatures, possibly including humans, which then become unreasonably attracted to cats.  The article triggered vast amounts of comment around the web; I’ll just mention that it appeared at about the same time Gregory Cochran argued on his “West Hunter” blog that the likeliest biological basis for homosexuality is a brain parasite.  If this strikes you as an obnoxious point to make, you are well on your way to grasping the nature of Dr Cochran’s mission.

The late Christopher Hitchens often irritated me, though not in the way that Dr Cochran sets out to irritate people.  I read his column in The Nation for many years, and always wondered what percentage of their working day that magazine’s widely praised fact-checkers spent correcting his misstatements, exaggerations, and outright falsehoods.  A few always slipped through; my personal favorite was this, from his column of 22 October 2001:

There are others who mourn September 11 because it was on that day in 1683 that the hitherto unstoppable armies of Islam were defeated by a Polish general outside the gates of Vienna. The date marks the closest that proselytizing Islam ever came to making itself a superpower by military conquest. From then on, the Muslim civilization, which once had so much to teach the Christian West, went into a protracted eclipse. I cannot of course be certain, but I think it is highly probable that this is the date that certain antimodernist forces want us to remember as painfully as they do. And if I am right, then it’s not even facile or superficial to connect the recent aggression against American civil society with any current “human rights issue.”

I agree that it is foolish to regard the attacks of 11 September 2001 as an act of political protest, but that is not because Hitchens was right in his suspicion that their perpetrators chose the date 11 September from an obsession with the events of the seventeenth century.  A correction appeared in the following issue pointing out that the Ottoman forces actually suffered their defeat on 12 September 1683, not 11 September.  Hitchens, in his next column, dug his heels in and argued that because the battle began the previous day, he shouldn’t have to give up his point.  In defense of this apparently preposterous stance, he quoted a remark in which Hilaire Belloc put the battle on 11 September, then said that Belloc’s “awful ‘Crusader’ style is just the sort of thing to get him noticed by resentful Islamists.”

The same column in which Hitchens tried to salvage his theory that 9/11 was a reprisal for Hilaire Belloc’s prose style includes a quote from G. K. Chesterton.  Chesterton and Belloc were so closely associated that in their day they were often referred to as “Chesterbelloc.”  This issue of The Atlantic includes an essay by Hitchens about Chesterton, who was apparently one of his favorite authors.  I didn’t think of it in 2001, but it explains a great deal about Hitchens to think of him as a follower of Chesterton and Belloc.  Like those men, he was a prolific writer who prided himself on a fluent style, showed significant erudition in a wide range of fields, and did not particularly trouble himself about questions of fact.  Also like Chesterton and Belloc, he was an insistent and grossly unfair apologist for his religious ideas.  Chesterton and Belloc defended the Roman Catholic church by presenting every other faith tradition in an absurdly negative light; Hitchens simply added one item to their catalogue of strawmen when he set up shop as a professional atheist.  The essay in this issue raises the possibility that Hitchens imitated at least some aspects of Chesterton and Belloc’s work deliberately, as well as exhibiting an influence that stemmed from his early and long exposure to them.

Sandra Tsing Loh describes the difficulties she faces adjusting to the idea that her father, Eugene Loh, is in a long, terminal decline, and that she is his caregiver. The article’s hook is “Why caring for my aging father has me wishing he would die.”  I shouldn’t think that would require much explanation.  It is difficult to watch a loved one suffer irretrievable losses, stressful to take care of another person, and natural to resent unfamiliar responsibilities.

I suspect that everyone who has ever occupied Ms Tsing Loh’s current position has at least momentarily wondered how much nicer things would be if the other person would just hurry up and die already.  If Ms Tsing Loh had written a short story about a fictional character in her position who couldn’t shake that thought, she would have explored a facet of the human experience* that needs acknowledgement.  By choosing to forgo the distancing mechanism of fiction and write a first person account, complete with photographs of Mr Loh, she is performing an entirely different sort of speech act.  She is not only confessing to this wholly predictable, probably well-nigh universal human response; she is also confronting her father and everyone else who loves him with a demand that they discard pretenses that have become conventional because they often make life more comfortable for people in their situation.  That demand, if met, would create a new kind of social situation, one which would be “honest” in the sense that it leaves raw emotions unconcealed.  However, that very honesty is another form of role playing, in which the members of the group play roles that might be appropriate in a therapeutic setting, though not necessarily so in the setting of a family group that is supposed to survive for many generations.  To keep people together for that long under all the stresses that come with family life, it’s necessary to develop a shared understanding of boundaries and to define ways to renegotiate boundaries.  Without those understandings, it’s impossible to predict each other’s behavior, which means that it is impossible to communicate without leaving the impression that one is saying more than one intends.  If Mr Loh were to recover the ability to read, I can hardly imagine that he would not flinch when he realized that he was the theme of sentences like “if, while howling like a banshee, I tore my 91 year old father limb from limb with my own hands in the town square, I believe no jury of my peers would convict me.  Indeed, if they knew all the facts, I believe any group of sane, sensible individuals would actually roll up their shirtsleeves and pitch in.”  He might laugh, but I’m sure he would flinch.

*I’m familiar with the arguments against the phrase “the human experience”, and I still like to use it.  If you rehearse those arguments in the comments, be prepared to read long discussions of the thought of Irving Babbitt in response.

Unkept Republics

I named my online persona after Gaius Acilius, a man who flourished around 155 BC, in part because the history Acilius wrote of Rome seems to have reflected some of the concerns that would define what scholars like Quentin Skinner call the “Republican Tradition” in political thought.  Professor Skinner has labeled such thinkers as Hobbes, Machiavelli, and Thomas More “neo-Roman” because of their preoccupation with themes that Romans like Acilius developed.  For example, all of these thinkers ask how a person can be called free when that person is dependent on the favor of others, and all of them answer with various schemes for creating compartments of social life within which people can be independent.  A couple of years ago, I suggested in this space that a way of developing this idea in a highly bureaucratized world like that of the twenty-first century might be to articulate three conceptions of liberty in tandem with each other, as freedom from bureaucracy, freedom within bureaucracy, and freedom as a product of bureaucracy.  I called this suggestion “The Three Freedoms.”  So far as I can see, it is an idea which has had no influence on anyone.  I shouldn’t be surprised; I haven’t been trying very hard to draw anyone’s attention to it.  Gaius Acilius would probably be disappointed in me.

What brings all this to mind is a piece in the current issue of The Nation magazine, in which Yascha Mounk reviews Maurizio Viroli’s The Liberty of Servants: Berlusconi’s Italy.  According to Mr Mounk, Professor Viroli accounts for Silvio Berlusconi’s long tenure at the forefront of affairs in Italy by arguing that “Berlusconi was able to stay in power because he transformed Italy from a republic into a kind of royal court.”  Not simply a monarchy, but a court.  Mr Mounk explains Professor Viroli’s terminology thus:

For him, a court system, far from being defined by the traditional trappings of royalty, is any arrangement of power whereby “one man is placed above and at the center of a relatively large number of individuals—his courtiers—who depend on him to gain and preserve wealth, status, and reputation.” Viroli calls the person at the center of the court system the signore. Even if it weren’t for the uncanny association with the droit du seigneur, it is clear why the label fits Berlusconi. Viroli is hardly exaggerating when he states that over the past few decades, “all of Italy’s political life has rotated around Silvio Berlusconi: all eyes turn to him, all thoughts, hopes, and fears.” He quickly became such a polarizing figure that the gulf between Italy’s left and right, which had been huge and vicious during much of Italy’s postwar history, has shrunk. What mattered most for Italians during his reign was whether one was for or against Berlusconi. In the summer of 2010, for example, several politicians on the left were prepared to fawn over Gianfranco Fini, a longtime fascist with center-right views, simply because he had broken with Berlusconi and spoken in public about his opposition to the prime minister.

Berlusconi not only made himself the Sun King of Italian politics; he acted like a Mafia don. At his word, pretty teenage girls became TV presenters, TV presenters ascended to the rank of government ministers and government ministers were offered lucrative jobs in various industries once they left office.

Mr Mounk goes on to explain the relationship between Professor Viroli’s views and those of the school associated with Professor Skinner:

For Viroli, Berlusconiland was more than a corrupt court. Drawing on republicanism, a long-neglected tradition of political thought that has recently been revived by intellectual historians and political theorists like John Pocock, Quentin Skinner and Philip Pettit, Viroli argues that Berlusconi’s corrosive influence has deprived Italians of their liberty. On Viroli’s account, philosophers who stand in the liberal tradition worry only about actual interference with a person’s actions. “A Free-Man,” wrote Thomas Hobbes with his characteristic crispness, “is he that, in those things, which by his strength and wit he is able to do, is not hindered to doe what he has a will to.” The subjects of a benevolent despot remain perfectly free so long as he does not inhibit their actions. Viroli argues that according to such a liberal conception of freedom, Berlusconi’s Italy remained a free country: “If we can rightly point to violations of liberty only in cases where fundamental civil and political rights are suppressed by force, then we Italians are, generally speaking, a free people.”

Yet for Viroli, the liberal definition of freedom, with its exclusive emphasis on freedom from interference, is too anemic. He worries that a ruler with vast, arbitrary power would have a chilling effect on the freedoms of his subjects even if he never chose to exercise his power. To emphasize this point, republicans such as Viroli like to cite the example of Tranio, the protagonist of a comedy by the Roman playwright Plautus. Tranio is a slave. But because his master is often absent, and because he is so wily, no one ever interferes with his actions. As long as he continues to flatter and manipulate his master, he is free to do as he pleases. And yet, the republicans point out, a slave is surely the very opposite of a “free man.”

While slavery is now officially banned throughout the world, Viroli argues that the most salient characteristic of slavery—the relation of domination and dependence between master and slave—persists in a milder form in our societies. “Citizens who can be tossed into prison arbitrarily by the police,” for example, stand in just such a relation of dependence to an oppressive, dominating power. Even if, for now, they nominally remain at liberty, they lack real freedom. In the case of Italy, though Berlusconi never used his vast power to interfere with the lives of Italian citizens, they knew that he could, at any moment, choose to do so. This lack of real freedom, Viroli argues, limited the things Italians dared to do as well as the words they dared to say.

Mr Mounk suspects that Professor Viroli’s model takes him at once too far and not far enough in his assessment of the damage that Mr Berlusconi did to Italy:

Viroli’s account of the theory of republican liberty is attractive, but his argument that Italians were, in his own sense, unfree is not convincing. Some Italians did find themselves in a true position of dependence on Berlusconi’s whims. Journalists at the networks and newspapers he controlled knew that one honest sentence could make the difference between a lucrative job and the dole. In a country where even many junior positions in business, government and academia have long been reserved for insiders and their children, many young people knew that their career prospects depended as much on their willingness to flatter Berlusconi or his cronies as on their ability to get the job done.

Nevertheless, even on a republican conception of liberty, most Italians remained free during Berlusconi’s rule. The reason is not just that Berlusconi never chose to interfere with the lives of his adversaries by, say, throwing a member of the opposition in jail for a rude op-ed; it’s that Italians knew perfectly well that Berlusconi had no more power to do such a thing than does Barack Obama. The price that opponents of Berlusconi were afraid of paying was not, as Viroli thinks, that Berlusconi might decide to interfere in their lives in an arbitrary manner but rather that he would choose not to tempt them with favors. For all the signore’s power and influence, ordinary Italians hardly lived in fear of his wrath.

One wonders exactly when these paragraphs were written; on 31 December 2011, Barack Obama signed into law a bill which grants him the power to throw anyone in jail on any grounds whatever.  So he is a rather poorly chosen example of an official with limited power to interfere with the lives of his adversaries.  Nonetheless, no such law seems to be on the books in Italy, and no Italian leader since Mussolini has behaved as if one did.

As Mr Mounk thinks that Professor Viroli’s model drives him too far when it implies that Italians have been reduced to slavery, so he claims that it prevents him from going far enough in his analysis of aspects of the Berlusconi regime that liberalism also indicts:

The weakness of Viroli’s central assumption, that only the language of liberty can adequately express the horrors of Berlusconi’s rule, may explain why his account of Berlusconiland is not fully persuasive. Other critics of Berlusconi have written damning accounts of his reign, but instead of going so far as to claim that Berlusconi made Italians unfree, they have demonstrated that his government violated the equal treatment of citizens before the law, neglected the government’s duties to further the economic interests of its citizens and condoned corruption (failings that liberals as well as republicans condemn). In The Sack of Rome (2006), for example, Alexander Stille explains that Berlusconi’s business empire was, from its first days, built on political favors and rent-seeking. A true modernization of Italy’s economy would have given his companies unwanted competition and deprived them of crucial state subsidies. Berlusconi chose instead to preserve arcane rules and bureaucratic roadblocks, or even to create new ones, to protect his business interests. He sacrificed the country’s economic well-being for his own.

Berlusconi’s influence on the judicial system was equally disastrous. Whereas in many countries the statute of limitations cannot expire after a defendant has been indicted, in Italy defendants go free if the highest court of appeals has not upheld their convictions within the allotted time. Knowing this, Berlusconi’s attorneys, whom, in a rare instance of efficiency, he made members of Parliament, shortened the statute of limitations for the most troublesome white-collar crimes and devised rules to strengthen legal tactics for delaying trials. This change had the desired effect of aiding Berlusconi’s defense in his trials for false accounting and embezzlement. It also had the unintended effect of making it more difficult to jail members of the Mafia.

Even with these strictures, Mr Mounk’s final assessment of Professor Viroli’s book is strongly favorable:

Stille and others have described the disastrous economic and legal fallout of Berlusconi’s rule in much greater detail than Viroli. But Viroli, in his own way, paints an even more memorable portrait of Italy’s new ruling class. His description of Berlusconi as a signore is on the money. And while the servility of Berlusconi’s hangers-on may have been self-imposed, it still raises the central paradox of Berlusconiland. Absolute monarchs are able to cow their courtiers into submission by wielding the implicit threat of pain, imprisonment or execution. Berlusconi never had such tyrannical powers. Even so, his underlings acted as if they were mere courtiers—apparently, the hope of getting rich was quite enough to keep them in line. This makes the Italian case all the more relevant at a time when the superrich and their political enablers seek to wield ever more influence over democracies in a climate of austerity. It seems that to achieve their purposes, our would-be masters need not impede our rights or liberties: the promise of a farthing of their vast riches might be quite enough to turn many of us into docile servants.

Elsewhere in the issue, David Sarasohn contributes a piece with the resoundingly neo-Roman title “The Treason of the Senate,” in which he looks back to a series of essays published in 1906 and concludes that all the forms of corruption that marked the US Senate in the Gilded Age have reemerged and been joined by new evils.  Sarah Wildman’s “Israel’s New Left Goes Online” promotes a webzine called +972, which presents itself above all else as independent of the ideological and institutional constraints characteristic of the Israel/Palestine conflict.  Someone like old Gaius Acilius would certainly have been alarmed at a process that empowers extremist minorities and reduces citizens to dependence on increasingly professionalized security forces, so he likely would have understood +972’s goals, whatever conclusion he might ultimately have reached regarding their politics.  Chris Savage writes of “The Scandal of Michigan’s Emergency Managers,” officials appointed by that state’s governor to replace elected municipal governments of which he disapproves.  I think that someone in the republican tradition would say that the true scandal of this system is that there is no citizenry jealous of its rights that rises up in revolt when the governor pulls this stunt.  That same governor, incidentally, is the topic of Patricia J. Williams’ column in this issue; though he is a member of something called the Republican Party, he could hardly be called an heir of the republican tradition.

I’ll mention just one other piece, a review essay by Paula Findlen called “Galileo’s Credo.”  At various points in the development of the republican tradition, Galileo has been a powerful symbol of the autonomous individual maintaining his honor by refusing to knuckle under to the overweening power of a court.  Professor Findlen notes that the young Galileo and his friends laughed at literal-minded neo-Romans who favored Latin over the vernacular and went about wearing togas.  Yet in his resistance to the demands of the Vatican, surely Galileo lived as the stubbornly independent noblemen of the old Res Publica would have recommended.

The American Conservative, March 2012

The table of contents of the March issue of The American Conservative seems to have a problem.  I haven’t seen the print edition yet, but the page numbers in the online edition’s table of contents don’t match the page numbers in the magazine.  There was a similar, though smaller-scale, problem with last month’s issue.

In the cover story, Peter Hitchens argues that, while the snarling rage Margaret Thatcher continues to evoke in her opponents does go to show that she was a figure of great historical consequence, conservatives are quite wrong to adopt her as a model of political success.  Rather, her true significance is a tragic one, embodying the final collapse of a social ideal and of an approach to governance.  The reverence Lady Thatcher continues to enjoy on the Right in both the UK and the United States suggests to Mr Hitchens that her partisans in those countries have not come to terms with this collapse, and that their ability to formulate and direct national policy is handicapped by their attachment to these outworn notions.

Rod Dreher, the original “crunchy con,” takes a more optimistic view of another eminent Briton.  He gives a glowing writeup to Prince Charles, of all people.  Evidently Mr Dreher sees in His Royal Highness the prophet of a “revolutionary anti-modernism.”  I suppose it is a sign of my shortcomings that I can never keep an entirely straight face when the topic of the British Royal Family comes up; since I am not British, it would certainly be inappropriate of me to say that grown-up countries don’t have kings and queens.  But I will say that my favorite aspect of the British monarchy has always been the expectation that the various princes and princesses would keep their opinions to themselves.

Gary Johnson, who from 1995 to 2003 represented the Republican Party as governor of the state of New Mexico, has left that party and declared his candidacy for president as a member of the Libertarian Party.  W. James Antle gives sympathetic attention to the freedom-loving Mr Johnson and his quixotic campaign.  Mr Johnson and his fellow Libertarians oppose many things which I think are eminently worth opposing.  If they were the only ones speaking out against the crony capitalism, the wars of aggression, and the burgeoning police state that the Democrats and Republicans have combined to foist upon the USA, I would certainly vote for them.  Fortunately, however, former Salt Lake City mayor Rocky Anderson is running for president as a left-of-center candidate.  Mr Anderson stands against all the evils that the Libertarians would fight, and at the same time supports measures to ensure fair play for all and to restrain the excesses of the market.  Mr Anderson may not have much to offer the authors and editors of something called “The American Conservative,” but most of them are just as much opposed to Libertarianism as they are to the 1980s-style liberalism that Mr Anderson represents.

Our favorite Eve Tushnet returns to the magazine with an argument to the effect that the fear of divorce has spawned a social movement that has, paradoxically, weakened marriage in the USA.  Here’s one paragraph that’s too good not to quote:

Possibly in response to divorce scripts like “We just fell out of love,” or “It just happened,” which emphasize powerlessness, the contemporary delayed-marriage script attempts to crack the code, figure out the formula, and do it right.  The fact that marriage, like parenting, is mostly about acceptance, forgiveness, and flexibility in the face of change and trauma gets suppressed.

It’s hard to believe that a celibate like Ms Tushnet wrote such an insightful remark about the nature of marriage.   On the other hand, I don’t suppose Pythagoras was a triangle, and he came up with something useful to say about them.  Be that as it may, there’s some more great stuff in Ms Tushnet’s article.  For example:

A culture of love can’t be built on a foundation of rejection.  The path forward doesn’t include further stigmatizing divorce, or bringing back stigma against unmarried childbearing… What young people need is hope: a sense that marriages can last, not because the spouses were smart enough on the front end but because they were gentle and flexible enough in the long years after the wedding.

Samuel Goldman undertakes to explain “what sets conservatives apart from authoritarians and fascists,” a task prompted by a recent book that lumped together many writers who were in one way or another connected to the word “conservative” (in some cases by their own adoption of that label as a description of their ideological stands, in other cases by their affiliation with a political party with the word “Conservative” in its name, and in still other cases only by the fact that some self-described conservatives have spoken highly of them) and declared them all to be enemies of freedom.  Why so unimpressive a work should occasion an essay by anyone of Mr Goldman’s talent may seem mysterious, but the mystery lessens when one realizes that the author of the book actually occupies a chair of political philosophy at a well-known university.  When it first appeared, some critics noticed the author’s credentials and wondered if it was a parody of crude efforts by right-wingers to smear the word “liberalism” with tar from an equally injudicious brush, but that individual has insisted that he regards his production as a genuine contribution to scholarship.

Mr Goldman’s little essay is remarkable for the courtesy and patience which it shows towards this book and its author.  Not for Mr Goldman such words as “charlatan,” “impostor,” or “fraud.”  Nor does he engage even in subtle and urbane ridicule of his subject.  Instead, he takes it as an occasion for a concise exposition of major themes in the works of Edmund Burke and Joseph de Maistre.  Mr Goldman’s even temper, as much as his demonstration of the absurdity of the book’s characterization of those thinkers, exposes the depths of its author’s corruption far more effectively than could the most blistering polemic.

Atheism is no excuse for skipping church

In a recent review of Alain de Botton’s Religion for Atheists: A Non-Believer’s Guide to the Uses of Religion, John Gray writes:

Rarely mentioned in the debates of recent years is that atheism has been linked with all kinds of positions in ethics, politics and philosophy. More particularly, there is no necessary connection – either as a matter of logic or in the longer history of atheist thinking – between atheism and the rejection of religion.

Atheist thinkers have rejected and at times supported religion for many different reasons. The 19th-century anarchist Max Stirner rejected religion as a fetter on individual self-assertion. Bakunin, Marx and Lenin rejected it because it obstructed socialist solidarity, while Nietzsche hated religion (specifically, Christianity) because he believed that it had led to ideologies of solidarity such as socialism. Auguste Comte, an atheist and virulent anti-liberal, attempted to create a new church of humanity based on science.

In contrast, the French atheist and proto-fascist Charles Maurras, an admirer of both Comte and Nietzsche, was an impassioned defender of the Catholic Church. John Stuart Mill – not exactly an atheist but not far off – tried to fuse Comte’s new religion with liberalism. In marrying atheism with very different ethical and political positions, none of these thinkers was confused or inconsistent. Atheism can go with practically anything, since in itself it amounts to very little.

Certainly a dictionary definition such as “the doctrine that there are no gods” amounts to very little.  Professor Gray champions such a definition:  “Rightly understood, atheism is a purely negative position: an atheist is anyone who has no use for the doctrines and concepts of theism.”  For my part, I am reflexively skeptical of any very simple, purely abstract definition of an ideological label.  I doubt that anyone adopts such a label as a self-description or responds powerfully to it as a description of a participant in a debate unless it suggests a rather substantial narrative.   “Atheist” is a label that millions of people wear with fierce pride, and that raises equally fierce anger and fear in hundreds of millions of others.  The strength of those reactions proves that the word has connotations for these people that go far beyond the tidy little abstractions of the dictionary, and their predictability shows that these connotations are much the same from person to person.   Therefore, I am not convinced that anyone anywhere is an atheist simply in the dictionary sense of the word.  There are people who reject particular religious beliefs that involve the existence of gods, and there are people who accept particular beliefs that exclude the existence of gods.  The key thing about each of these people is their relationship to those particular beliefs, to the people they know who espouse those beliefs, and to the institutions in their social worlds that are associated with those beliefs.  A label such as “atheist,” in the dictionary sense, would sort a pious Confucian, an orthodox Communist, and a militant freethinker together.  Certainly no category that includes three such disparate people could be a very important part of our understanding of the world.

As I am skeptical of the dictionary version of the word “atheism,” so too am I skeptical of the word “theism.”  The Oxford English Dictionary gives four definitions for “theism.”  (Not counting another, unrelated, word spelled the same way, which means “illness as the result of drinking tea.”)  These definitions are: “belief in a deity or deities; as opposed to atheism”; “belief in one god, as opposed to polytheism or pantheism”; “belief in the existence of god, with denial of revelation”; “belief in the existence of god, without denial of revelation.”  In the first of these senses, the word appears to be a back formation created by taking the prefix off of “atheism.”  The word is obsolete in the second sense, having been replaced by “monotheism.”  The third sense has been replaced by “deism”; where deism is a live option, its opponents still use the word “theism” to describe themselves.  In view of the word’s history, then, it would be as true to say that “theism” names a “purely negative position” as it is to say that “atheism” names a “purely negative position.”  A theist is someone who rejects the labels “atheist” and “deist” and will not play the social roles that come with those labels.

Again, no one does only this.  Those who call themselves “theists” are adherents of particular religions.  Surely, no one believes in “a personal god”; billions of people believe in the God their favorite preacher describes.  Mere theism is as unreal as C. S. Lewis’ “Mere Christianity.”  Indeed, the labels that name world religions cover so many people and so many cultures of faith that anyone can see the point the late Edward Said made when he proposed scrapping the term “Islam” on the grounds that such a word “imputes a unified and monolithic religious and cultural system” to what is in fact an infinitely diverse range of experiences lived by over a billion people scattered all over the globe.  How much worse then is a label that encompasses not only that range, but also the ranges of experience grouped under “Christianity,” “Judaism,” “Sikhism,” “Hinduism,” etc.

Professor Gray does recover a bit as the review goes on.  So:

Most people think that atheists are bound to reject religion because religion and atheism consist of incompatible beliefs. De Botton accepts this assumption throughout his argument, which amounts to the claim that religion is humanly valuable even if religious beliefs are untrue. He shows how much in our way of life comes from and still depends on religion – communities, education, art and architecture and certain kinds of kindness, among other things. I would add the practice of toleration, the origins of which lie in dissenting religion, and sceptical doubt, which very often coexists with faith.

Today’s atheists will insist that these goods can be achieved without religion. In many instances this may be so but it is a question that cannot be answered by fulminating about religion as if it were intrinsically evil. Religion has caused a lot of harm but so has science. Practically everything of value in human life can be harmful. To insist that religion is peculiarly malignant is fanaticism, or mere stupidity.

De Botton has done us a service by showing why atheists should be friendly to religion. Where he could have dug deeper is the tangled relations between religion and belief. If you ask people in modern western societies whether they are religious, they tend to answer by telling you what they believe (or don’t believe). When you examine religion as a universal human phenomenon, however, its connections with belief are far more tenuous.

The fixation on belief is most prominent in western Christianity, where it results mainly from the distorting influence of Greek philosophy. Continuing this obsession, modern atheists have created an evangelical cult of unbelief. Yet the core of most of the world’s religions has always been holding to a way of life rather than subscribing to a list of doctrines. In Eastern Orthodoxy and some currents of Hinduism and Buddhism, there are highly developed traditions that deny that spiritual realities can be expressed in terms of beliefs at all. Though not often recognised, there are parallels between this sort of negative theology and a rigorous version of atheism.

A couple of years ago, we noticed James P. Carse’s The Religious Case Against Belief, a book which argues not only that its beliefs are not the things which make a religious tradition most valuable, but that an excessive emphasis on beliefs is the surest way to drain a religious tradition of its value.  Professor Gray seems to be approaching Professor Carse’s views here.  He goes on to write paragraphs that will make any admirer of Irving Babbitt wince:

The present clamour against religion comes from confusing atheism with humanism, which in its modern forms is an offshoot of Christianity.

Unfortunately, de Botton falls into this confusion when he endorses Comte’s scheme for a humanist church. “Regrettably,” he writes, “Comte’s unusual, complex, sometimes deranged but always thought-provoking project was derailed by practical obstacles.” It is true that in accepting the need for religion Comte was more reasonable than the current breed of atheists. But it is one thing to point out why atheists should be friendly to religion and another to propose that a new religion should be invented for atheists.

The church of humanity is a prototypical modern example of atheism turned into a cult of collective self-worship. If this ersatz faith came to nothing, it was not because of practical difficulties. Religions are human creations. When they are consciously designed to be useful, they are normally short-lived. The ones that survive are those that have evolved to serve enduring human needs – especially the need for self-transcendence. That is why we can be sure the world’s traditional religions will be alive and well when evangelical atheism is dead and long forgotten.

I mention Irving Babbitt because of the episode that briefly made him a celebrity.  In 1930, Babbitt was 65 years old, and had for over 30 years taught French and Comparative Literature at Harvard University.  In those decades, he and his friend Paul Elmer More had assembled a school of learned followers who labeled themselves “the New Humanists.”  1930 was the year the New Humanists chose to make their debut as a movement.  A book featuring essays by Babbitt, More, and many of their followers (including Babbitt’s pupil T. S. Eliot) appeared under the title Humanism and America: Essays on the Outlook of Modern Civilization; Babbitt himself gave a lecture at Carnegie Hall, drawing an audience of 3000.  Much to the dismay of Babbitt and company, a circle around philosopher John Dewey also chose 1930 to launch a project under the name “the New Humanism.”  While Babbitt traced the criticism that he and his school practiced back to Erasmus and the other Christian humanists of the Renaissance and claimed that it offered a way even for irreligious people such as himself to recognize the value of religion, the Deweyans were hostile to traditional religion and favored views quite similar to those Professor Gray describes above.  The extent of the Deweyans’ triumph in the battle for the word “humanist” can be measured not only by remarks like Professor Gray’s but also by the prosperity of the American Humanist Association, which had its origins in the Dewey group’s 1930 activities and which stands today as the USA’s foremost institutional champion of atheism.  Needless to say, the American Humanist Association’s successive “Humanist Manifestoes” make no reference to Babbitt and More, and certainly take no notice of Erasmus or any other Christian humanists.

Babbitt’s “humanism” suffered from many weaknesses, not least the fact that it was at least as sweeping a collection of diverse beliefs and experiences as would be sorted under the label “theism.”  Indeed, at the height of the “Humanist” controversy Paul Shorey slashed away at the New Humanists precisely because they made the term “humanism” bear an impossible burden.  Even as the dictionary versions of “theism” and “atheism” elide the whole world of religious experience, so too Babbitt’s conflation of all the sages, philosophers, and prophets of the past is, in Shorey’s words, “exposed to misunderstandings and misapplications, and Professor Babbitt wishes to deduce from it precisely his own ideals in religion, ethics, culture, philosophy, politics, and education.”  By contrast, Shorey declared himself  “content to take the word in a loose, fluid, literary way and in the traditional Renaissance sense of devotion to the Greek and Latin classics and to the cultural and ethical ideals that naturally result from an educational system in which they hold a considerable place.”  Babbitt would likely have claimed that he and his school used the word in the same way, but that they, unlike Shorey, had thought through the question of what “cultural and ethical ideals” can be expected to “naturally result” from various educational systems in which the Greek and Latin classics hold various places that might be called considerable.  In other words, what Shorey was doing with the word “humanism” may be very much like what Professor Gray is doing by invoking the dictionary definition of “atheism.”  In each case, the critic is trying to avoid a controversy by associating himself with a version of a word that is artificially drained of its connotations and narrative content and confined to a purely formal significance.  In each case, however, the word has associations that cannot be suppressed.  By trying to hide those associations behind the dictionary, the critic puts himself in a weak position.  If Shorey wished to escape from Babbitt’s attempt to overstuff the word “humanism” with all the wisdom in the world and to ground in it all of his preferred ideas, he would have been better advised to consider the particular uses of the word as evidenced by identifiable people in specific situations than to express a preference for a use of the word that differs from Babbitt’s chiefly in its greater vagueness.

Philosopher that he is, Professor Gray was never likely to declare that a term and the prejudices it expresses are best left unexamined.  His refuge in the dictionary, however, leaves him in a very awkward position.  For example:

“Religion,” writes Alain de Botton, “is above all a symbol of what exceeds us and an education in the advantages of recognising our paltriness.” It is a thought reminiscent of Blaise Pascal. One of the creators of modern probability theory, the 17th-century thinker invented an early calculating machine, the Pascaline, along with a version of the syringe and a hydraulic press. He made major contributions to geometry and helped shape the future development of mathematics. He also designed the first urban mass transit system.

Pascal was one of the founders of the modern world. Yet the author of the Pensées – an apology for Christianity begun after his conversion to Catholicism – was also convinced of the paltriness of the human mind. By any standards a scientific genius and one of the most intelligent human beings that may ever have lived, Pascal never supposed that humankind’s problems could be solved if only people were smarter.

The paradox of an immensely powerful mind mistrusting the intellect is not new. Pascal needed intellectual humility because he had so many reasons to be proud of his intelligence. It is only the illiteracy of the current generation of atheists that leads them to think religious practitioners must be stupid or thoughtless. Were Augustine, Maimonides and al-Ghazali – to mention only religious thinkers in monotheist traditions – lacking in intellectual vitality? The question is absurd but the fact it can be asked at all might be thought to pose a difficulty for de Botton. His spirited and refreshingly humane book aims to show that religion serves needs that an entirely secular life cannot satisfy. He will not persuade those for whom atheism is a militant creed. Such people are best left with their certainties, however childish.

I would be the last to deny that Pascal was a great mind, but neither would I say that atheism, even of the militant variety, has confined its appeal to people who can be dismissed as “best left with their certainties, however childish.”  As Professor Gray says, a bare denial of the existence of gods, considered in the abstract, doesn’t “amount to much.”  Yet there is something in the label “atheist” and the roles that atheists play in society that has a powerful attraction even to people who could have matched wits with Pascal.  Like Paul Shorey before him, Professor Gray has not followed his own lead.  As he is willing to break the “fixation on belief” in discussing religion, so too should he break the same fixation when discussing irreligion.

The Rodney King Era

The February 2012 issue of The American Conservative includes several pieces that reflect, directly or indirectly, on the presidential campaign currently underway in the USA, and a couple that have a broader interest.

The American Conservative started in 2002 as a forum for right-wingers who did not want the US to invade Iraq.  It continues to give voice to conservative anti-militarism.  Several items in this issue further develop right-wing arguments against warfare, among them: Doug Bandow’s “Attack of the Pork Hawks” (subtitle: “Loving the Pentagon turns conservatives into big-spending liberals”); William S. Lind’s “Clearing the Air Force,” which argues that the only useful functions of the United States Air Force are those that support operations led by the Army and Navy, and therefore that those functions should be transferred to those services while the independent Air Force is dissolved; and Kelly Beaucar Vlahos’ “Gitmo’s Prying Eyes,” about the Defense Department’s attempt to erase attorney-client privilege for the “unlawful combatants” it holds at Guantánamo Bay and elsewhere.  Noah Millman’s review of Gershom Gorenberg’s The Unmaking of Israel identifies Mr Gorenberg not by his usual sobriquet of “left-wing Zionist,” but as a “Jewish nationalist” who accepts a deeply conservative conception of nationhood as the maturity of a people, and who opposes Israeli occupation of the Palestinian territories because that occupation reduces Israel from achieved nation-state to insurgent revolutionary movement.

The cover story, Scott McConnell’s “Ron Paul and his Enemies,” notes that Dr Paul’s campaign has inspired levels of alarm and anger from various elite groups in official Washington far out of proportion to the modest levels of support the good doctor has attracted.  Mr McConnell’s explanation of this is that those bêtes-noires of The American Conservative, the “neocons,” fear that Dr Paul will trigger a movement that will threaten the prestige they enjoy in policy-making circles in the American government.  The neocons are the neo-conservatives, adherents of an intellectual movement that traces its origins to the anti-Stalinist Left of the 1930s and 1940s and its rise to political salience in the work of a group of activists, academics, and functionaries who attached themselves to Senator Henry M. Jackson in the 1960s and 1970s.  Like the late Senator Jackson, the neo-conservatives are generally sanguine about the ability of the US government to do good by means of large-scale programs intervening in the domestic affairs both of the United States itself and of other countries.  The group around The American Conservative consists of old-fashioned conservatives and libertarians who are deeply skeptical of Washington’s potential as a doer of good in any sphere.  Mr McConnell’s argument, summed up in his piece’s subtitle – “An effective antiwar candidate is what the neocons fear most” – is that, even though neoconservatives now hold such a stranglehold on respectability in foreign policy discussions in official Washington that the manifest failure of their signature project, the invasion and occupation of Iraq, could not weaken it, they know that this stranglehold is in fact very tenuous.  The mobilization of a powerful antiwar constituency within the Republican Party could send the neocons to the sidelines very quickly, he believes.  Therefore, they must move quickly to silence Dr Paul, lest the 29% of Republicans who tell pollsters that they share his antiwar views should crystallize into a force that could shift the national discussion away from the presuppositions of militarism.

One stick with which neoconservative spokesmen and others have beaten Dr Paul is a series of racially charged columns that appeared in newsletters he edited in the early 1990s.  Mr McConnell discusses the controversy over these columns thus:

Here the reprise of the story of the newsletters published under Ron Paul’s name 20 years ago proved critical. The New Republic had made a national story of them early in the 2008 campaign. James Kirchick reported that numerous issues of the “Ron Paul Political Report” and the “Ron Paul Survival Report” contained passages that could be fairly characterized as race-baiting or paranoid conspiracy-mongering. (Few in Texas had cared very much when one of Paul’s congressional opponents tried to make an issue of the newsletters in 1996.) With Paul rising in the polls, the Weekly Standard essentially republished Kirchick’s 2008 piece.

I’ve seen no serious challenge to the reporting done four years ago by David Weigel and Julian Sanchez for Reason: the newsletters were the project of the late Murray Rothbard and Paul’s longtime aide Lew Rockwell, who has denied authorship.* Rothbard, who died in 1995, was a brilliant libertarian author and activist, William F. Buckley’s tutor for the economics passages of Up From Liberalism, and a man who pursued a lifelong mission to spread libertarian ideas beyond a quirky quadrant of the intelligentsia. He had led libertarian overtures to the New Left in the 1960s. In 1990, he argued for outreach to the redneck right, and the Ron Paul newsletters became the chosen vehicle. For his part, Rockwell has moved on from this kind of thing.

Intellectual honesty requires acknowledging that much of the racism in the newsletters would have appeared less over the top in mainstream conservative circles at the time than it does now. No one at the New York Post editorial page (where I worked) would have been offended by the newsletters’ use of welfare stereotypes to mock the Los Angeles rioters, or by their taking note that a gang of black teenagers were sticking white women with needles or pins in the streets of Manhattan. (Contrary to the fears of the time, the pins used in these assaults were not HIV-infected.) But racial tensions and fissures in the early 1990s were far more raw than today. The Rockwell-Rothbard team were, in effect, trying to play Lee Atwater for the libertarians. A generation later, their efforts look pretty ugly.

The resurfacing of the newsletter story in December froze Paul’s upward movement in the polls. For the critical week before the Iowa caucuses, no Ron Paul national TV interview was complete without newsletter questions, deemed more important than the candidate’s opposition to indefinite detention, the Fed, or a new war in Iran. On stage in the New Hampshire debate, Paul forcefully disavowed writing the newsletters or agreeing with their sentiments, as he had on dozens of prior occasions, and changed the subject to a spirited denunciation of the drug laws for their implicit racism. This of course did not explain the newsletters, but the response rang true on an emotional level, if only because no one who had observed Ron Paul in public life over the past 15 years could perceive him as any kind of racist.

If the Weekly Standard editors hoped the flap would stir an anti-Paul storm in the black community, they were sorely disappointed. In one telling Bloggingheads.tv dialogue, two important black intellectuals, Glenn Loury and John McWhorter, showed far more interest in Paul’s foreign-policy ideas, and the attempts to stamp them out, than they did in the old documents. Atlantic blogger Ta-Nehisi Coates likened Paul to Louis Farrakhan. He didn’t mean it as a compliment, but the portrait fell well short of total scorn. It was difficult to ignore that the main promoters of the newsletters story, The New Republic and the Weekly Standard, had historically devoted exponentially more energy to promoting neoconservative policies in the Middle East than they had to chastising politicians for racism.

In 2008, Mr McConnell, then The American Conservative’s editor, had responded to Mr Kirchick’s original piece with stern reproof for Dr Paul.  The magazine then endorsed Dr Paul for president anyway, though Mr McConnell himself would later express his preference for Barack Obama.  In the paragraphs above, Mr McConnell seems to be straining rather hard to downplay the newsletter matter.  For one thing, while Glenn Loury and John McWhorter are by anyone’s standards “important black intellectuals,” each of them is rather conservative and neither of them could be accused of having a low tolerance for white-guy B.S. – rather the opposite, in fact.  It is true that the early 1990s were a time of unusually raw tension between whites and African Americans; indeed, the late 1980s and early 1990s were an extremely strange period in American history, as Dr Paul’s 1988 appearance on The Morton Downey, Jr. Show should suffice to demonstrate.  But this does not excuse Dr Paul’s pandering to the racialist right in those years.  Rather, it makes it all the more culpable.  In 1991, many parts of the USA, from Crown Heights in New York City to South Central Los Angeles, were teetering on the brink of race riots.  In that year, a majority of white voters in Louisiana pulled the lever in support of the gubernatorial campaign of Neo-Nazi David Duke.  To peddle racially charged rhetoric at that time was, if anything, more irresponsible, because more dangerous, than it would be today.

An editorial in the same issue discusses Dr Paul from a slightly different perspective.  In a single page, it dismisses the newsletters twice, once as “artifacts of a time – the Andrew Dice Clay era in American politics, when the populist right reacted to political correctness – then a new phenomenon – by sinning in the opposite direction”; then with this line: “The Rodney King era is a distant memory; the wars and economic outrages of our bipartisan establishment are still very much with us.”  If these dismissals leave you unsatisfied, there is still a refuge for you on The American Conservative’s webpage, where blogger Rod Dreher has repeatedly expressed his objections to Dr Paul’s newsletters in very strong terms (see here for one of the strongest of these objections.)

No discussion of “the Rodney King era” would be complete without a reference to The Bell Curve, in which psychologist Richard Herrnstein and political scientist Charles Murray argued that American society was becoming more stratified by cognitive ability, that cognitive ability is largely inherited, and therefore that America’s class system will likely become more unequal and less fluid as the highly intelligent pull ever further away from the rest of us.  Four chapters of the book dealt with race, analyzing the average IQ scores of various ethnic groups and concluding that African Americans as a group are likely to be among the hardest hit by the adverse consequences of this trend.  Professor Herrnstein and Mr Murray offered chillingly few suggestions as to how this grim scenario could be prevented or ameliorated; Mr Murray’s right-of-center libertarianism led him always to emphasize the ways in which social programs intended to broaden opportunity sometimes redound to the disadvantage of their intended beneficiaries, an emphasis which, in conjunction with the book’s overall argument, seemed to suggest that there is no escape from the most dystopian version of its predictions.  Published in 1994, The Bell Curve rose to the top of the bestseller lists and garnered enormous attention; today, it would be difficult to imagine a major publisher agreeing to release it.  The nativist theory of IQ which is at its heart, and particularly the explicit development of that theory’s implications in the four chapters on race, makes it such an easy target for anti-racist spokesmen that a publisher who released it nowadays would be risking public infamy.  Yet in those days, The Bell Curve hardly represented the far edge even of acceptable public discourse.  So the far more aggressively anti-black Paved With Good Intentions, by Jared Taylor (a self-styled “white nationalist”) found a major publisher and considerable sales when it was published in 1992; his recent followups to that book have been self-published.

Mr Murray has returned to the scene with a new book, Coming Apart: The State of White America, 1960-2010.  By focusing exclusively on whites, Mr Murray need not dwell explicitly on racial differences in average IQ score or any theory as to what causes these differences; by setting 2010 as an ending date, he need not dwell on its grimmest implications for the future.  Steve Sailer, himself a tireless advocate of the nativist theory of IQ, reviews this new book and finds some interesting nuggets in it.  For example, Mr Sailer refers to figures, evidently included in the book, which indicate that while 40 percent of affluent American whites are now unaffiliated with any religion (as compared with 27 percent of their counterparts in the early 1970s), 59 percent of less well-off whites are now religiously unaffiliated (as compared with 35 percent of the same group in the earlier period).  That leads me to wonder if the very conservative, rather militant forms of Evangelical Christianity that are so popular among the white working class, as well as the right-wing political views that so often accompany that form of Christianity, are a sign that the individuals who profess them identify themselves as cadet members of the professional classes.  Their militancy, even when presented as a challenge to some relatively liberal subset of the upper middle class such as elite academics or Democratic Party politicians or leaders of mainline Protestant churches, advertises to all that they are church-goers, and thus strivers, not to be confused with the defeated mass who have lost interest in such institutions and faith in the promises they represent.

Timothy Stanley’s “Buchanan’s Revolution” looks back at the last antiwar rightist to make a splash as a US presidential candidate, Patrick J. Buchanan.  Mr Buchanan was one of the founders of The American Conservative, and the magazine still runs his column (including a recent one lauding Ron Paul.)  So it is no surprise that the treatment of him here is respectful.  However, in light of what was going on with race relations in the USA in 1992, it is sobering to see these passages:

Of all Pat’s buddies, the one most excited by his campaigns was columnist Samuel Francis, who had worked for North Carolina senator John East before landing a job with the Washington Times. Physically, he was a fearsome toad. The journalist John Judis observed that “he was so fat he had trouble getting through doors.” He ate and drank the wrong things and the only sport he indulged in was chess. The mercurial, funny, curious Francis was an unlikely populist. But he was ahead of the curve when it came to Pat’s insurgency.

Back in the 1980s, Francis had predicted an uprising against the liberal elite that governed America. The only people who would break their stranglehold were the ordinary folks who made up the ranks of the “Middle American Radicals,” or MARs. Mr. MARs was Mr. Average. He was either from the South or a European ethnic family in the Midwest, earned an unsatisfactory salary doing skilled or semi-skilled blue-collar work, and probably hadn’t been to college. He was neither wealthy nor poor, living on the thin line between comfort and poverty. All it took to ruin him was a broken limb or an IRS audit.

But Francis argued that the Middle American Radicals were defined less by income than by attitude. They saw “the government as favoring both the rich and the poor simultaneously… MARs are distinct in the depth of their feeling that the middle class has been seriously neglected. If there is one single summation of the MAR perspective, it is reflected in a statement … The rich give in to the demands of the poor, and the middle income people have to pay the bill.”

Preferring self-reliance to welfare feudalism, the MARs felt that the U.S. government had been taken captive by a band of rich liberals who used their taxes to bankroll the indolent poor and finance the cultural revolution of the 1960s. The MARs were a social force rather than an ideological movement, an attitude shaped by the joys and humiliations of middle-class life in postwar America. Any politician that could appeal to that social force could remake politics.

Two things made the MARs different from mainstream conservatives (and libertarians). First, not being rich, they were skeptical of wealthy lobbies. They hated big business as much as they hated big government. They opposed bailing out firms like Chrysler, or letting multinational companies export jobs overseas. They were especially critical of businesses that profited from smut, gambling, and alcohol. Although free market in instinct, they did appreciate government intervention on their behalf. They would never turn down benefits like Social Security or Medicare.

Second, the MARs were more revolutionary than previous generations of conservatives. Conservatives ordinarily try to defend power that they already control. But the MARs were out of power, so they had to seize it back. This was why conservatives like Buchanan behaved like Bolsheviks. “We must understand,” wrote Francis,

that the dominant authorities in… the major foundations, the media, the schools, the universities, and most of the system of organized culture, including the arts and entertainment—not only do nothing to conserve what most of us regard as our traditional way of life, but actually seek its destruction or are indifferent to its survival. If our culture is going to be conserved, then we need to dethrone the dominant authorities that threaten it.

Buchanan agreed. He wrote, reflecting on Francis’s words, “We traditionalists who love the culture and country we grew up in are going to have to deal with this question: Do we simply conserve the remnant, or do we try to take the culture back? Are we conservatives, or must we also become counter-revolutionaries and overthrow the dominant culture?”

The populist counter-revolution that Francis proposed was not explicitly racial. In theory, Hispanic or black industrial workers were just as threatened by economic change and high taxes as their white co-workers. And the cultural values of Hispanic Catholics and black Pentecostals were just as challenged by liberalism as those of their white brethren. But in Francis’s view, these ethnic groups had become clients of the liberal state. Only political correctness—argued Francis—prevented whites from admitting this and organizing themselves into their own ethnic interest group. In this worldview, the Democrats gave handouts to African-Americans in exchange for votes. Hispanics were brought in from Mexico to lower wages and break unions, providing cheap domestic labor for the ruling class and maximizing corporate profits. The only people without friends in high places were the middle-class white majority.

Buchanan and Francis disagreed over this point. Pat was concerned about the decline of Western civilization. But he never saw Western society in explicitly racial terms. He opposed both welfare and mass immigration, but he thought they hurt blacks and Hispanics as much as whites. Francis believed that human characteristics—including intelligence—were shaped by race.

And:

During the primary, (economist Harry) Veryser arranged a meeting between himself, Pat, Francis, and (scholar Russell) Kirk. Buchanan and Francis behaved as if no one else was there, and Pat sat in rapt silence listening to his friend expand upon the coming revolution. It was an intellectual romance, said Veryser. Harry was embarrassed, Kirk was furious that he wasn’t paid the attention he deserved. Both concluded that Buchanan was in love with Francis’s mind, that he truly believed that the two men could remake the world. Francis was a true believer, and his zeal infected Pat. He gave to Buchanan’s peculiar rebellion the theoretical structure of a popular revolution.

I used to read Samuel T. Francis’ column in Chronicles magazine.  It was a microcosm of Chronicles itself: full of one fascinating bit after another, often making the most interesting sort of points, and then, by the way, dropped in the middle someplace, a bizarre remark that could only be attributed to racism.  In one of the last to appear before his death in 2005, he was going on about the things that American children ought to, but don’t, learn in public schools.  He was developing a powerful vision of public education as a vehicle for cultural continuity and the formation of a common national heritage.  It was thrilling stuff, if not entirely convincing, until the middle of the fifth or sixth paragraph when he listed among the things that all Americans should learn in school “why slavery was right, and why the South was right to maintain it as long as it did.”  Then he went back to being interesting, but really, it was hard to focus after that.  And really, all of his columns were like that, brilliant, fascinating, and marred beyond saving by such outlandish remarks.  When The American Conservative started in 2002, Dr Francis was an occasional contributor, writing three articles for the magazine (one each in 2002, 2003, and 2004.)  The editorial team there evidently took more of an interest than did their counterparts at Chronicles in keeping the racialist content of his columns to a minimum, so that there were no true lightning bolts of lunacy.

Dr Francis, to the embarrassment of his more respectable friends, called himself a white nationalist and socialized with David Duke.  In the 1980s and early 1990s, Dr Francis was a figure of some influence.  The “job with the Washington Times” that Mr Stanley mentions was that of editorial page director.  That a man of his views could attain such a position is another marker of how raw the racial resentments of whites were in the Rodney King era. In his obituary of Dr Francis for The American Conservative, Scott McConnell wrote that at Dr Francis’ funeral he found himself talking with none other than Jared Taylor.  Mr Taylor said that the cab driver who took him from the airport to the funeral had asked who Dr Francis was.  In response, Mr Taylor proclaimed “He stood up for white people!”  The cab driver, a white workingman in Chattanooga, Tennessee, was visibly shocked and uncomfortable.  I very much doubt that many like him would have been upset by such a remark 14 years before.

One of Ron Paul’s rivals for the Republican nomination, former Massachusetts governor Willard Milton Romney (known familiarly as “Mitt”) is mentioned by name in a review of economist Bruce Bartlett’s book, The Benefit and the Burden: Tax Reform, Why We Need It, and What It Will Take.  Mr Bartlett was a staffer for Dr Paul in the 1970s, but has not been associated with him in recent years.  Reviewer Tom Pauken quotes Bartlett as saying that the USA’s corporate income tax exempts money spent on interest payments, but does not give such favorable treatment to money returned to shareholders in dividends.  It is unsurprising, then, that US businesses raise vastly more money by borrowing than by selling equity.  Mr Pauken says that this situation “has been great for private-equity moguls and leveraged buy-out operators like Mitt Romney and Stephen Schwarzman, who have made fortunes gaming the system.  But it has been destructive to the long-term health of many US companies and to American workers who have lost jobs as a consequence of tax incentives that encourage companies to pile up debt.”  Mr Bartlett calls for the repeal of the corporate income tax and of several other taxes, and their replacement by a border-adjusted value added tax.  I’ve endorsed similar proposals here, often under Mr Bartlett’s influence, and am glad to see that he is still working the old stand.  As for the connection to Mr Romney, I would mention a link I posted on our tumblr page to a recent column by Paul Rosenberg called “Mitt Romney, ‘Welfare Queen.’”  The caption I gave that link was “In the USA, corporations can write interest payments off their income taxes, while they have to pay taxes on dividends they pay shareholders.  So, shareholders collect almost nothing in dividends, while banks and private equity firms collect trillions of dollars in interest payments.  Those interest payments are an alternative form of taxation, and people like Willard M. Romney are tax recipients, not taxpayers.”  I think this is a reasonably fair summary of Mr Rosenberg’s argument, though Mr Bartlett’s views are somewhat more complex.
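To make the size of that incentive concrete, here is a back-of-the-envelope illustration of my own; it assumes the 35 percent statutory federal corporate rate then in force, and the figures are mine rather than Mr Bartlett’s or Mr Pauken’s:

\[
\begin{aligned}
\text{\$1.00 of operating income paid out as interest:}\quad & \text{fully deductible, so the lender receives } \$1.00;\\
\text{\$1.00 of operating income paid out as a dividend:}\quad & \$1.00 \times (1 - 0.35) = \$0.65 \text{ reaches the shareholder.}
\end{aligned}
\]

On these assumptions, a dollar of profit returned as a dividend is worth roughly a third less to its recipient than the same dollar returned as interest, which, if I read the review correctly, is the asymmetry Mr Pauken has in mind when he says the tax code encourages companies to pile up debt.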

A few months ago, I noted here a column about the Revised Common Lectionary that Philip Jenkins had contributed to Chronicles magazine.  Professor Jenkins argued that the committees that produced that selection of Bible readings had left out all of the passages in which God is shown commanding or praising violence, thus creating a false impression of the scriptures.  Professor Jenkins has presented that argument at book length, in a volume called Laying Down the Sword: Why We Can’t Ignore the Bible’s Violent Verses.  Patrick Allitt’s review of Professor Jenkins’ book in this issue draws out some interesting points.  For example, the books of Joshua and Judges, which include many of the Bible’s most bloodthirsty passages, describe events that supposedly occurred in the late Bronze Age, but in fact were written at least 600 years after that period.  That means not only that the massacres they celebrate are unlikely to have taken place (archaeologists have found no residue of such conflicts), but also that they were written at about the same time as, and very likely as part of a dialogue with the authors of, the passages about social justice and universal benevolence that warm the hearts of those who read the books of Ezekiel, Amos, and Isaiah.  The thorny passages in Deuteronomy also date from this relatively late period.  So to suppress the Mr Angry Guy passages from the Heptateuch is to misrepresent the Mr Nice Guy passages from the prophets.  I should mention that elsewhere on the magazine’s website, blogger Noah Millman appends a nifty bit of rabbinical logic to the review.

Intellectuals on the traditionalist right often mention the name of philosopher Eric Voegelin.  The late Professor Voegelin’s works are too deep for the likes of me, but an essay by Gene Callahan about his ideas in this issue of the magazine had me thinking of making another attempt at reading one of Professor Voegelin’s books, most likely The New Science of Politics (simply because it’s the one I’ve made the most progress with in my previous attempts.)  Of the many extremely interesting bits in Professor Callahan’s essay, the most interesting to me was his summary of a notion Professor Voegelin labeled the “hieroglyph.”  By this word, Professor Voegelin evidently meant “superficial invocations of a preexisting concept that failed to embody its essence because those invoking it had not experienced the reality behind the original concept.  As hieroglyphs, the terms were adopted because of the perceived authority they embodied.  But as they were being employed without the context from which their original authority arose, none of these efforts created a genuine basis for a stable and humane order.”

I think this notion might explain a great deal.  Take for example a term like “national security.”  In such a place as the USA in the early nineteenth century, a poor country with a tiny population, a vast border, a radically decentralized political system, and every empire of Europe occupying territory in the immediate neighborhood, a patriot might very well advocate an aggressive program of territorial expansion, political consolidation, and a military buildup.  Such steps might well have been necessary for the infant USA to maintain its independence.  Today, however, such policies only weaken the United States.  Our international commitments empower our enemies, our national government threatens our liberties, our military expenditures divert capital from productive uses and weigh heavily on the economy as a whole.  To secure the blessings that make the United States of America worth living in and dying for, we must be prepared to revise or discontinue all of the policies customarily justified under the rubric of “national security.”

Likewise with the term “free market.”  As someone like Mr Bartlett has done so much to demonstrate, our current financial and corporate elites by no means owe their preeminence to success in unfettered competition.  Rather, they are the figures who have been most successful at manipulating a system that is defined and sustained by the continual involvement of government in every phase of economic life.  And yet even those among the rich who are most blatantly tax-recipients find defenders who speak of them as if they were so many Robinson Crusoes, in possession of nothing but that which they themselves had wrested single-handed from nature.  Virtually all conservatives and most libertarians are guilty of this form of hieroglyphic use of the term “free market” and its accompanying imagery at least occasionally.  Some libertarians, like the aforementioned Murray Rothbard, acknowledge the fact that the existing economic system is not a free market in any meaningful sense, and so speak not of a “free market” that is to be defended, but of a “freed market” that is to be created when our currently existing economic system is abolished.  The late Professor Rothbard and his followers frankly call the existing system, the one which they find unacceptable, “capitalism.”  For my part, I am perfectly willing to accept and defend the system Rothbardians call capitalism, though I would also call for a recognition that where there is subsidy, there must also be regulation.  And of course I would hope that we would have a lively democratic political culture that would guide our regime of subsidy and regulation to aim at socially desirable ends, rather than simply functioning as a means by which the power elite can entrench its position at the top of the economic and political order.

*I don’t actually agree with Mr McConnell that Llewellyn Rockwell is the likeliest author of the articles in question.  The most obnoxious piece, which in fact contains all of the tropes that drew fire in the other pieces, appeared under the byline “James B. Powell.”  A man by that name did in fact write for the Ron Paul newsletters, and is today a member of the board of directors of the Forbes Corporation.

The New York Review of Books, 22 December 2011

I subscribed to The New York Review of Books for years and years.  I kept renewing because interesting pieces would appear in it just as my subscription was about to expire; then it would go back to its usual unrelieved tedium for another 11 1/2 months.  Anyway, I saw a copy of the 22 December 2011 issue in a magazine exchange rack the other day and picked it up.  It’s a good thing I no longer subscribe; this is exactly the sort of issue that would have led me to renew.

Michael Tomasky reviews sometime presidential hopeful Herman Cain’s campaign autobiography.  This sentence intrigued me:  “While some of us may scoff at a man whose claims to fame include peddling Whoppers (Cain turned around the Philadelphia regional division of Burger King) and pizzas (he was for ten years CEO of Godfather’s Pizza, which he also made profitable) to an increasingly obese nation with less and less need of them, conservatives find virtually any form of private-sector achievement admirable.”  In the USA, academics, journalists, and others in the nonprofit world are routinely challenged to justify their existence in terms of the value of their services to society at large.  Success in business, by contrast, is generally accepted as self-justifying.  I’ve lived in the USA long enough to find it a bit jarring, in fact, to hear Tomasky step outside this paradigm and treat business as an activity like any other.

Ingrid Rowland reviews Robert Hughes’ Rome: A Cultural, Visual, and Personal History.  Rowland meditates on the coexistence of Rome’s historical patrimony and the dominance of mafia groups in the city’s business life.  I wonder if the two things can be separated.  The only cities I can think of that have decisively broken mafia control are Las Vegas and New York, and in each case the slayer of the mafia was the unfettered multinational corporation.  That’s hardly an entity that would be likely to preserve the signs of eternity in the Eternal City.

Freeman Dyson reviews Daniel Kahneman’s Thinking, Fast and Slow.  Kahneman’s theme, Dyson tells us, is the power of “cognitive illusions,” which he defines as “false belief[s] that we intuitively accept as true.”  Kahneman began his career by identifying what he calls the “illusion of validity,” the idea that the conclusions people intuitively draw about topics on which they are well-informed are likely to be true.  As a very young researcher in the Israeli army in 1955, Kahneman was called upon to evaluate and, eventually, to replace the system the army was then using to place recruits in jobs.  That system was based on the opinions that experienced officers formed after brief, informal interviews with recruits.  Kahneman found that those opinions had no correlation with the recruits’ eventual performance.  He then designed a brief factual questionnaire for recruits to fill out and a mechanical method of analyzing its results, a method which turned out to be quite accurate at predicting recruits’ performance, and which has been the basis of assignments in the Israel Defense Forces ever since.  Dyson follows this with a story from his own experience in the Royal Air Force during World War Two, when changes that would have made bombers likelier to complete their missions were made impossible by the unwillingness of their crews to admit a fact that statistical analysis made achingly plain: bombers carrying experienced crews were just as likely to be shot down as bombers carrying inexperienced crews.  The illusion of validity was at work here as well; the idea that they were acquiring expertise they could use to save themselves gave the crews a self-confidence that they would not exchange for safer planes.
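(A toy illustration, for readers who like to see the arithmetic: the little sketch below is mine, not Kahneman’s or Dyson’s, and every number and name in it is invented.  It only shows the shape of the “illusion of validity” point: a confident interviewer rating that is in fact just noise ends up with next to no correlation with later performance, while even a crude mechanical sum of a couple of genuinely relevant questionnaire items tracks it reasonably well.)

import random
import statistics

def pearson(xs, ys):
    # Plain Pearson correlation, computed from scratch.
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(0)
n = 200

# Invented recruits: two factual traits that genuinely drive later performance.
traits = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(n)]
performance = [0.6 * a + 0.4 * b + random.gauss(0, 1) for a, b in traits]

# The interviewer's confident rating: in this toy model, pure noise.
interview = [random.gauss(0, 1) for _ in range(n)]

# The "mechanical method": a fixed, dumb sum of the two questionnaire items.
mechanical = [a + b for a, b in traits]

print("interview rating vs performance: ", round(pearson(interview, performance), 2))
print("mechanical score vs performance: ", round(pearson(mechanical, performance), 2))

Run it and the first correlation comes out near zero while the second is substantially higher; the particular figures mean nothing, only the contrast does.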

Dyson explains the title of Kahneman’s book in terms of his thesis that cognition should be analyzed as two systems, which Kahneman calls System One and System Two.  System One, our inheritance from our early primate ancestors, is fast and inaccurate; System Two, the product of our neocortex, is much more accurate but very slow.  In the fast-changing conditions of life in the arboreal canopies where our distant ancestors lived, it was far more important to be fast than to be right.  If a predator was coming, immediate movement in any direction was likelier to lead to safety than long-delayed movement in the ideal direction.  Indeed, the RAF crews who resisted the changes Dyson and his fellow analysts recommended on statistical grounds found themselves in an environment very like the one in which our lemur-like forebears darted about, and so could hardly be blamed for favoring System One reasoning over System Two.

Dyson puts in a good word for two thinkers whom Kahneman does not mention, William James (whom Rowland also mentions, for his telling in The Varieties of Religious Experience of the story of how Alphonse Ratisbonne converted to Christianity) and Sigmund Freud.  Dyson argues that Freud anticipated many of Kahneman’s key concepts, notably availability bias (that is, “a biased judgment based on a memory that happens to be quickly available. It does not wait to examine a bigger sample of less cogent memories.”)  Here’s what Dyson says about James:

James was a contemporary of Freud and published his classic work, The Varieties of Religious Experience: A Study in Human Nature, in 1902. Religion is another large area of human behavior that Kahneman chooses to ignore. Like the Oedipus complex, religion does not lend itself to experimental study. Instead of doing experiments, James listens to people describing their experiences. He studies the minds of his witnesses from the inside rather than from the outside. He finds the religious temperament divided into two types that he calls once-born and twice-born, anticipating Kahneman’s division of our minds into System One and System Two. Since James turns to literature rather than to science for his evidence, the two chief witnesses that he examines are Walt Whitman for the once-born and Leo Tolstoy for the twice-born.

Freud and James were artists and not scientists. It is normal for artists who achieve great acclaim during their lifetimes to go into eclipse and become unfashionable after their deaths. Fifty or a hundred years later, they may enjoy a revival of their reputations, and they may then be admitted to the ranks of permanent greatness. Admirers of Freud and James may hope that the time may come when they will stand together with Kahneman as three great explorers of the human psyche, Freud and James as explorers of our deeper emotions, Kahneman as the explorer of our more humdrum cognitive processes. But that time has not yet come.

Lorrie Moore reviews Werner Herzog’s film Into the Abyss: A Tale of Death, a Tale of Life.  This bit, describing the death house ordinary, stuck in my mind:

The reverend is against the death penalty but in thinking of it before the camera he veers off onto an anecdote about a golf trip and almost hitting a squirrel that “had stopped in the middle of the cart path,” and we can see how when pressed to illuminate its own contradictions the human mind can go on the fritz.  This may really be Herzog’s theme.  There is much strain and helplessness felt by the functionaries asked to dole out this ritualized punishment.

I can’t help but wonder what Kahneman would make of these flailings of a mind “on the fritz.”  Moore describes another of Herzog’s interview subjects, a former executioner named Fred Allen, who had to quit his job and forswear his pension because he couldn’t stop visualizing the faces of the hundreds of condemned men whose lives he had ended in the death chamber at Huntsville prison.  That sounds like a cognitive illusion worth cultivating in everyone inclined to set up shop in the killing business.

Kwame Anthony Appiah reviews two new books about W. E. B. DuBois, Lawrie Balfour’s Democracy’s Reconstruction: Thinking Politically with W. E. B. DuBois and Robert Gooding-Williams’ In the Shadow of DuBois: Afro-Modern Political Thought in America.  Appiah notes two facts that impede a proper study of DuBois.  Again, I wonder what label Kahneman would put on these cognitive illusions.  The first is that DuBois’ great longevity tempts us to see him as a more nearly contemporary figure than he in fact was.  His death date, 27 August 1963, is in many ways less illuminating of his thought than his birth date, 23 February 1868.  The second is that he still ranks as a sort of patron saint of intellectual achievement among African Americans, so that any attention to his limitations may be taken as an attack on all such achievement.  Appiah acclaims Gooding-Williams and Balfour for having the courage to venture into these sacred precincts and do scholarly work there.

According to Appiah, Gooding-Williams finds three ideas at the heart of DuBois’ political thought: first, that politics is in essence the exercise of command over a community.  Second, that this command is rooted in and to some extent tempered by “political expressivism,” a process by which those who are to be led recognize as their leaders those individuals who best express what they regard as the essence of their common life, what DuBois meant by “soul” in the title of The Souls of Black Folk.  Third, that the main political issue facing African Americans was social exclusion, which in turn resulted from the twin evils of racial prejudice among whites and “the cultural (economic, educational, and social) backwardness of the Negro.”

These three points set DuBois at odds with Frederick Douglass, who saw healthy politics as essentially a matter of collaboration among equals rather than of command and control; who rejected nationalistic conceptions of leadership as collective self-expression; and who saw white supremacy, that negation of collaborative politics, as an evil quite apart from the social exclusion of African Americans.  Gooding-Williams, Appiah argues, uses Douglass as a mouthpiece for his own democratic vision of politics, one in which leaders must listen to the actual voices of their followers rather than to their collective soul.

Appiah ends with an interesting question about DuBois’ archnemesis, Booker T. Washington:

Could it be right to act like Booker T. Washington, deferring a demand for justice for yourself if that would bring justice more swiftly for your descendants?  Or is there something so discreditable, so slavish, in acceding to these injustices that it is better to resist them, whether or not your resistance brings forward the date when they will cease?

My inclination is to ask how we could know that any given act of deferring a demand for justice would in fact bring justice more swiftly for our descendants.  If we had a God’s eye perspective from which we could know with certainty that it would, the question would be one we could analyze coolly and rationally, in what Kahneman might call a System Two manner.  But given the limits on what we can in fact know about the future, surely the best course is to set an example of resistance, however futile it might be in the short term, in the hope, however ill-founded, that our descendants might hear of it and be inspired to emulate it.  Under that scenario, both our own action and the action we would hope to inspire in our descendants would be the results of System One reasoning, bold and drastic and very likely to be misguided; but I don’t see how, under real-world conditions, a policy generated solely by System Two reasoning could lead us to anything other than a situation in which full equality between whites and African Americans remains forever in the future.

Malise Ruthven reviews Hamid Dabashi’s Shi’ism: A Religion of Protest.  Ruthven talks a bit about the paradox that results when Westerners compare the Sunni/Shia split to the Protestant/Catholic split.  Shias, with their hierarchy, shrines, and veneration of saints, are often compared to Roman Catholics, while Sunnis, with their many sects, internationalist themes, and iconoclastic tendency, are often compared to Protestants.  Yet Shiism is at its heart a protest against Sunni ascendancy.  So at moments it is appealing to compare Shiism with Protestantism.

This discussion obviously doesn’t get one very far, since the very definition of an analogy is a comparison between things that are in other respects dissimilar.  In some ways the Shias are a bit like the Catholics, in other ways they are a bit like the Protestants, in a great many ways they aren’t much like either.

Interesting to me was Dabashi’s rejection of Max Weber’s characterization of Muhammad (Appiah also mentions Weber, commenting on Weber’s admiration for DuBois and his skepticism about democracy).  For Dabashi, Weber’s view of Muhammad as an “ethical prophet” rather than an “exemplary prophet” is too schematic and conceals the ideological difference at the heart of the Sunni/Shia split.  Dabashi argues that the two branches have different ways of dealing with what Weber would call the exemplary character of the prophet.  Sunnis, says Dabashi, tend to believe that shari’ah law can absorb the prophet’s example and teach the community to cultivate virtue, while Shias hold that a living imam must embody that example in the presence of the community if its members are to know what is virtuous.

Dyson says that “religion does not lend itself to experimental study,” and so remains outside Kahneman’s focus.  Nonetheless, I wonder how Kahneman might analyze this difference.  It sounds to me as though the Sunni ideal is a System Two prophet: Muhammad converted from living man into the rational processes of the law.  The Shia ideal, by contrast, sounds like a System One prophet: the Muhammad who gave us a line of successors who are not themselves prophets, but who share the prophet’s intuitive understanding of right conduct and arouse the same understanding in us by the influence of their example.

Ruthven mentions the political sociologist Sami Zubaida, who has written a number of pieces about the contradictions in the Iranian political system that arise because that country’s constitution locates sovereignty both in God and in the people.  He also mentions Dabashi’s admiration for Philip Rieff.

Tim Parks and Per Wästberg exchange views on the question, “Do We Need the Nobel?”  Considering the Nobel Prize for Literature, Mr Parks takes on a job that strikes me as absurdly easy: proving that no group of eighteen people can be taken seriously as the judges of all the world’s contemporary literatures.  Mr Wästberg can respond only by describing the lengths to which he and his fellow members of the Swedish Academy go in attempting the impossible task of giving fair consideration to all the living writers in all the languages of the world.  Mr Parks receives this description with good grace, but sees in it no rebuttal of his main point.