The way out of philosophy runs through philosophy

There’s a phrase I’ve been thinking about for years, ever since I read it somewhere or other in Freud: “the moderate misery required for productive work.”  It struck me as plausible; someone who isn’t miserable at all is unlikely to settle willingly into the tedious, repetitive tasks that productive work often involves, while someone who is deeply miserable is unlikely to tolerate such tasks long enough to complete them.  If blogging counts as productive work, I myself may recently have been a case in point.  Throughout the summer and into the autumn, I wasn’t miserable at all, and I barely posted a thing.  Then I caught a cold, and I posted daily for a week or so.  If I’m typical of bloggers in this respect, maybe I could also claim to have something in common with a philosopher.  Samuel Johnson’s friend Oliver Edwards once quipped that he had tried in his time to be a philosopher, but couldn’t manage it.  The cause of his failure?  “Cheerfulness was always breaking in.”

One item I kept meaning to post notes on while cheerfulness was distracting me from the blog was a magazine article about Johnson’s contemporary, David Hume.  Hume, of course, was a philosopher; indeed, many would argue that he was “the most important philosopher ever to write in English.”  Contrary to what Edwards’ remark might suggest, however, Hume was suspected of cheerfulness on many occasions.  The article is by Hume scholar and anti-nationalist Donald W. Livingston; despite the radicalism of Livingston’s politics (his avowed goal is to dissolve the United States of America in order to replace it with communities built on a “human scale”), in this article he praises Hume as “The First Conservative.”  Hume’s conservatism, in Livingston’s view, stems not only from his recognition that oversized political units such as nation-states and continental empires are inherently degrading to individuals and destructive of life-giving traditions, but also from his wariness towards the philosophical enterprise.  Hume saw philosophy as a necessary endeavor, not because it was the road to any particular truths, but because philosophical practice alone could cure the social and psychological maladies that the influence of philosophy had engendered in the West.

This is the sort of view that we sometimes associate with Ludwig Wittgenstein; it’s easy to find books and articles with titles like “The End of Philosophy” and “Is Philosophy Dead?” that focus on Wittgenstein.  But Livingston demonstrates that Hume, writing more than a century and a half before Wittgenstein, had already made just such an argument.  Livingston’s discussion of Hume’s Treatise of Human Nature (first published in 1739-1740) is worth quoting at length:

Hume forged a distinction in his first work, A Treatise of Human Nature (1739-40), between “true” and “false” philosophy.  The philosophical act of thought has three constituents. First, it is inquiry that seeks an unconditioned grasp of the nature of reality. The philosophical question takes the form: “What ultimately is X?” Second, in answering such questions the philosopher is only guided by his autonomous reason. He cannot begin by assuming the truth of what the poets, priests, or founders of states have said. To do so would be to make philosophy the handmaiden of religion, politics, or tradition. Third, philosophical inquiry, aiming to grasp the ultimate nature of things and guided by autonomous reason, has a title to dominion. As Plato famously said, philosophers should be kings.

Yet Hume discovered that the principles of ultimacy, autonomy, and dominion, though essential to the philosophical act, are incoherent with human nature and cannot constitute an inquiry of any kind.  If consistently pursued, they entail total skepticism and nihilism. Philosophers do not end in total skepticism, but only because they unknowingly smuggle in their favorite beliefs from the prejudices of custom, passing them off as the work of a pure, neutral reason. Hume calls this “false philosophy” because the end of philosophy is self-knowledge, not self-deception.

The “true philosopher” is one who consistently follows the traditional conception of philosophy to the bitter end and experiences the dark night of utter nihilism. In this condition all argument and theory is reduced to silence. Through this existential silence and despair the philosopher can notice for the first time that radiant world of pre-reflectively received common life which he had known all along through participation, but which was willfully ignored by the hubris of philosophical reflection.

It is to this formerly disowned part of experience that he now seeks to return. Yet he also recognizes that it was the philosophic act that brought him to this awareness, so he cannot abandon inquiry into ultimate reality, as the ancient Pyrrhonian skeptics and their postmodern progeny try to do. Rather he reforms it in the light of this painfully acquired new knowledge.

What must be given up is the autonomy principle. Whereas the false philosopher had considered the totality of pre-reflectively received common life to be false unless certified by the philosopher’s autonomous reason, the true philosopher now presumes the totality of common life to be true. Inquiry thus takes on a different task. Any belief within the inherited order of common life can be criticized in the light of other more deeply established beliefs. These in turn can be criticized in the same way. And so Hume defines “true philosophy” as “reflections on common life methodized and corrected.”

By common life Hume does not mean what Thomas Paine or Thomas Reid meant by “common sense,” namely a privileged access to knowledge independent of critical reflection; this would be just another form of “false philosophy.” “Common life” refers to the totality of beliefs and practices acquired not by self-conscious reflection, propositions, argument, or theories but through pre-reflective participation in custom and tradition. We learn to speak English by simply speaking it under the guidance of social authorities. After acquiring sufficient skill, we can abstract and reflect on the rules of syntax, semantics, and grammar that are internal to it and form judgments as to excellence in spoken and written English. But we do not first learn these rules and then apply them as a condition of speaking the language. Knowledge by participation, custom, tradition, habit, and prejudice is primordial and is presupposed by knowledge gained from reflection.

The error of philosophy, as traditionally conceived—and especially modern philosophy—is to think that abstract rules or ideals gained from reflection are by themselves sufficient to guide conduct and belief. This is not to say abstract rules and ideals are not needed in critical thinking—they are—but only that they cannot stand on their own. They are abstractions or stylizations from common life; and, as abstractions, are indeterminate unless interpreted by the background prejudices of custom and tradition. Hume follows Cicero in saying that “custom is the great guide of life.” But custom understood as “methodized and corrected” by loyal and skillful participants.

The distinction between true and false philosophy is like the distinction between valid and invalid inferences in logic or between scientific and unscientific thinking. A piece of thinking can be “scientific”—i.e., arrived at in the right way—but contain a false conclusion. Likewise, an argument can be valid, in that the conclusion logically follows from the premises on pain of contradiction, even if all propositions in the argument are false. Neither logically valid nor scientific thinking can guarantee truth; nor can “true philosophy.” It cannot tell us whether God exists, or whether morals are objective or what time is. These must be settled, if at all, by arguments within common life.

True philosophy is merely the right way for the philosophical impulse to operate when exploring these questions. The alternative is either utter nihilism (and the end of philosophical inquiry) or the corruptions of false philosophy. True philosophy merely guarantees that we will be free from those corruptions.

This is rather like one of Friedrich Nietzsche’s parables, from Also sprach Zarathustra (1883-1885).  Nietzsche’s Zarathustra preaches that the spirit must become a camel, so as to bear the heaviest of all weights, which is the humiliation that comes when one discovers the extent of one’s ignorance, together with the commitment to remedy that ignorance; that it must then put the camel aside and become a lion, so that it may slay the dragon of “Thou-Shalt” and undertake to discover its own morality; and that at the last it must become a child, so that it may put that struggle behind it and be ready to meet new challenges, not as reenactments of past triumphs, but on their own terms.  According to Livingston, Hume, like Nietzsche, sees the uneducated European as a half-formed philosopher, and believes that with a complete philosophical education s/he can become something entirely different from a philosopher.

Mr O’s “anti-nuclear imperialism”

Let me tell you about a better way, a way that protects the purity of our precious bodily fluids.

The late September issue of Counterpunch (available to subscribers here; the newsletter’s website is here) includes a fine article by Darwin Bond-Graham titled “The Obama Administration’s Nuclear Weapons Surge.”  While Mr O has made many remarks declaring that nuclear weapons are bad and the world would be better off without them, he has in fact “worked vigorously to commit the nation to a multi-hundred-billion-dollar reinvestment in nuclear weapons, mapped out over the next three decades.”  Bond-Graham analyzes the New START agreement between the USA and Russia.  Though the publicity surrounding New START presented it as an arms-reduction treaty, Bond-Graham contends that it is nothing of the kind.  “On balance, the nominal reductions in nuclear weapons required by New START are insignificant when compared to the multibillion-dollar nuclear (and strategic non-nuclear) weapons programs committed to in the treaty’s text.”  Indeed, Bond-Graham classifies New START as an “arms-affirmation treaty.”  Mr O and his allies in the upper echelons of the congressional Democratic leadership were able to market New START as a disarmament agreement and to enlist the support of Americans who usually oppose nuclear weapons, even though “the treaty does not actually require the destruction of a single nuclear warhead.”  Bond-Graham also goes into depth on various other programs through which Mr O has managed to increase spending on nuclear weapons, to reorient the USA’s nuclear weapons programs towards potential use in conflict, and to strip away inhibitions against nuclear first strikes by the USA.

For Bond-Graham, Mr O’s anti-nuclear public statements not only represent a rhetorical device to “neutralize” the “anti-nuclear and antiwar groups that so effectively exposed [George W.] Bush’s plans” to pursue policies similar to those of the current administration, but also constitute the foundation of a strategic orientation that Bond-Graham dubs “anti-nuclear imperialism.”  This orientation, ostensibly based on abhorrence of nuclear weapons, in fact promotes the development, maintenance, and deployment of such weapons.  Remember the claims that the Bush-Cheney administration made about Saddam Hussein’s alleged “Weapons of Mass Destruction” programs in 2002-2003, and the meaning of the phrase “anti-nuclear imperialism” becomes all too clear.

The contextualization fairy

Recently, John Holbo posted two items (here and here) on Crooked Timber about something odd in American politics.  Right-wing politicians in the USA quite often make public statements that would, if taken at face value, suggest that they are far more extreme in their views than they in fact are.  So, Professor Holbo finds remarks from Texas governor Rick Perry which, taken literally, would imply that Mr Perry believes that Texas should secede from the USA, that all federal programs established since 1900 should be abolished, indeed that there should be no government at all.  Mr Perry so obviously believes none of those things that only his committed opponents bother to take him to task for making such extreme remarks.  This is not unique to Mr Perry; it is a familiar pattern among right-wing US politicians.

What makes this so odd is that, while it is common for right-wing American politicians to exaggerate the radicalism of their views and for the public to realize that this is what they are doing, Professor Holbo can find no examples of their left-leaning counterparts doing the same thing.  A Democratic or leftist candidate who makes a radical-sounding statement likely means that statement to be taken at face value, and it certainly will be taken at face value by most observers.

Many commentators on American politics explain the right-wingers’ habit of making extreme-sounding statements for which they do not expect to be held responsible as an effort to move the “Overton Window.”  The Overton Window, named for the late Joseph P. Overton, is the range of ideas that the people who hold sway in a given political culture regard as acceptable at a particular time.  Only ideas within the window are likely to be put into effect.  The window shifts back and forth, as some ideas that had once seemed outlandish begin to seem mainstream, while other ideas that had once seemed mainstream begin to seem outlandish.

Key to the Overton Window is the idea of contextualization.  The idea of devolving Medicare, the program that ensures that most Americans over the age of 65 will be able to pay for health care, to the states may seem outlandish to many in the USA, but compared to the idea of large states seceding from the Union it is quite moderate.  The idea of shifting the revenues of Social Security, the program that provides a guaranteed income to most Americans over the age of 65, from current benefits to private savings accounts may seem outlandish to many in the USA, but compared with the idea of abolishing the entire welfare state it is quite moderate.  Other policies favored by powerful interests on the right end of the political spectrum may also seem outlandish, but compared with anarchism they too are quite moderate.  So, within the context of the extreme remarks for which they are not called to account, rightists can gain a hearing for policies which they do seriously advocate.

Brian Barder is a nice guy

Political blogging is not generally regarded as an activity that brings courtesy to the fore, but retired UK diplomat Brian Barder never fails to show good manners.  Though most of the topics he discusses are outside my usual circle of interests, I read him regularly, since it is such a pleasure to see politeness at work.

For example, the other day Brian Barder* posted a proposal about reforming the UK constitution.  Brian Barder has considerable expertise on this subject, and his proposal is sufficiently close to his heart that he has been working to promote it for some years.  My knowledge of the UK constitution is limited to what I picked up during the years I subscribed to The Economist, and as someone who does not live in the UK my stake in its reform is close to nil.  Yet I took the liberty of posting comments (here and here) in which I expressed skepticism about the practical aspects of his plan.  Brian Barder would have been perfectly within his rights to ignore my uninformed remarks, or even to dismiss them icily, yet he took the time to provide detailed responses to each of them.  He even emailed me to make sure that I knew he had done so.  Such generosity is not to be forgotten.

*I’d call him “Mr Barder,” but that isn’t actually his name.  He holds a knighthood, and so the proper courtesy title for him is “Sir Brian.”  I cannot bring myself to refer to any living person in that fashion; to me, it suggests only Monty Python.  So the only respectful way I can name him is as “Brian Barder.”  This is a shame, since Brian Barder is himself so scrupulous a user of courtesy titles that he titled his condemnation of the Oslo massacre “There are no lessons to learn from Mr Breivik.”  If Brian Barder can bring himself to give the correct title even to the murderer of dozens of helpless innocents, it seems churlish of me to withhold his title from him, yet I must, I must.

Some interesting comments by Michael Peachin about a new book on the Emperor Claudius

Proclaiming Claudius Emperor, by Lawrence Alma-Tadema

As a subscriber to Classical Journal, I regularly receive emailed reviews of new scholarly books concerning ancient Greece and Rome.  The other day, for example, they sent me Michael Peachin’s review of Claudius Caesar: Image and Power in the Early Roman Empire, by Josiah Osgood (Cambridge University Press, 2010).  The only other notice I’d seen of the book was a drearily dutiful one in The Bryn Mawr Classical Review, so I was surprised that Peachin found some exciting points in it.  I’ll quote two of these points:

Several recent accounts of Roman emperors have sailed off on a new tack. Instead of attempting a traditional biographical interpretation of the man, and thereby also a chronicle of his reign, each of these has sought to present an emperor on his own terms, and/or to view him as he was perceived by certain groups of contemporaries (other than the elite authors, who usually monopolize discussion). Thus, Caligula was not out of his mind; he simply had no taste for playing republic, when the reality was despotism; and so, he fashioned himself overtly as a tyrant, regardless of the consequences – or perhaps precisely to elicit certain ones of those (A. Winterling, Caligula: A Biography [Berkeley, 2011]).

When I was in graduate school, I took a seminar on Roman history in which the professor horrified about half the class by spending a day arguing that Caligula was probably not a lunatic.  A few of my classmates were committed to the view of the third emperor presented in the ancient historical texts, and were appalled to hear a revision of that view; the others were committed to the idea that the only sort of history worth doing was social history that focused on the most numerous groups in a society, and so were appalled that we were spending so much time on the question of one man’s mental health.  I was not in either of those groups, but loved the day and have been defending Caligula ever since.  By the way, there’s a fine review of Winterling’s book in September’s New Criterion.  I recommend it to the general reader.

Peachin makes a point that I found especially fascinating:

Augustus, in fine, had played his part well; but as Osgood aptly demonstrates, he fated all the various players in the sequel to write their own scripts as they went. In any case, Osgood argues that Claudius quite actively tried to shape his own time as emperor, and that in doing so, he contributed materially to the development of the imperial ‘system.’ As we observe this particular emperor at work, we are also being nudged slightly away from Fergus Millar’s picture of a more passive, and perhaps generic, sort of monarch (The Emperor in the Roman World [Ithaca, 1977]): “…who the emperor was mattered” [136]. Still, Osgood sees quite clearly that Claudius (or any emperor) was indeed only one person; and hence, the princeps’ direct involvement with his subjects was perforce limited. Thus, when an emperor did choose to intervene, the event was so momentous as to carry an aura of the divine. That said, Claudius was no lone actor. We are reminded, throughout, that “…much of this emperor’s image, like any other’s, was constructed in dialogue with his subjects” (317).

So, it was precisely because the emperor’s position was inherently weak that he inspired awe in his subjects.  This is just the sort of paradox I can never resist.

Many readers will be familiar with the theory that historian Arnaldo Momigliano developed and that Robert Graves popularized in his novels about Claudius.  Under this theory, Claudius wanted to phase out the principate and restore the old Republic.  Peachin explains Osgood’s view of this theory with admirable concision:

Following Momigliano’s observations (Claudius: the Emperor and His Achievement [Oxford, 1934]), Osgood stresses the fact that Augustus’ uneasy amalgam of republic and empire remained a befuddling puzzle for Claudius (indeed, for every emperor). In particular, the quasi-retention of a republican state meant that a new imperial system of government could not be crafted with anything even approaching clarity, or in any detail. Thus, to start at the start, when Gaius [a.k.a. Caligula] was murdered, and had not indicated a successor, a conclusively ‘proper’ or ‘constitutional’ way forward was nowhere to be discovered. That notwithstanding, Claudius was quickly on the throne; but then, the awkward facts of his accession, not to mention the earlier vituperation of him by members of the Augustan house (and others), seriously undercut his authority. Attempting to counter such hindrances, and just generally in his zeal to rule as he found appropriate, Claudius was too fastidious. The result was a nasty paradox: “The loftier the goals the emperor set for his administration, the more likely he was to fail, and to open himself to allegations of incompetency, or even corruption. Yet precisely to try to win loyalty and increase his prestige, Claudius had to set loftier goals than those of Tiberius, even those of Caligula” (189).

Considering that the written law in Rome in AD 41 was predicated on the idea that the Republic was still functioning, and that Claudius owed the principate to the very group of men who had just murdered his predecessor, it would have been quite a challenge for him to find a way to present his accession as legitimate without appealing to the idea of a restored Republic.  In no position to prosecute the assassins of Caligula, Claudius could only appeal to the right of tyrannicide, and thus evoke the two Brutuses: one who according to legend struck against the Tarquins in order to end the monarchy and establish the Republic, and the other who struck against Julius Caesar the Dictator in an attempt to prevent a new monarchy from ending the Republic.  If there were people who took this forced imposture at face value, one can hardly blame Claudius.

Did astrology originate in cities?

I wonder if the first astrologers were city-dwellers.  True, archeologists have found evidence that people who lived before the rise of cities paid close attention to the phases of the Moon and identified constellations, and have argued that the orientations of temples and other religious structures from those days suggest that their builders attached a religious significance to the movements of heavenly bodies.  Those activities are hardly surprising; farmers need a calendar to plan their year, just as hunter-gatherers need to plan their expeditions for times when game will be relatively plentiful and fruit ripe for the picking.  Still, it might not be too much of a stretch to look at a society that invests heavily in maintaining and publicizing its calendar and to see in it a suggestion of something like what the western branch of organized Christianity used to call “natural astrology,” a set of ideas about ways in which heavenly bodies might influence the earth’s weather and various medical phenomena related to the transmission of disease.

Quite distinct from natural astrology are the various studies to which the Western Church used to refer as “judicial astrology.”  That’s the part that includes horoscopes, sun signs, and the like.  The difference matters when considering the origins of astrology; we have very ancient documents that treat the movements of heavenly bodies as having some special significance, and these documents predate the earliest references to judicial astrology by centuries.  So, I’ll keep the two terms distinct.

It is sometimes said that our earliest evidence of judicial astrology comes from Mesopotamia, but that is misleading, since it suggests a single territorial society of the kind we know today.  The nation-state didn’t exist in those days; Ur and Lagash and Akkad and Babylon and the other urban centers that rose and fell in that region interacted with the political and economic systems of the countryside around them in a variety of ways, but in other ways they remained quite distinct.  It is in such cities, not in the region at large, that we find the first documents describing judicial astrology.

If astrology did arise in cities, it arose in a social environment where markets were familiar.  Its entire history would have taken place amid money, contracts, and production for exchange.  That calls into question the assumptions that we discussed last year when this xkcd appeared:

(The xkcd’s mouseover text: “Not to be confused with ‘selling this stuff to OTHER people who think it works,’ which corporate accountants and actuaries have zero problems with.”)

Some people fall into the assumption that, because markets promote something called “rationality,” they must therefore favor every form of reason and disfavor every form of unreason.  However, the rationality which comes from markets is in fact something of a very narrow sort.  A month after our discussion, we noted that Shikha Dalmia had put it very well: “Markets don’t reward merit, they reward value.”  Dalmia summarizes the views of economist Friedrich Hayek:

In a functioning market, Hayek insisted, financial compensation depends not on someone’s innate gifts or moral character. Nor even on the originality or technological brilliance of their products. Nor, for that matter, on the effort that goes into producing them. The sole and only issue is a product’s value to others. Compare an innovation as incredibly mundane as a new plastic lid for paint cans with a whiz-bang, new computer chip. The painter could become just as rich as the computer whiz so long as the savings from spills that the lid offers are as great as the productivity gains from the chip. It matters not a whit that the lid maker is a drunk, wife-beating, out-of-work painter who stumbled upon this idea through pure serendipity when he tripped over a can of paint. Or that the computer whiz is a morally stellar Ph.D. who spent years perfecting his chip.

As markets are neutral as to the virtue or vice of economic actors, so too are they neutral as to the truth or falsity of the ideas that those actors bring as products for sale.  If falsehoods are in demand, falsehoods will sell; if truths are not in demand, their bearers will go begging.  The mouseover text for the xkcd represents a nod to this fact, and an attempt to wriggle out of its implications: “Not to be confused with ‘selling this stuff to OTHER people who think it works,’ which corporate accountants and actuaries have zero problems with.”  That won’t do, since it assumes that we can assign a fixed meaning to the expression “works.”  An investment advisor who believes in astrology may not be any likelier than other advisors to beat the market, but s/he may very well use that belief to “make a killing,” if s/he attracts clients who strongly value such a belief.  In that case, astrology would not “work” in the sense that quantitative analysts officially recognize, but it would make the advisor every bit as rich as it would if it did meet their definitions of success.  As for whether it makes the clients rich, well, Fred Schwed answered that one in 1940:

Once in the dear dead days beyond recall, an out-of-town visitor was being shown the wonders of the New York financial district.  When the party arrived at the Battery, one of his guides indicated some handsome ships riding at anchor.  He said, “Look, these are the bankers’ and brokers’ yachts.”  “Where are the customers’ yachts?” asked the naive visitor.

Clearly, markets have not dissolved belief in astrology, any more than the continued non-existence of the customers’ yachts has discouraged people from going to brokers and bankers.  If the practice of judicial astrology first arose in cities, it may in fact be a by-product of market society.  Perhaps we might find that judicial astrology began, not simply as a more elaborate version of a natural astrology that had long been a feature of rural life, but as an attempt to understand market interactions and the power of the market.  In that case, it would qualify as a school of economics.  One may wonder whether judicial astrology would be the most absurd such school in practice today.

The Atlantic, October 2011

The current issue of The Atlantic contains four pieces on which I took notes.  All four of them had to do with masculinity in one way or another.

Historian Taylor Branch contributes an article about college sports in the USA.  Non-USA types may not be aware that colleges and universities in the United States operate sports franchises, some of which have a mass following and generate enormous revenue.  The athletes are not paid for their participation in this multibillion-dollar industry; they are not even compensated for injuries they receive in the course of play.  Branch outlines the story of how this preposterously unfair system came to exist, and considers several recent developments that may bring it to an end.  Athletes are symbols of masculinity in the USA, as elsewhere; the amateur ideal may once have been part of a concept of masculinity that some upper-class Americans cherished, but nowadays even volunteerism is often justified in terms of its resume-building potential.  Moneymaking has become the masculine activity par excellence.  So the National Collegiate Athletic Association’s (NCAA’s) model of the unpaid “student-athlete” is a bit of an anachronism.

A piece called “Sex and the Married Politician” includes several references to the fall of New York Congressman Anthony Weiner.  Mr Weiner resigned his seat in the US House of Representatives shortly after it emerged that he had posted a picture of his genitalia on Twitter.  It strikes me as misleading to call this story a “sex scandal.”  Since everything on Twitter is public, Mr Weiner’s offense was not illicit sexual relations, but indecent exposure.  As such, he is in a league with longtime Friendsville, Maryland mayor Spencer Schlosnagle, who in the mid-1990s pled guilty to charges stemming from several incidents in which he exposed himself to passersby on the highway.  Mr Schlosnagle paid a fine, went to a psychiatrist, and was reelected.  He continues in office today.  I think that the case of Mr Schlosnagle shows a community and a political system with a rational attitude towards mental illness.  Mr Schlosnagle initially tried to deny the charges against him; when the prosecution made such denials impossible, he accepted punishment and sought counseling, thus reducing the likelihood that he would reoffend.  Since his behavior was a real nuisance, the prosecution was rational.  On the other hand, it was only a nuisance, not a serious threat to anyone in particular; therefore, the voters’ decision to reelect him once he had shown that he was addressing his mental health problems was also rational.  Mr Schlosnagle disclosed that he had suffered sexual abuse as a child, thus disowning any model of masculinity that would require him to project an image of himself as invulnerable or invincible.  The description of Mr Weiner as the main figure in a “sex scandal,” by contrast, both obscures the fact that he doesn’t seem to have had any sexual contact with anyone and presents him as a menacingly potent figure.  I suppose it makes sense that he would have an easier time playing along with that image of himself than with presenting himself as a sick man compelled to behave in a somewhat annoying fashion.

The Library of America has finally devoted a volume to Ambrose Bierce, and this issue includes an admiring review of Bierce’s work and of the Library’s edition.  I liked this sentence: “Bierce, after all, has always been best known for being undeservedly unknown.”  Reviewer Benjamin Schwarz also makes some good points about Bierce’s lapidary style, such as this:

Bierce’s seminal contribution to American letters is that “sharp-edged and flexible style, like the ribbon of a wound-up steel tape-measure,” as Edmund Wilson perfectly defined it. But that style emerged from Bierce’s compulsion to reveal a truth that remains unacceptable—or only selectively acceptable—today. It’s all very nice to decry the horror of war, but to Bierce its obscenity and its meaninglessness were merely integral to those of life. Bierce’s friend the editor Bailey Millard explained why all the leading publishers of the day rejected Bierce’s war fiction: they “admitted the purity of his diction and the magic of his haunting power, but the stories were regarded as revolting.” Understandably so, given what Bierce knew to be our delusional and self-serving tendencies.

Schwarz approves of Bierce’s flatly declarative style, especially as regards the US Civil War in which Bierce fought with distinction.  He quotes Walt Whitman’s remark that “The real war will never get into the books,” then says: “And in fact, excepting Bierce’s work, it didn’t.”  That’s high praise indeed; Bierce, alone among the tens of thousands of authors who have published books on that conflict, succeeded in putting “the real war” into his books.  I’ve posted previously about Bierce’s characteristic pose as The Man Without Illusions; evidently this is a pose Schwarz accepts at face value, and a form of masculinity he values highly.

B. R. Myers contributes a brief review essay on Australian crime fiction.  He quotes this exchange from one such novel:

“I hear someone punched out that cunt Derry Callahan,” he said. “Stole a can of dog food too. You blokes investigatin that?”

Cashin frowned. “That right? No complaint that I know of. When it happens, we’ll pull out all the stops. Door-to-door. Manhunt.”

“Let’s see your hand.”

“Let’s see your dick.”

“C’mon. Hiding somethin?”

“Fuck off.”

Bern laughed, delighted, punched Cashin’s upper arm. “You fuckin violent bastard.”

Upon which Myers comments: “I grinned right along with that, as if I hadn’t left high school hoping never to have to hear such exchanges again.”  Indeed, talk like that is common among males of many ages and nationalities, and I can sympathize with Myers’ wish to escape from it, as well as with his admiration for that rather well-crafted specimen of it.

Friends Journal, September 2011

The September 2011 issue of Friends Journal includes a couple of brief pieces I wanted to note.

Geoffrey Kaiser writes of “Three Kinds of Singing in Meeting.”  Kaiser tells of an old document he found when he was visiting Quaker meetings in New England in 1980.  This document was an official statement that a monthly meeting* issued in 1675.  It classified singing in meeting for worship** into three categories: “Serious Sighing,” “Sensible Groaning,” and “Reverent Singing.”

Erik Lehtinen, at the time of writing an Episcopalian deacon, explains in “True Confessions of a Closet Quaker” that he has for some time been sneaking out of his church to attend a Friends*** meeting, and that he has decided to leave the Episcopal church and to join the Quakers.  Lehtinen writes that “Many seekers probably start by reading and being inspired by The Journal of George Fox.****”  Seekers who are graduates of an Anglican seminary may start that way, but I very much doubt that Fox’s journals, written as they were in haste, in the seventeenth century, and by a man whose ideas are challenging to moderns in many ways, are in fact very attractive to a significant percentage of any other population.  Still, it is useful to read Lehtinen’s description of Fox as “a fellow Anglican.”  Fox spent his youth in the Church of England, and never quite admitted that he had left that communion.

*”Monthly meeting” is a Quakerese expression that other Christian traditions might translate as “parish” or “local church”

**”Meeting for worship” is also Quakerese; one might say, “worship service”

***”Friends” is Quakerese for “Quakers.”  It’s a term that Quakers themselves find confusing, or claim to find confusing; they sometimes make a show of saying “friends, big ‘F’ and little ‘f’,” to highlight the fact that Friends can have friends who aren’t Friends.

****George Fox was the founder of Quakerism.  There are people who think that lines like “Friends can have friends who aren’t Friends” are hilarious; such people have also been known to look for opportunities to make puns about foxes.  So if you are thinking of joining with the Quakers, don’t say I didn’t warn you.

“We do not believe in appointing Deputies to do what we think it wrong for ourselves to do”

Grover Cleveland, before he entered politics

This summer Mrs Acilius and I read Ryan P. Jordan’s Slavery and the Meetinghouse, a study of the great difficulty American Quakers had in the years 1821-1861 as they tried to decide what approach to take to the issue of slavery.  Last night I was reminded of this passage, from pages 114 through 115 of Jordan’s book:

The editor of the National Anti-Slavery Standard, Sydney Howard Gay, wrote that the Anti-Slavery Society disagreed “with the philosophy of the Quaker[s]” who when appointed to political positions would not hang a man themselves but “would appoint a Deputy that would.”  “We do not believe,” continued Gay, “in appointing Deputies to do what we think to be wrong for ourselves to do.” 

Gay wrote these words in October of 1848, when many American Quakers were rallying to support the presidential campaign of slaveholder Zachary Taylor.  In the willingness of the ostensibly antislavery Quakers of the day to support a slaveholding president, Gay saw cowardice.  He equated this cowardice with what he saw as the same Quakers’ cowardice regarding the death penalty.  In the seventeenth century, the founders of Quakerism opposed the death penalty, and in many parts of the world that opposition continues even today in an unbroken line of tradition.  The Quakers Gay saw in the antebellum USA paid lip service to that tradition, but many of them merely hid behind others while they became complicit in executions.

What brought this to my mind last night was this tweet from author Michael Brendan Dougherty:

I don’t like Rick Perry. And I think he failed in his answer on this. But it is wrong to say that “Rick Perry has executed” people.

To which I responded:

@michaelbd “it is wrong to say that “Rick Perry has executed” people.” Better Grover Cleveland, who did the job personally, than to delegate

Only someone with a lively interest in nineteenth-century US history would be likely to know what I was talking about there, so permit me to explain.  In 1872, Stephen Grover Cleveland was sheriff of Erie County, New York.  The law of the state of New York in those days declared it to be the responsibility of the sheriff of each county to hang the prisoners condemned to death for crimes committed in that county.  As this 1912 New York Times article (pdf) put it, “In the office of Sheriff of Erie County there had for many years been a Deputy Sheriff named Jacob Emerick.  Mr Cleveland’s predecessors had from time immemorial followed the custom of turning over to Emerick all the details of public executions.  So often had this veteran Deputy Sheriff officiated at hangings that he came to be publicly known as ‘Hangman Emerick.’”  Evidently Emerick didn’t enjoy this sobriquet, and Cleveland noticed that the law explicitly named the High Sheriff as the officer responsible for hangings.  So when Patrick Morrisey was scheduled to be hanged on 6 September 1872, Cleveland resolved to execute Morrisey himself.  To return to the Times article, “Cleveland surprised the community and his friends by announcing that he personally would perform the act of Executioner.  To the remonstrances of his friends he refused to listen, pointing to the letter of the law requiring the Sheriff to ‘hang by the neck,’ &c.  He furthermore insisted that he had no moral right to impose upon a subordinate the obnoxious and degrading tasks that attached to his office.  He considered it an important duty on his part to relieve Emerick as far as possible from the growing onus of his title of ‘Hangman.’”  The following year, Cleveland again acted as hangman, putting one John Gaffney to death.  Cleveland was subsequently elected mayor of Buffalo, then governor of New York.  He was the Democratic Party’s candidate for president of the United States in 1884, 1888, and 1892, winning the popular vote on all three occasions and winning the electoral vote in 1884 and 1892.  He remains the only US president to serve two non-consecutive terms in office and one of only three candidates to win the popular vote three times.  He is also the only former sheriff to go on to become US president.

It is because of Cleveland’s willingness to look Morrisey and Gaffney in their faces and pull the lever that dropped the platform from beneath their feet that I have more respect for him than I do for Rick Perry.  In his years as governor of Texas, Mr Perry has signed death warrants that have consigned 234 people to death.  So far from performing these executions himself, Mr Perry seems never even to have attended an execution.  And while Cleveland could acknowledge that performing an execution was one of the “obnoxious and degrading tasks attached to his office,” Mr Perry claims to regard signing death warrants as a carefree exercise.  This difference alone shows that Grover Cleveland lived in a different moral universe from the one Rick Perry inhabits.  People whose imaginations are shaped by television and video games may think of indifference to human life as a form of strength, and of personal encounters with the object of one’s violent behavior as unimportant.  Such views would likely have struck a man of Cleveland’s sort as a sign of profound moral and spiritual immaturity.  Granted, executions were far more routine in America in the nineteenth century than they are today, even in a death-penalty-happy state like Texas.  But does the fact that we execute fewer people today mean that we take the matter of life and death more seriously than the Americans of Cleveland’s day took it?  Or does it simply mean that other features of our society have interfered with the smooth functioning of the “machinery of death“?

“Among the loneliest creatures in the universe”

The other day, Eve Tushnet posted a link to this post by Mark P. Shea.  Shea is responding to remarks by Watergate figure Charles Colson, who was in turn discussing the question of whether puppets Bert and Ernie, of Sesame Street fame, should marry each other.  Colson approvingly quotes the official statement from the producers of Sesame Street to the effect that as puppets, Bert and Ernie “do not have a sexual orientation.”  He sees a deeper significance in the idea that Bert and Ernie’s close friendship suggests a homosexual relationship, and quotes blogger Alyssa Rosenberg’s remarks about it.  Colson’s quote from Rosenberg included the beginning and middle of this paragraph:

And more to the point, I think it’s actively unhelpful to gay and straight men alike to perpetuate the idea that all same-sex roommates, be they puppet or human, must necessarily be a gay couple. Having close, affectionate friendships with another man doesn’t mean that you two are sleeping together, just as liking fashion doesn’t automatically flip a switch on your sexual orientation and make you only interested in dudes. Such assumptions narrow the aperture of what we understand as heterosexual masculinity in a really strange way. As much as I write about how narrow depictions of women can be in pop culture, depictions of men may end up being more positive, but that doesn’t mean they’re less limiting.

To this, Shea adds that the idea that friendship between people of the same sex must somehow represent “sublimated homosexuality” is:

just a lie and the incredible poverty that is foisted on our culture (and on men in particular, who are starving to death for lack of male friendships) is one of the great famines of our time. Some of the most nourishing relationships I have ever known have been friendships–with both men and women. American men are among the loneliest creatures in the universe, not for lack of women, but for lack of friends.

Shea’s blog is called “Catholic and Enjoying It!”   Like Shea, Tushnet is a tradition-minded Roman Catholic; the tagline of her blog is “Conservatism reborn in twisted sisterhood.”  Unlike Shea, she is an uncloseted (though celibate) lesbian.  Colson is neither a Roman Catholic nor a lesbian, but his ardent Baptist faith comes with quite an old-fashioned view of sexual morality.  So Shea’s comment puts them in an awkward spot.  If the label “gay” holds such terror for American men that they would rather take a place “among the loneliest creatures in the universe” than risk being identified with it, surely it is an urgent matter to drain that label of its terror.  To be sure, this is no easy matter.  For decades now, legions of people have been laboring mightily to destigmatize homosexuality, and the work isn’t half done yet.  But it is the obvious answer.  Indeed, the only conclusion I can draw from Shea’s remark is that heterosexual men have a vital stake in the movement to gain full social equality for sexual minorities.  This conclusion, however, is not one that Christian conservatives such as Shea, Colson, and Tushnet can accept, and so they are left facing a vastly complex, perhaps hopelessly complex, problem.  I believe their hearts are in the right place, and so I feel sorry for them.