What matters in life

Here are the last three sentences of an opinion piece that appeared in Time magazine some time ago:

It is a pathetic little second-rate substitute for reality, a pitiable flight from life. As such it deserves fairness, compassion, understanding and, when possible, treatment. But it deserves no encouragement, no glamorization, no rationalization, no fake status as minority martyrdom, no sophistry about simple differences in taste—and, above all, no pretense that it is anything but a pernicious sickness.

To what does that “it” refer?  By themselves, these sentences leave open several possibilities.  They sound very much like the more strident remarks that aggressive atheists make about religion.  For their part, believers have been known to reply to these remarks in kind.  People on each side of that dispute tend to build their favorite presuppositions into the way they use words like “reality” and “life,” so that each could accuse the other of offering “a pathetic little second-rate substitute for reality, a pitiable flight from life.”  At the same time, leftists have been known to write this way about right-wingers, especially when the right-wingers in question belong to groups that the leftists see as victims of unjust policies that the right supports.  The phrase “false consciousness” may not be much in favor any longer, but there are other ways of accusing people of being deluded about what political movements are in their own best interests.  The line about “fake status as minority martyrdom” sounds just like the sort of thing left-of-center Americans are often provoked to say when their least favorite political figures claim to have suffered unfair treatment at the hands of a “liberal elite.”  Again, it is not uncommon for right-wingers to respond in kind, presenting leftism as a mental illness and a sign of self-loathing.
Indeed, just about any activity or belief of which a speaker disapproves could be attacked in the words that Time magazine used above.  If the speaker is absorbed in a rival activity or committed to an opposing system of belief, it may seem obvious that Time’s description is perfectly accurate.  For example, when I was in graduate school, I was entirely immersed in the study of ancient Greece and Rome.  For years, my fellow students and I averaged between 80 and 100 hours a week studying the languages, literatures, histories, and material remains of classical antiquity.  We socialized primarily with each other, and modeled ourselves on our professors.  So by the middle of our grad school years, we came to take it for granted that every walk of life that did not advance classical learning was a waste of time, a poor consolation for people who couldn’t make it in classics.  We had entered graduate school with a more balanced view, and by the time we entered the job market most of us were on our way back to that balance, but for most of us there was a period starting sometime around the end of the first year and ending sometime before the fourth year when it was hard to take anything outside of classics at all seriously.  I suspect we would all have nodded in agreement if someone had described, say, a career in the insurance industry as “a pathetic little second-rate substitute for reality,” etc.
Of course, classical scholarship is not one of the most powerful or celebrated professions in the twenty-first century.  So once a person emerges from the odd little world of a graduate program in classics, that person is unlikely to continue taking it for granted that classicists are the only successful professionals.  Other fields enjoy far more prestige; their practitioners are in much greater danger of becoming unalterably attached to the idea that they and their colleagues have a monopoly on wisdom.  Businesspeople, scientists, and medical doctors seem to number among their ranks many people whose intellectual development is permanently stunted in the condition of the second-year grad student.  For these individuals, the boundaries of “reality” and “life” are the boundaries of their disciplines, and anything outside those boundaries is a “substitute for reality” and a “flight from life,” and people who dwell out there are sad cases to be taken gently, but firmly, in hand.
Political and religious beliefs are even more likely to swallow up a person’s conception of success in life than is a sense of the importance of one’s profession, and certainly less likely to spit that conception back out into the open air.  So it is small wonder that left and right, atheist and believer might see each other in the light that Time magazine describes.  For each ideological group, it seems obvious which things truly matter in life, and people who are uninterested in those things and devoted to others must therefore be fools who are suffering from some peculiar sort of disorientation.  Any influence such fools have on those around them is, of course, dangerous and requires action to reassert the more wholesome values.
So, these sentences represent, on the one hand, a content-free insult, but on the other hand, the writer’s confession of faith.  What he was attacking as unreal, unliving, and pernicious was the direct negation of what he thought of as most plainly real, lively, and wholesome.  To find out who Time magazine was insulting, turn to the original article (which I found here).

“If politicians would stop arguing, they could work together– to get things done! Doesn’t matter what! Just, you know– things!”

The other day, The Monkey Cage featured a post called “Why Does Congress Flail?  Voters Reward Positions More than Success.”  As the title implies, the premise of the post was that the US Congress has been relatively ineffective in passing major legislation of late because its members know that their jobs depend, not on passing bills into law, but on striking poses that resonate with the ideological leanings of their constituents.

In a comment, I challenged the first part of the premise.  In the last ten years, the US Congress has in fact passed a great deal of major legislation, changing American government and American life far more profoundly than in almost any other epoch of US history.  Among this legislation are bills funding several wars, permitting the president to wiretap virtually anyone he likes, maintaining indefinite detention of persons accused of terrorism, creating the Department of Homeland Security, formalizing a variety of terrorism watch-lists, adding a prescription drug benefit to Medicare, requiring citizens to buy health insurance, dramatically expanding the federal role in education, handing a trillion dollars of taxpayers’ money to Goldman Sachs and companies friendly to it, and repeatedly cutting taxes on the largest incomes, to name just a few measures with vast ramifications.  It’s true that members of Congress rarely cite these achievements in their reelection bids.  That isn’t because they are unimportant, but because none of them is at all popular.  If these acts constitute “success,” then it is no wonder voters don’t reward it.  Rather, it is a mystery that voters don’t punish such “success” by deserting both the Republican and Democratic parties, and replacing their entire set of political leaders.

Yet one still hears Americans who wish to be regarded as “moderate,” or “centrist,” or “responsible” say that top elected officials in Washington should stop battling with each other so that they can be more effective at “getting things done.”  I’ve found that the people who say this seem puzzled when I point out how much has “gotten done” in Washington since 2001.  What seems equally difficult for them to grasp is the point Tom Tomorrow makes in this cartoon:

The way out of philosophy runs through philosophy

There’s a phrase I’ve been thinking about for years, ever since I read it somewhere or other in Freud: “the moderate misery required for productive work.”  It struck me as plausible; someone who isn’t miserable at all is unlikely to settle willingly into the tedious, repetitive tasks that productive work often involves, while someone who is deeply miserable is unlikely to tolerate such tasks long enough to complete them.  If blogging counts as productive work, I myself may recently have represented a case in point.  Throughout the summer and into the autumn, I wasn’t miserable at all, and I barely posted a thing.  Then I caught a cold, and I posted daily for a week or so.  If I’m typical of bloggers in this respect, maybe I could also claim to have something in common with a philosopher.  Samuel Johnson’s old schoolfellow Oliver Edwards once quipped that he had tried to become a philosopher, but couldn’t manage it.  The cause of his failure?  “Cheerfulness was always breaking in.”

One item I kept meaning to post notes on when cheerfulness was distracting me from the blog was a magazine article about Johnson’s contemporary, David Hume.  Hume, of course, was a philosopher; indeed, many would argue that he was “the most important philosopher ever to write in English.”  Contrary to what Johnson’s circle might have suggested, however, Hume was suspected of cheerfulness on many occasions.  The article I’ve kept meaning to note is by Hume scholar and anti-nationalist Donald W. Livingston; despite the radicalism of Livingston’s politics (his avowed goal is to dissolve the United States of America in order to replace it with communities built on a “human scale”), in this article he praises Hume as “The First Conservative.”  Hume’s conservatism, in Livingston’s view, comes not only from his recognition of the fact that oversized political units such as nation-states and continental empires are inherently degrading to individuals and destructive of life-giving traditions, but also from his wariness towards the philosophical enterprise.  Hume saw philosophy as a necessary endeavor, not because it was the road to any particular truths, but because philosophical practice alone could cure the social and psychological maladies that the influence of philosophy had engendered in the West.

This is the sort of view that we sometimes associate with Ludwig Wittgenstein; so, it’s easy to find books and articles with titles like “The End of Philosophy” and “Is Philosophy Dead?” that focus on Wittgenstein.  But Livingston demonstrates that Hume, writing more than a century and a half before Wittgenstein, had made just such an argument.  Livingston’s discussion of Hume’s Treatise of Human Nature (first published in 1739-1740) is worth quoting at length:

Hume forged a distinction in his first work, A Treatise of Human Nature (1739-40), between “true” and “false” philosophy.  The philosophical act of thought has three constituents. First, it is inquiry that seeks an unconditioned grasp of the nature of reality. The philosophical question takes the form: “What ultimately is X?” Second, in answering such questions the philosopher is only guided by his autonomous reason. He cannot begin by assuming the truth of what the poets, priests, or founders of states have said. To do so would be to make philosophy the handmaiden of religion, politics, or tradition. Third, philosophical inquiry, aiming to grasp the ultimate nature of things and guided by autonomous reason, has a title to dominion. As Plato famously said, philosophers should be kings.

Yet Hume discovered that the principles of ultimacy, autonomy, and dominion, though essential to the philosophical act, are incoherent with human nature and cannot constitute an inquiry of any kind.  If consistently pursued, they entail total skepticism and nihilism. Philosophers do not end in total skepticism, but only because they unknowingly smuggle in their favorite beliefs from the prejudices of custom, passing them off as the work of a pure, neutral reason. Hume calls this “false philosophy” because the end of philosophy is self-knowledge, not self-deception.

The “true philosopher” is one who consistently follows the traditional conception of philosophy to the bitter end and experiences the dark night of utter nihilism. In this condition all argument and theory is reduced to silence. Through this existential silence and despair the philosopher can notice for the first time that radiant world of pre-reflectively received common life which he had known all along through participation, but which was willfully ignored by the hubris of philosophical reflection.

It is to this formerly disowned part of experience that he now seeks to return. Yet he also recognizes that it was the philosophic act that brought him to this awareness, so he cannot abandon inquiry into ultimate reality, as the ancient Pyrrhonian skeptics and their postmodern progeny try to do. Rather he reforms it in the light of this painfully acquired new knowledge.

What must be given up is the autonomy principle. Whereas the false philosopher had considered the totality of pre-reflectively received common life to be false unless certified by the philosopher’s autonomous reason, the true philosopher now presumes the totality of common life to be true. Inquiry thus takes on a different task. Any belief within the inherited order of common life can be criticized in the light of other more deeply established beliefs. These in turn can be criticized in the same way. And so Hume defines “true philosophy” as “reflections on common life methodized and corrected.”

By common life Hume does not mean what Thomas Paine or Thomas Reid meant by “common sense,” namely a privileged access to knowledge independent of critical reflection; this would be just another form of “false philosophy.” “Common life” refers to the totality of beliefs and practices acquired not by self-conscious reflection, propositions, argument, or theories but through pre-reflective participation in custom and tradition. We learn to speak English by simply speaking it under the guidance of social authorities. After acquiring sufficient skill, we can abstract and reflect on the rules of syntax, semantics, and grammar that are internal to it and form judgments as to excellence in spoken and written English.  But we do not first learn these rules and then apply them as a condition of speaking the language. Knowledge by participation, custom, tradition, habit, and prejudice is primordial and is presupposed by knowledge gained from reflection.

The error of philosophy, as traditionally conceived—and especially modern philosophy—is to think that abstract rules or ideals gained from reflection are by themselves sufficient to guide conduct and belief. This is not to say abstract rules and ideals are not needed in critical thinking—they are—but only that they cannot stand on their own. They are abstractions or stylizations from common life; and, as abstractions, are indeterminate unless interpreted by the background prejudices of custom and tradition. Hume follows Cicero in saying that “custom is the great guide of life.” But custom understood as “methodized and corrected” by loyal and skillful participants.

The distinction between true and false philosophy is like the distinction between valid and invalid inferences in logic or between scientific and unscientific thinking. A piece of thinking can be “scientific”—i.e., arrived at in the right way—but contain a false conclusion. Likewise, an argument can be valid, in that the conclusion logically follows from the premises on pain of contradiction, even if all propositions in the argument are false. Neither logically valid nor scientific thinking can guarantee truth; nor can “true philosophy.” It cannot tell us whether God exists, or whether morals are objective or what time is. These must be settled, if at all, by arguments within common life.

True philosophy is merely the right way for the philosophical impulse to operate when exploring these questions. The alternative is either utter nihilism (and the end of philosophical inquiry) or the corruptions of false philosophy. True philosophy merely guarantees that we will be free from those corruptions.

This is rather like one of Friedrich Nietzsche’s parables, from Also Sprach Zarathustra (1883-1885).  Nietzsche’s Zarathustra preaches that the superman must become a camel, so as to bear the heaviest of all weights, which is the humiliation that comes when one discovers the extent of one’s ignorance, and the commitment to enlighten that ignorance; that he must then put the camel aside and become a lion, so that he may slay the dragon of “Thou-Shalt” and undertake to discover his own morality; and that at the last he must become a child, so that he may put that struggle behind him and be ready to meet new challenges, not as reenactments of his past triumphs, but on their own terms.  According to Livingston, Hume, like Nietzsche, sees the uneducated European as a half-formed philosopher, and believes that with a complete philosophical education s/he can become something entirely different from a philosopher:


Mr O’s “anti-nuclear imperialism”

Let me tell you about a better way, a way that protects the purity of our precious bodily fluids.

The late September issue of Counterpunch (available to subscribers here; the newsletter’s website is here) includes a fine article by Darwin Bond-Graham titled “The Obama Administration’s Nuclear Weapons Surge.”  While Mr O has made many remarks declaring that nuclear weapons are bad and the world would be better off without them, he has in fact “worked vigorously to commit the nation to a multi-hundred-billion-dollar reinvestment in nuclear weapons, mapped out over the next three decades.”  Bond-Graham analyzes the New START agreement between the USA and Russia.  Though the publicity surrounding New START presented it as an arms-reduction treaty, Bond-Graham contends that it is nothing of the kind.  “On balance, the nominal reductions in nuclear weapons required by New START are insignificant when compared to the multibillion-dollar nuclear (and strategic non-nuclear) weapons programs committed to in the treaty’s text.”  Indeed, Bond-Graham classifies New START as an “arms-affirmation treaty.”  Mr O and his allies in the upper echelons of the congressional Democratic leadership were able to market New START as a disarmament agreement and to enlist the support of Americans who usually oppose nuclear weapons, even though “the treaty does not actually require the destruction of a single nuclear warhead.”  Bond-Graham also goes into depth on various other programs through which Mr O has managed to increase spending on nuclear weapons, to reorient the USA’s nuclear weapons programs towards potential use in conflict, and to strip away inhibitions against nuclear first strikes by the USA.

For Bond-Graham, Mr O’s anti-nuclear public statements not only represent a rhetorical device to “neutralize” the “anti-nuclear and antiwar groups that so effectively exposed [George W.] Bush’s plans” to pursue policies similar to those of the current administration, but also constitute the foundation of a strategic orientation that Bond-Graham dubs “anti-nuclear imperialism.”  This orientation, ostensibly based on abhorrence of nuclear weapons, in fact promotes the development, maintenance, and deployment of such weapons.  Remember the claims that the Bush-Cheney administration made about Saddam Hussein’s alleged “Weapons of Mass Destruction” programs in 2002-2003, and the meaning of the phrase “anti-nuclear imperialism” becomes all too clear.

The contextualization fairy

Recently, John Holbo posted two items (here and here) on Crooked Timber about something odd in American politics.  Right-wing politicians in the USA quite often make public statements that would, if taken at face value, suggest that they are far more extreme in their views than they in fact are.  So, Professor Holbo finds remarks from Texas governor Rick Perry which, taken literally, would imply that Mr Perry thinks that Texas should secede from the USA, that all federal programs established since 1900 should be abolished, indeed that there should be no government at all.  Mr Perry obviously does not believe any of those things, so obviously that only his committed opponents try to take him to task for making such extreme remarks.  This is not unique to Mr Perry, but is a common pattern among right-wing US politicians.

What makes this so odd is that, while it is common for right-wing American politicians to exaggerate the radicalism of their views and for the public to realize that this is what they are doing, Professor Holbo can find no examples of their left-leaning counterparts doing the same thing.  A Democratic or leftist candidate who makes a radical-sounding statement likely means that statement to be taken at face value, and it certainly will be taken at face value by most observers.

Many commentators on American politics explain the right-wingers’ habit of making extreme-sounding statements for which they do not expect to be held responsible as an effort to move the “Overton Window.”  The Overton Window, named for the late Joseph P. Overton, is the range of ideas that those who hold sway in a given political culture consider acceptable at a particular time.  Only ideas within the window are likely to be put into effect.  The window shifts back and forth, as some ideas that had once seemed outlandish begin to seem mainstream, while other ideas that had once seemed mainstream begin to seem outlandish.

Key to the Overton Window is the idea of contextualization.  The idea of devolving Medicare, the program that ensures that most Americans over the age of 65 will be able to pay for health care, to the states may seem outlandish to many in the USA, but compared to the idea of large states seceding from the Union it is quite moderate.  The idea of shifting the revenues of Social Security, the program that provides a guaranteed income to most Americans over the age of 65, from current benefits to private savings accounts may seem outlandish to many in the USA, but compared with the idea of abolishing the entire welfare state it is quite moderate.  Other policies favored by powerful interests on the right end of the political spectrum may also seem outlandish, but compared with anarchism they too are quite moderate.  So, within the context of the extreme remarks for which they are not called to account, rightists can gain a hearing for policies which they do seriously advocate.


Brian Barder is a nice guy

Political blogging is not generally regarded as an activity that brings courtesy to the fore, but retired UK diplomat Brian Barder never fails to show good manners.  Though most of the topics he discusses are outside my usual circle of interests, I read him regularly, since it is such a pleasure to see politeness at work.

For example, the other day Brian Barder* posted a proposal about reforming the UK constitution. Brian Barder has considerable expertise on this subject, and his proposal is sufficiently close to his heart that he has been working to promote it for some years.  My knowledge of the UK constitution is limited to what I picked up during the years I took The Economist, and as someone who does not live in the UK my stake in its reform is close to nil.  Yet I took the liberty of posting comments (here and here) in which I expressed skepticism about the practical aspects of his plan.  Brian Barder would have been perfectly within his rights to ignore my uninformed remarks, or even to dismiss them icily, yet in fact he took the time to provide detailed responses to each of them.  In fact, he even emailed me to make sure that I knew he had done so.  Such generosity is not to be forgotten.

*I’d call him “Mr Barder,” but that isn’t actually his name.  He holds a knighthood, and so the proper courtesy title for him is “Sir Brian.”  I cannot bring myself to refer to any living person in that fashion; to me, it suggests only Monty Python.  So the only respectful way I can name him is as “Brian Barder.”  This is a shame, since Brian Barder is himself so scrupulous a user of courtesy titles that he titled his condemnation of the Oslo massacre “There are no lessons to learn from Mr Breivik.”  If Brian Barder can bring himself to give the correct title even to the murderer of dozens of helpless innocents, it seems churlish of me to withhold his title from him, yet I must, I must.

Some interesting comments by Michael Peachin about a new book on the Emperor Claudius

Proclaiming Claudius Emperor, by Lawrence Alma-Tadema

As a subscriber to Classical Journal, I regularly receive emailed reviews of new scholarly books concerning ancient Greece and Rome.  The other day, for example, they sent me Michael Peachin’s review of Claudius Caesar: Image and Power in the Early Roman Empire, by Josiah Osgood (Cambridge University Press, 2010).  The only other notice I’d seen of the book was a drearily dutiful one in The Bryn Mawr Classical Review, so I was surprised that Peachin found some exciting points in the book.  I’ll quote two of these points:

Several recent accounts of Roman emperors have sailed off on a new tack. Instead of attempting a traditional biographical interpretation of the man, and thereby also a chronicle of his reign, each of these has sought to present an emperor on his own terms, and/or to view him as he was perceived by certain groups of contemporaries (other than the elite authors, who usually monopolize discussion). Thus, Caligula was not out of his mind; he simply had no taste for playing republic, when the reality was despotism; and so, he fashioned himself overtly as a tyrant, regardless of the consequences – or perhaps precisely to elicit certain ones of those (A. Winterling, Caligula: A Biography [Berkeley, 2011]).


When I was in graduate school, I took a seminar on Roman history in which the professor horrified about half the class by spending a day arguing that Caligula was probably not a lunatic.  A few of my classmates were committed to the view of the third emperor presented in the ancient historical texts, and were appalled to hear a revision of that view; the others were committed to the idea that the only sort of history worth doing was social history that focused on the most numerous groups in a society, and so were appalled that we were spending so much time on the question of one man’s mental health.  I was not in either of those groups, but loved the day and have been defending Caligula ever since.  By the way, there’s a fine review of Winterling’s book in September’s New Criterion.  I recommend it to the general reader.

Peachin makes a point that I found especially fascinating:

Augustus, in fine, had played his part well; but as Osgood aptly demonstrates, he fated all the various players in the sequel to write their own scripts as they went. In any case, Osgood argues that Claudius quite actively tried to shape his own time as emperor, and that in doing so, he contributed materially to the development of the imperial ‘system.’ As we observe this particular emperor at work, we are also being nudged slightly away from Fergus Millar’s picture of a more passive, and perhaps generic, sort of monarch (The Emperor in the Roman World [Ithaca, 1977]): “…who the emperor was mattered” [136]. Still, Osgood sees quite clearly that Claudius (or any emperor) was indeed only one person; and hence, the princeps’ direct involvement with his subjects was perforce limited. Thus, when an emperor did choose to intervene, the event was so momentous as to carry an aura of the divine. That said, Claudius was no lone actor. We are reminded, throughout, that “…much of this emperor’s image, like any other’s, was constructed in dialogue with his subjects” (317).

So, it was precisely because the emperor’s position was inherently weak that he inspired awe in his subjects.  This is just the sort of paradox I can never resist.

Many readers will be familiar with the theory that historian Arnaldo Momigliano developed and that Robert Graves popularized in his novels about Claudius.  Under this theory, Claudius wanted to phase out the principate and restore the old Republic.  Peachin explains Osgood’s view of this theory with admirable concision:

Following Momigliano’s observations (Claudius: the Emperor and His Achievement [Oxford, 1934]), Osgood stresses the fact that Augustus’ uneasy amalgam of republic and empire remained a befuddling puzzle for Claudius (indeed, for every emperor). In particular, the quasi-retention of a republican state meant that a new imperial system of government could not be crafted with anything even approaching clarity, or in any detail. Thus, to start at the start, when Gaius [a.k.a. Caligula] was murdered, and had not indicated a successor, a conclusively ‘proper’ or ‘constitutional’ way forward was nowhere to be discovered. That notwithstanding, Claudius was quickly on the throne; but then, the awkward facts of his accession, not to mention the earlier vituperation of him by members of the Augustan house (and others), seriously undercut his authority. Attempting to counter such hindrances, and just generally in his zeal to rule as he found appropriate, Claudius was too fastidious. The result was a nasty paradox: “The loftier the goals the emperor set for his administration, the more likely he was to fail, and to open himself to allegations of incompetency, or even corruption. Yet precisely to try to win loyalty and increase his prestige, Claudius had to set loftier goals than those of Tiberius, even those of Caligula” (189).

Considering that the written law in Rome in AD 41 was predicated on the idea that the Republic was still functioning, and that Claudius owed the principate to the very group of men who had just violently murdered his predecessor, it would have been quite a challenge for him to find a way to present his accession as legitimate without appealing to the idea of a restored Republic.  In no position to prosecute the assassins of Caligula, Claudius could only appeal to the right of tyrannicide, and thus evoke the two Brutuses, one who according to legend struck against the Tarquins in order to end the monarchy and establish the Republic, and the other who struck against Julius Caesar the Dictator in an attempt to prevent a new monarchy from ending the Republic.  If there were people who took this forced imposture at face value, one can hardly blame Claudius.


Did astrology originate in cities?

I wonder if the first astrologers were city-dwellers.  True, archeologists have found evidence that people who lived before the rise of cities paid close attention to the orbit of the Moon and identified constellations, and have argued that the orientations of temples and other religious structures from those days suggest that they attached a religious significance to the movements of heavenly bodies.  Those activities are hardly surprising; farmers need a calendar to plan their year, as hunter-gatherers also need to plan their expeditions for times when game will be relatively plentiful and fruit ripe for the picking.  Still, it might not be too much of a stretch to look at a society that invests heavily in maintaining and publicizing its calendar and to see a suggestion of something like what the western branch of organized Christianity used to call “natural astrology,” a set of ideas about ways in which heavenly bodies might influence the earth’s weather and various medical phenomena related to the transmission of disease.

Quite distinct from natural astrology are the various studies to which the Western Church used to refer as “judicial astrology.”  That’s the part that includes horoscopes, sun signs, and the like.  The difference matters when considering the origins of astrology; we have very ancient documents relating to the movements of heavenly bodies that seem to have some special significance, documents that predate the earliest references to judicial astrology by centuries.  So I’ll keep to these terms.

It is sometimes said that our earliest evidence of judicial astrology comes from Mesopotamia, but that is misleading.  The nation-state didn’t exist in those days; Ur and Lagash and Akkad and Babylon and the other urban centers that rose and fell in that region interacted in a variety of ways with the political and economic systems of the countryside around them, but in other ways remained quite distinct from it.  It is in such cities, not in any territory called “Mesopotamia,” that we find the first documents describing judicial astrology.

If astrology did arise in cities, it arose in a social environment where markets were familiar.  Its entire history would have taken place amid money, contracts, and production for exchange.  That calls into question the assumptions that we discussed last year when this xkcd appeared:

[xkcd cartoon; mouseover text: “Not to be confused with ‘selling this stuff to OTHER people who think it works,’ which corporate accountants and actuaries have zero problems with.”]

Some people fall into the assumption that, because markets promote something called “rationality,” they must therefore favor every form of reason and disfavor every form of unreason.  In fact, the rationality that markets foster is of a very narrow sort.  A month after our discussion, we noted that Shikha Dalmia had put it very well: “Markets don’t reward merit, they reward value.”  Dalmia summarizes the views of economist Friedrich Hayek:

In a functioning market, Hayek insisted, financial compensation depends not on someone’s innate gifts or moral character. Nor even on the originality or technological brilliance of their products. Nor, for that matter, on the effort that goes into producing them. The sole and only issue is a product’s value to others. Compare an innovation as incredibly mundane as a new plastic lid for paint cans with a whiz-bang, new computer chip. The painter could become just as rich as the computer whiz so long as the savings from spills that the lid offers are as great as the productivity gains from the chip. It matters not a whit that the lid maker is a drunk, wife-beating, out-of-work painter who stumbled upon this idea through pure serendipity when he tripped over a can of paint. Or that the computer whiz is a morally stellar Ph.D. who spent years perfecting his chip.

As markets are neutral as to the virtue or vice of economic actors, so too are they neutral as to the truth or falsity of the ideas that those actors bring as products for sale.  If falsehoods are in demand, falsehoods will sell; if truths are not in demand, their bearers will go begging.  The mouseover text for the xkcd represents a nod to this fact, and an attempt to wriggle out of its implications: “Not to be confused with ‘selling this stuff to OTHER people who think it works,’ which corporate accountants and actuaries have zero problems with.”  That won’t do, since it assumes that we can assign a fixed meaning to the expression “works.”  An investment advisor who believes in astrology may not be any likelier than other advisors to beat the market, but s/he may very well use that belief to “make a killing,” if s/he attracts clients who strongly value such a belief.  In that case, astrology would not “work” in the sense that quantitative analysts officially recognize, but it would make the advisor every bit as rich as it would if it did meet their definitions of success.  As for whether it makes the clients rich, well, Fred Schwed answered that one in 1940:

Once in the dear dead days beyond recall, an out-of-town visitor was being shown the wonders of the New York financial district.  When the party arrived at the Battery, one of his guides indicated some handsome ships riding at anchor.  He said, “Look, these are the bankers’ and brokers’ yachts.”  “Where are the customers’ yachts?” asked the naive visitor.

Clearly, markets have not dissolved belief in astrology, any more than the continued non-existence of the customers’ yachts has discouraged people from going to brokers and bankers.  If the practice of judicial astrology first arose in cities, it may in fact be a by-product of market society.  Perhaps we might find that judicial astrology began, not simply as a more elaborate version of a natural astrology that had long been a feature of rural life, but as an attempt to understand market interactions and the power of the market.  In that case, it would qualify as a school of economics.  One may wonder whether judicial astrology would be the most absurd such school in practice today.

The Atlantic, October 2011

The current issue of The Atlantic contains four pieces on which I took notes.  All four of them had to do with masculinity in one way or another.

Historian Taylor Branch contributes an article about college sports in the USA.  Non-USA types may not be aware that colleges and universities in the United States operate sports franchises, some of which have a mass following and an extremely lucrative financial aspect.  The athletes are not paid for their participation in this multibillion-dollar industry; they are not even compensated for injuries they receive in the course of it.  Branch outlines the story of how this preposterously unfair system came to exist, and considers several recent developments that may bring it to an end.  Athletes are symbols of masculinity in the USA, as elsewhere; the amateur ideal may once have been part of a concept of masculinity that some upper-class Americans cherished, but nowadays even volunteerism is often justified in terms of its resume-building potential.  Moneymaking has become the masculine activity par excellence.  So the National Collegiate Athletic Association’s (NCAA’s) model of the unpaid “student-athlete” is a bit of an anachronism.

A piece called “Sex and the Married Politician” includes several references to the fall of New York Congressman Anthony Weiner.  Mr Weiner resigned his seat in the US House of Representatives shortly after it emerged that he had posted a picture of his genitalia on Twitter.  It strikes me as misleading to call this story a “sex scandal.”  Since everything on Twitter is public, Mr Weiner’s offense was not illicit sexual relations, but indecent exposure.  As such, he is in a league with longtime Friendsville, Maryland mayor Spencer Schlosnagle, who in the mid-1990s pled guilty to charges stemming from several incidents when he exposed himself to passersby on the highway.  Mr Schlosnagle paid a fine, went to a psychiatrist, and was reelected.  He continues in office today.  I think that the case of Mr Schlosnagle shows a community and a political system with a rational attitude towards mental illness.  Mr Schlosnagle initially tried to deny the charges against him; when the prosecution made such denials impossible, he accepted punishment and sought counseling, thus reducing the likelihood that he will reoffend.  Since his behavior was a real nuisance, the prosecution was rational.  On the other hand, it was only a nuisance, not a serious threat to anyone in particular; therefore, the voters’ decision to reelect him once he had shown that he was addressing his mental health problems was also rational.   Schlosnagle disclosed that he had suffered sexual abuse as a child, thus disowning any model of masculinity that would require him to project an image of himself as invulnerable or invincible.  The description of Weiner as the main figure in a “sex scandal,” by contrast, both obscures the fact that he doesn’t seem to have had any sexual contact with anyone and presents him as a menacingly potent figure.  
I suppose it makes sense that he would have an easier time playing along with that image of himself than with presenting himself as a sick man compelled to behave in a somewhat annoying fashion.

The Library of America has finally devoted a volume to Ambrose Bierce, and this issue includes an admiring review of Bierce’s work and of the Library’s edition.  I liked this sentence: “Bierce, after all, has always been best known for being undeservedly unknown.”  Reviewer Benjamin Schwarz also makes some good points about Bierce’s lapidary style, such as this:

Bierce’s seminal contribution to American letters is that “sharp-edged and flexible style, like the ribbon of a wound-up steel tape-measure,” as Edmund Wilson perfectly defined it. But that style emerged from Bierce’s compulsion to reveal a truth that remains unacceptable—or only selectively acceptable—today. It’s all very nice to decry the horror of war, but to Bierce its obscenity and its meaninglessness were merely integral to those of life. Bierce’s friend the editor Bailey Millard explained why all the leading publishers of the day rejected Bierce’s war fiction: they “admitted the purity of his diction and the magic of his haunting power, but the stories were regarded as revolting.” Understandably so, given what Bierce knew to be our delusional and self-serving tendencies.

Schwarz approves of Bierce’s flatly declarative style, especially as regards the US Civil War in which Bierce fought with distinction.  He quotes Walt Whitman’s remark that “The real war will never get into the books,” then says: “And in fact, excepting Bierce’s work, it didn’t.”  That’s high praise indeed; Bierce, alone among the tens of thousands of authors who have published books on that conflict, succeeded in putting “the real war” into his books.  I’ve posted previously about Bierce’s characteristic pose as The Man Without Illusions; evidently this is a pose Schwarz accepts at face value, and a form of masculinity he values highly.

B. R. Myers contributes a brief review essay on Australian crime fiction.  He quotes this exchange from one such novel:

“I hear someone punched out that cunt Derry Callahan,” he said. “Stole a can of dog food too. You blokes investigatin that?”

Cashin frowned. “That right? No complaint that I know of. When it happens, we’ll pull out all the stops. Door-to-door. Manhunt.”

“Let’s see your hand.”

“Let’s see your dick.”

“C’mon. Hiding somethin?”

“Fuck off.”

Bern laughed, delighted, punched Cashin’s upper arm. “You fuckin violent bastard.”

Upon which Myers comments, “I grinned right along with that, as if I hadn’t left high school hoping never to have to hear such exchanges again.”  Indeed, talk like that is common among males of many ages and nationalities, and I can sympathize both with Myers’ wish to escape from it and with his admiration for that rather well-crafted specimen of it.

Friends Journal, September 2011

The September 2011 issue of Friends Journal includes a couple of brief pieces I wanted to note.

Geoffrey Kaiser writes of “Three Kinds of Singing in Meeting.”  Kaiser tells of an old document he found when he was visiting Quaker meetings in New England in 1980.  This document was an official statement that a monthly meeting* issued in 1675.  It classified singing in meeting for worship** into three categories: “Serious Sighing,” “Sensible Groaning,” and “Reverent Singing.”

Erik Lehtinen, an Episcopal deacon at the time of writing, explains in “True Confessions of a Closet Quaker” that he has for some time been sneaking out of his church to attend a Friends*** meeting, and that he has decided to leave the Episcopal church and join the Quakers.  Lehtinen writes that “Many seekers probably start by reading and being inspired by The Journal of George Fox.****”  Seekers who are graduates of an Anglican seminary may start that way, but I very much doubt that Fox’s journals, written as they were in haste, in the seventeenth century, and by a man whose ideas are challenging to moderns in many ways, are very attractive to a significant percentage of any other population.  Still, it is useful to read Lehtinen’s description of Fox as “a fellow Anglican.”  Fox spent his youth in the Church of England, and never quite admitted that he had left that communion.

*”Monthly meeting” is a Quakerese expression that other Christian traditions might translate as “parish” or “local church”

**”Meeting for worship” is also Quakerese; one might say, “worship service”

***”Friends” is Quakerese for “Quakers.”  It’s a term that Quakers themselves find confusing, or claim to find confusing; they sometimes make a show of saying “Friends, big ‘F,’ and friends, little ‘f,’” to highlight the fact that Friends can have friends who aren’t Friends.

****George Fox was the founder of Quakerism.  There are people who think that lines like “Friends can have friends who aren’t Friends” are hilarious; such people have also been known to look for opportunities to make puns about foxes.  So if you are thinking of joining with the Quakers, don’t say I didn’t warn you.