In which I demonstrate that I am the world’s nerdiest nerd

In a recent email exchange with the cofounders of this blog, known here as VThunderlad and Lefalcon, I shared some thoughts about Star Trek, including a synopsis of an idea for a new Star Trek movie.  Find the relevant bits below the jump.


The unliked and uninjured

Earlier this week, Slate’s Mark Joseph Stern wrote a piece asking incredulously “Do Anti-Gay Christians Really Face Employment Discrimination?”  Mr Stern cites blog posts by Princeton Professor Robert George and The American Conservative’s always interesting, often apoplectic blogger Rod Dreher about a survey in which investment bank JP Morgan-Chase recently inquired into its employees’ positions with regard to the rights of sexual minorities.  Finding the survey a perfectly routine bit of corporate boilerplate, Mr Stern shows impatience with the concerns that Professor George and Mr Dreher voice.  “All of this is extravagantly silly, and I respect Dreher and George’s intellects too much to believe that they’re actually taking it seriously,” he writes.

I would agree that Professor George, Mr Dreher, and their fellows have made many hyperbolic statements regarding this and similar matters.  At the same time, I do think they are onto something.  I would refer to an item the retired Episcopal Bishop of New Hampshire, the Right Reverend Mr V. Gene Robinson, posted on The Daily Beast several months ago.  Mr Robinson, himself the first openly gay person consecrated a bishop in a traditional denomination, denied that anti-gay Christians in the USA are the targets of anything that should be called “persecution.”  At the same time he did acknowledge that they are coming to be a minority, not only numerically, but in the sense that they bear a stigma which sets them apart from the mainstream:

Here’s what victimization looks like: every day, especially in some places, LGBT people face the real possibility of violence because of their orientation or gender identity. Young people jump off bridges or hang themselves on playground swing sets because of the bullying and discrimination they face. In 29 states, one can be fired from one’s job simply for being gay, with no recourse to the courts. In most places, we cannot legally marry the one we love. Some of us have been kicked out of the house when we come out to our parents, and many young LGBT people find themselves homeless and on the streets because of the attitudes of their religious parents toward their LGBT children. And did I mention the everyday threat of violence?

Compare that to the very painful realization that one’s view of something like homosexuality is in the minority after countless centuries of being in the majority. It may feel like victimization to hang a shingle out to sell something or provide some service to the public, only to find that the “public” includes people one disagrees with or finds immoral in some way. It may feel like it has happened practically overnight, when it has actually been changing over a period of decades. Being pressed to conform to such a change in majority opinion must feel like victimization. But as a society, we would do well to distinguish between real victimization and the also-very-real discouragement felt by those who now find themselves in the minority.

I do not mean to brush aside as inconsequential the feelings of those who find themselves in the minority, whether it be around the topic of gender, race, or sexual orientation. But I do mean to question characterizing such feelings as discrimination, violation of religious freedom, and victimization. It’s time we called out our religious brothers and sisters for misunderstanding their recently-acquired status as members of a shrinking minority as victims.

I would amplify the good bishop’s remarks about “the feelings of those who find themselves in the minority.”  I would say that “feelings” is perhaps an unfortunate choice of words here, being as it is a word that often figures in non-apology apologies such as “I’m sorry I hurt your feelings,” which is a polite way of saying “I wish you hadn’t become upset when I was doing what any sensible person would regard as the right thing, you crybaby.”  The beliefs that motivate people who disapprove of homosexuality may be wrong; I am quite sure they are wrong, as a matter of fact, though I am chastened by Mr Robinson’s* own willingness to suspend final judgment on the theological ins and outs of the issue.  However, it is hardly reasonable to expect the members of this new minority group not to share the experience of every established minority group, who are from time to time frustrated when the image of the world that is presented to them in every movie, every book, every TV show, every presidential address, every classroom, every other place where the voice of The Mainstream is heard, is so much at odds with what they have seen and heard and felt in their own lives, from their own point of view, that it begins to seem as if they have been transported to a parallel universe.

I believe Mr Robinson would be quick to agree with this.  I heard him make a speech a few years ago in which he told an audience made up primarily of same-sexers that “we will never be anything other than a small minority group in society at large, no matter how large a majority we may form in this room at this moment.”  He went on to talk about the challenges inherent in minority status, especially the sense of not being heard that comes when an element so central to personal identity as one’s sexuality takes a form that is basically alien to most of the people one meets on a daily basis.  So when he tells his opponents that their new status as members of an unpopular minority does not by itself mean that they are victims of injustice, he is not trivializing their experiences or concerns.  Rather, he is suggesting that in the future he and they will have something in common.  Anti-gay Christians may never again be anything other than a small minority group in society at large, no matter how large a majority they may form in their own worship spaces.  And they can no longer expect culture high and low to be dominated by a worldview in which male and female are categories created by God and inscribed by God with specific meanings, meanings that include a concept of complementarity that exhausts the legitimate purposes of sexual activity.  Nor can they even expect the average person to have the vaguest knowledge of what their views are, or to be at all interested in learning about them.  They can hardly be faulted for considering this an unattractive prospect, yet it is no different from what any other minority group experiences.  On Mr Robinson’s account, the reduced visibility and inadvertent exclusions that come with minority status do not by themselves constitute unjust discrimination.

I don’t want to put words in Mr Robinson’s mouth; I’m sure he would be the first to concede that there is such a thing as institutional discrimination, and that injustices no one in the majority intends to commit or even knows are happening can at times wreak horrific consequences in the lives of the minority.  And while Mr Stern is blithely confident that laws against religious discrimination will give anti-gay Christians all the protection they need against any mistreatment they may suffer in the future, Mr Dreher’s American Conservative colleague Samuel Goldman** links to a recent article raising the question of whether “religious freedom” is even a coherent category in our current legal system.  So I see more grounds for the fears of this new minority than Mr Stern does.  I cannot be of much help to them; in the unlikely event that anti-gay Christians were to ask me how they could be sure of receiving fair treatment in a strongly pro-gay America, my suggestion would be that they abandon their false beliefs and join the rest of us in affirming the diversity of sexual expression in today’s world.  I’m sure that would be about as pointless as a Christian telling Muslims that if they don’t want to be smeared by association with terrorists, all they have to do is to be baptized.

*To avoid confusion, let me explain: The customary form in which the names of Anglican clergy are presented is “[Ecclesiastical Honorific] [Courtesy Title] [Proper Name]” at first reference, and “[Courtesy Title] [Proper Name]” at subsequent references.  That’s why I introduced Mr Robinson as “the Right Reverend Mr Robinson,” then switched to plain “Mr Robinson.”  My wife works for the Episcopal Church, and I occasionally read the novels of Anthony Trollope, so I’m aware of all these things.

**Like Mr Dreher, Mr Goldman is always interesting.  Unlike him, he is never apoplectic.

Science and the argument from authority

Back when the earth was young and I was an undergraduate, a friend of mine named Philip told me with great satisfaction that the chemistry professor who had agreed to be his advisor was the world’s foremost authority on the reaction which he planned to study.  Later in that same conversation, I mentioned something about authority in science.  “Oh, authority counts for nothing in science!” Philip earnestly assured me.

Well, I said, “nothing” is a rarity.  Perhaps there is some small residue of authority in science.  No, no, Philip insisted, there was absolutely no place for appeals to authority in scientific discourse.

I produced a hypothetical example.  Say he was working on the reaction which so interested him.  After all these years I don’t remember what it was called, unfortunately.  And say his new advisor, Professor Whatever His Name Was, were to amble into the lab, look over his shoulder, furrow his brow, and after a few moments say “I can’t put my finger on it, but I think you’re doing something wrong.”

“I’d be devastated!” Philip exclaimed.  “I don’t suppose you’d rest until you’d figured out what it was that was bothering him, even if it meant a series of sleepless nights?”  “I wouldn’t, no,” Philip agreed.

“Whereas, if someone like me, who knows as little as a person can about chemistry, were to make a similarly vague remark, you’d ignore it completely.”  “I sure would,” said Philip.

“So, Professor Whatever His Name Is has earned the authority to set you working frantically to check and recheck your work, while I have earned no such authority.”  Philip agreed that this was the case, and that to a certain extent, therefore, authority was a meaningful concept in the practice of science.

I bring up this story, not only because it gave me a rare opportunity to play the role of Socrates in a real-life Platonic dialogue, but because it seems timely.  Monday afternoon, io9 published a link to an undated essay by Jason Mitchell, Associate Professor of the Social Sciences at Harvard.  Professor Mitchell’s essay, titled “On the emptiness of failed replications,”   argues that there are many reasons why an attempt to replicate the results of a published study might fail to do so, and that such failures should often, even usually, not be used as a reason for setting aside the original claims.  Professor Mitchell’s argument has at its heart an appeal to authority.  He writes:

Science is a tough place to make a living.  Our experiments fail much of the time, and even the best scientists meet with a steady drum of rejections from journals, grant panels, and search committees.  On the occasions that our work does succeed, we expect others to criticize it mercilessly, in public and often in our presence.  For most of us, our reward is simply the work itself, in adding our incremental bit to the sum of human knowledge and hoping that our ideas might manage, even if just, to influence future scholars of the mind.  It takes courage and grit and enormous fortitude to volunteer for a life of this kind.

So we should take note when the targets of replication efforts complain about how they are being treated.  These are people who have thrived in a profession that alternates between quiet rejection and blistering criticism, and who have held up admirably under the weight of earlier scientific challenges.  They are not crybabies.  What they are is justifiably upset at having their integrity questioned.  Academia tolerates a lot of bad behavior—absent-minded wackiness and self-serving grandiosity top the list—but misrepresenting one’s data is the unforgivable cardinal sin of science.  Anyone engaged in such misconduct has stepped outside the community of scientists and surrendered his claim on the truth.  He is, as such, a heretic, and the field must move quickly to excommunicate him from the fold.  Few of us would remain silent in the face of such charges.

Because it cuts at the very core of our professional identities, questioning a colleague’s scientific intentions is therefore an extraordinary claim.  That such accusations might not be expressed directly hardly matters; as social psychologists, we should know better that innuendo and intimation can be every bit as powerful as direct accusation.  Like all extraordinary claims, insinuations about others’ scientific integrity should require extraordinary evidence.  Failures to replicate do not even remotely make this grade, since they most often result from mere ordinary human failing.  Replicators not only appear blind to these basic aspects of scientific practice, but unworried about how their claims affect the targets of their efforts. One senses either a profound naiveté or a chilling mean-spiritedness at work, neither of which will improve social psychology.

What my friend Philip and I agreed on those many years ago was that science had an advantage over other forms of inquiry because, while it does have its authorities, those authorities are always open to challenge.  It may be very likely, in my hypothetical example, that Professor Whatever His Name Was could not immediately explain why Philip’s procedure was wrong simply because organic chemistry is a very complex field and he could only vaguely remember the most relevant point until he had gone through the whole experiment in detail.  However, Philip himself or any other competent researcher who checked over his work in the same way would come to the same results as would Professor Whatever His Name Was, if not as quickly or in as clever a manner as Professor Whatever His Name Was may well have done.  Science therefore promises, not to slay authority, but to tame it.  Scientists can earn authority and use it to guide their colleagues without inflicting fatal damage on their fields every time they make a mistake, because there is a system for identifying and correcting the mistakes even of the most august figures.

Professor Mitchell is therefore not wrong to protest that one ought to be mindful of the reputations scientists have earned, and circumspect about impugning those reputations, however indirectly.  On the other hand, his strictures against using replication as a standard for the reliability of scientific claims go so far as to raise the question of how a scientist who has accumulated an impressive set of credentials could ever be proven wrong.  It is therefore not surprising that the io9 posting of Professor Mitchell’s essay has sparked a ferocious response from readers accusing him of threatening to ruin science for everyone.  Indeed, the headline on that posting was “If You Love Science, This Will Make You Lose Your Shit,” the tag io9 editor Annalee Newitz added to the post was “HOLY CRAP WTF,” and it is illustrated with this gif:

To io9’s credit, the comments include some thoughtful and nuanced replies, as for example this one from a sociologist explaining why she believes both that her discipline represents an important source of knowledge and that it is misleading to use the word “science” to describe it.

I’d also mention a response to Professor Mitchell’s essay by Discover’s famously pseudonymous “Neuroskeptic.”  Neuroskeptic praises Professor Mitchell for identifying a naivete in those who are quick to regard a failure to replicate as proof positive that the original finding was flawed, but goes on to argue that Professor Mitchell himself exhibits a similar naivete in defending the opposite habit:

Whereas the replication movement sees a failure to find a significant effect as evidence that the effect being investigated is non-existent, Mitchell denies this, saying that we have no way of knowing if the null result is genuine or in error: “when an experiment fails, we can only wallow in uncertainty” about what it means. But if we do find an effect, it’s a different story: “we can celebrate that the phenomenon survived these all-too-frequent shortcomings [experimenter errors].”

And here’s the problem. Implicit in Mitchell’s argument is the idea that experimenter error (or what I call ‘silly mistakes’) is a one-way street: errors can make positive results null, but not vice versa.

Unfortunately, this is just not true. Three years ago, I wrote about these kinds of mistakes and recounted my own personal cautionary tale. Mine was a spreadsheet error, one even sillier than the examples Mitchell gave. But in my case the silly mistake created a significant finding, rather than obscuring one.

There are many documented cases of this happening and (scary thought) probably many others that we don’t know about. Yet the existence of these errors is the fatal spanner in the works of Mitchell’s whole case. If positive results can be erroneous too, if errors are (as it were) a neutral force, neither the advocates nor the skeptics of a particular claim can cry ‘experimenter error!’ to silence their opponents.

The phrase “spreadsheet error” may remind politically-oriented readers of the Reinhart-Rogoff Affair, in which a spreadsheet error was found to underlie a 2010 paper by economists Carmen Reinhart and Kenneth Rogoff.  That paper had a significant impact on policymaking in the USA and elsewhere before the error was exposed in 2013.

The Reinhart-Rogoff Affair took a prominent place in my mind, and I think it is safe to say in the minds of many other observers, as an example of just how untrustworthy the governing elites of the USA truly are.  Ever since the late 1990s, Washington and Wall Street have made a series of clownishly ill-advised decisions.  Many of these decisions were not only decried by experts at the time as likely to lead to disaster, but were in fact hugely unpopular with the general public.  In every case, the predicted disasters have come to pass, and our rulers have reacted to these disasters at first with denial, then with bewilderment, then with apparent amnesia as they propose a repetition of exactly the policies that had failed before.  When those same elites look to science for a warrant for their policies, it seems to bother them not at all when the studies they have cited are discredited.  Seeing how deadly is the entrenched ignorance of political and business elites, the idea of insulating distinguished scientists from criticism raises the prospect that they may in time come to form a class that is as detached from reality as are those who wield power in Washington and on Wall Street.  If such an event comes to pass, future Reinharts and Rogoffs can be as sloppy as they like, provided their claims serve the interests of those who hold the levers of opportunity.

Some what-ifs

I recently posted a much-too-long comment on Peter Hitchens’ blog.  Mr Hitchens had posted about one of his recurrent themes, that, contrary to what the popular phrase “special relationship” might suggest, the United States does not in fact treat the United Kingdom in a markedly more indulgent fashion than it does its other allies.  He gave a series of examples of hard bargains the US had driven in its relations with the UK.  The last of these examples was the aid the US gave to Britain in the period 1940-1941, which was conditioned on Britain’s yielding to the US a large portion of its gold reserves, its shares in many US and Latin American firms, and its naval bases in the Western hemisphere. To this I responded as follows:

Well, with regard to US policy towards the British Empire in 1940 and 1941, I do think you are overlooking rather an important point. It did seem quite likely from May of 1940 on that Britain might very well surrender to Germany. The expectation that Britain would surrender seems to have motivated, for example, Hitler’s declaration of war on the USA. Without Britain among the allied powers, the USA would have been as impotent in Europe in the 1940s as Britain and France were in Poland in 1939. In view of that expectation, Hitler would likely have thought of his declaration of war on the USA on 11 December 1941 much as Argentines may have thought of their country’s declaration of war on Germany on 27 March 1945, a costless gesture designed to appease a nervous ally.

If we look back at the events between May 1940 and December 1941, not in the light of the Allies’ eventual victory, but of the United Kingdom’s probable defeat, both Washington’s demands and London’s acquiescence in them become far less of a scandal. Even if Germany had not chosen to occupy Britain after its defeat, it is likely that the Nazi regime would have found ways to help itself to at least as much of Britain’s gold reserves and other financial assets as the USA in fact claimed, making the Reich a major presence in business in the USA and the leading economic power in Latin America. Had the Nazis added Britain’s naval bases and other imperial assets in the Western Hemisphere to this economic power, the USA would have been entirely incapable of making a contribution to any war against either Germany or Japan.

In that light, I think we can see the Roosevelt government’s demands and the Churchill government’s concessions as a kind of super-Dunkirk. Without actually making British surrender more likely, these concessions represented the choice of a postwar environment in which the far Western boundary of German power would in no case exceed the shores of the Atlantic. Even in the event of the absolute worst case scenario for the UK, in which the Germans occupied and subjugated Britain, a great power would still exist somewhere in the world that was neither fascist nor communist, with a population that speaks English and courts that occasionally cite Magna Carta. Such a power might not be in a position to intervene militarily on the island of Britain, but its example could embolden guerrilla resistance to the Germans. A United Kingdom government of the period may even have harbored the fond wish that the continued viability of the USA might foster a certain residue of respect for Englishness even among Nazi occupiers. This fond wish may look silly in retrospect, as we consider what we know of the Nazi regime, but at the time might not have been an altogether contemptible basis for policy.

The alternative surrender scenario, in which the British Empire had held onto enough of its assets for its fall to terminate the USA as a world power, would in the short term have given Germany and Japan free hands in their expansionist programs. Considering how wildly those programs were inflated beyond each country’s ability to support them, in particular with regard to Germany’s invasion of Russia and Japan’s invasion of China, it seems likely that they would eventually have collapsed and brought the regimes down with them.

But that only makes the idea of Germany capturing a more-or-less-intact British Empire the more frightening. On the one hand, the Germans, unbothered by the nuisance of a Western front, would doubtless have had time to complete their extermination of European Jewry and to make great headway in their genocidal plans against Gypsies and others. On the other, the force that would eventually have defeated the Germans would not have included the USA, the UK, or any other democratic governments. The Soviet Union alone would have defeated the Reich, and the Red Army would have swept into all the territories it had once controlled. Perhaps that would have been rather a different Soviet Union than the one that actually existed in the late 1940s or early 1950s; it’s easy to imagine that Stalin, for example, would not have survived had the Second World War gone much worse than it did for the USSR. But even if the Wehrmacht had done as well against the Soviet Union as Napoleon did against the Tsar, surely it would in the end have been defeated even more thoroughly than was the Grande Armée.

And without the USA in the Western Pacific, Japan’s eventual, surely inevitable defeat in China would have come when the Kuomintang forces were even more completely exhausted than they were in 1945. That would have left Mao’s Red Army to pick up the pieces, not only in mainland China, but in surrounding countries as well. With no American forces in the region to offer an alternative, the Japanese occupations may have proved merely a prelude to a domination of East Asia by Chinese Communists, as the victories of the Third Reich may have been a prelude to the domination of the rest of the Eastern hemisphere by the Soviet Union.

A nightmare world, certainly. And, as with all nightmares, it grows from long chains of contingency. But I don’t think that any of these contingencies are either inherently unlikely to have happened, or unlikely to have haunted the minds of British and American policymakers in the period May 1940-December 1941.

This comment far exceeds the Daily Mail’s limit of 500 words, a limit of which I was unaware when I submitted it.  (I had never posted a long comment to the Daily Mail’s site before, amazingly enough.)  I am most grateful to Mr Hitchens for waiving that limit and allowing my post to stand as it is.

A few weeks ago, I read, for the first time, Philip K. Dick’s The Man in the High Castle.  I suppose the influence of that alternate-history novel can be seen in this comment.  I would add that, unlike Dick, I don’t propose a scenario in which the USA would be occupied by Nazi Germany and militarist Japan, merely one in which German influence in the Western hemisphere and the absence of a staging area from which to launch attacks against German positions in Europe and Africa made it impossible for the USA to fight against the Third Reich.

Scientific Arrogance

The other day, Ed Yong linked to an essay by Ethan Siegel.  Mr Siegel extols the virtues of science, both Science the process for gaining knowledge about nature and Science the body of knowledge that humans have acquired by means of that process.  Mr Siegel then quotes an interview Neil deGrasse Tyson gave to Nerdist, in which Mr Tyson expressed reservations about the value of philosophical study as part of the education of a young scientist.  In that interview, Mr Tyson and his interlocutors made some rather harsh-sounding remarks.  Take this segment, for example, as transcribed by Massimo Pigliucci:

interviewer: At a certain point it’s just futile.

dGT: Yeah, yeah, exactly, exactly. My concern here is that the philosophers believe they are actually asking deep questions about nature. And to the scientist it’s, what are you doing? Why are you concerning yourself with the meaning of meaning?

(another) interviewer: I think a healthy balance of both is good.

dGT: Well, I’m still worried even about a healthy balance. Yeah, if you are distracted by your questions so that you can’t move forward, you are not being a productive contributor to our understanding of the natural world. And so the scientist knows when the question “what is the sound of one hand clapping?” is a pointless delay in our progress.

[insert predictable joke by one interviewer, imitating the clapping of one hand]

dGT: How do you define clapping? All of a sudden it devolves into a discussion of the definition of words. And I’d rather keep the conversation about ideas. And when you do that don’t derail yourself on questions that you think are important because philosophy class tells you this. The scientist says look, I got all this world of unknown out there, I’m moving on, I’m leaving you behind. You can’t even cross the street because you are distracted by what you are sure are deep questions you’ve asked yourself. I don’t have the time for that.

interviewer: I also felt that it was a fat load of crap, as one could define what crap is and the essential qualities that make up crap: how you grade a philosophy paper?

dGT [laughing]: Of course I think we all agree you turned out okay.

interviewer: Philosophy was a good Major for comedy, I think, because it does get you to ask a lot of ridiculous questions about things.

dGT: No, you need people to laugh at your ridiculous questions.

interviewers: It’s a bottomless pit. It just becomes nihilism.

dGT: nihilism is a kind of philosophy.

Mr Tyson’s remarks have come in for criticism from many quarters.  The post by Massimo Pigliucci from which I take the transcription above is among the most notable.

I must say that I think some of the criticism is overdone.  In context, it is clear to me that Mr Tyson and his interlocutors are thinking mainly of the training of young scientists, of what sort of learning is necessary as a background to scientific research.  In that context, it’s quite reasonable to caution against too wide a range of interests.  It would certainly not be wise to wait until one had developed a deep understanding of philosophy, history, literature, music, art, etc, before getting down to business in one’s chosen field.

It’s true that Mr Tyson’s recent fame as narrator of the remake of the television series Cosmos puts a bit of an edge on his statements; that show is an attempt to present the history of science to the general public, and to promote a particular view of the place of science in human affairs.  It would be fair to say that the makers of Cosmos, Mr Tyson among them, have exposed some of their rather sizable blind spots in the course of the project (most famously in regard to Giordano Bruno), and a bit of time spent studying the philosophy of science might very well have served to temper the bumptious self-assurance that let them parade their howlers in worldwide television broadcasts.  And it is true, as Mr Pigliucci documents, that Mr Tyson has a history of making flip and ill-informed remarks dismissing the value of philosophy and other subjects aside from his own.  Still, the remarks from the Nerdist podcast are pretty narrow in their intended scope of application, and within that scope, having to do with apprentice scientists, I wouldn’t say that they are examples of arrogance, or that they are even wrong.

I’m reminded of a problem that has faced those who would teach Latin and ancient Greek to English speakers over the centuries.  The languages are different enough from English that it seems like a shame to start them later than early childhood.  If a student starts Latin at five and Greek at six, as was the norm for boys destined for the German Gymnasia or the English public schools in the nineteenth century, that student will likely attain a reading proficiency in the classical languages at about eight or nine years of age that a student who starts them in later life may never attain.  However, the point of learning the languages is to be able to read classical literature.  What is a nine-year-old to make of Horace or Pindar or Vergil or Sophocles or Thucydides or Tacitus?  Few of the real masterworks are intelligible as anything other than linguistic puzzles to anyone under 40.  It often happens to me that I assign such things to students who are returning to college in middle age.  They usually come to me afterward and tell me that they were surprised.  They had read them when they were in the 18-25 age bracket that includes most of my students, and hadn’t found anything of interest in them.  Rereading them later in life, the books meant a tremendous amount to them.  I trot out a very old line on these occasions, and say “It isn’t just you reading the book- the book also reads you.”  Meaning that the more life experience the reader brings, the greater the riches the reading offers.

I suppose the best thing to do would be to learn the languages in early childhood while studying mathematics and the natural sciences, to study ancient literary works for several years as specimens in the scientific study of linguistics or as aids to archaeology, and to come back to them later in life, when one can benefit from reading them on their own terms.  The same might apply to philosophy, bits of which might be slipped into the education of those aged 25 and younger, but which ought really to be introduced systematically only to those who have already confronted in practice the sorts of crises that have spurred its development over the centuries.

Be that as it may, the concept of scientific arrogance is one that has been deftly handled by one of my favorite commentators, cartoonist Zach Weiner.  I’d recommend two Saturday Morning Breakfast Cereal strips on the theme, this one about emeritus disease and this one about generalized reverence for specialized expertise.

Catastrophic Success

A few weeks ago, Mozilla fired its CEO, Brendan Eich, in response to a wave of criticism about his donation, in the year 2008, of $1000 to the successful campaign to prohibit same-sex couples from marrying in the state of California.  Public discussion of this event has not yet died down.  For example, Richard Kim of The Nation has made interesting remarks about it here and here.  So I suppose I can put my oar in.

When the matter first came up, I would have guessed it was going to play out like this: A variety of groups and individuals would find fault with Mr Eich.  He and the company would respond with assurances that his private political views would have no effect on company policy, and would point to his record of working well with members of sexual minority groups.  Criticism would continue to mount, and calls for a boycott of Mozilla would pick up steam.  Mr Eich would apologize for having supported the 2008 campaign, would make a large donation to some pro-equality group, Mozilla would announce a new, souped-up policy to protect the rights of same-sexers among its employees, and the company would pay a bunch of professional gays and lesbians to come and lecture its executives about the plight of sexual minorities.  After a few weeks, the controversy would have faded, Mr Eich would be getting on with business, and corporate types would have been reminded that there is a cost to siding against gender-neutral marriage.  That may have turned out to be an unpleasant sort of game, but it’s a game that has been played many times and that would have ended with an unambiguous, if rather small-scale, win for the rights and political clout of same-sexers.

In reality, the first two parts of this scenario played out according to the prediction, but Mr Eich did not apologize or make a large donation to a pro-equality group, the company did not soup up its policy on the rights of same-sexers, and no grand inquisitors of gaydom were invited to Mozilla.  The whole game was short-circuited by Mr Eich’s departure.

There is a military term of art which I believe has an application here: catastrophic success.  George W. Bush popularized this term in 2004, describing the previous year’s invasion of Iraq.  When an enemy force capitulates more rapidly and more completely than one expects, thereby creating a chaotic situation with which the victorious force is not prepared to cope, that is a catastrophic success.  And that is what the end of Mr Eich’s tenure at Mozilla has presented to his opponents.   I remarked on this in a comment on one of Richard Kim’s pieces, where I said, among other things:

Would-be CEOs are no more likely to react to Eich’s fall by becoming pro-equality than they are to avoid pro-equality causes that may become controversial five or ten or twenty-five years from now. So the big-budget parts of “the Gay Movement” may very well be drying up surprisingly soon.

That is to say, while the anti-Eich campaign may have been an attempt to send the message that same-sexers should be taken seriously, the message that ambitious corporate types are likely to receive is that they should avoid controversy altogether unless it directly promotes the company’s bottom line.  So we’re likely to see a lot less corporate sponsorship or donations from executives to groups that promote the rights of same-sexers as a consequence of the Eich departure. Again, I don’t blame the anti-Eich campaign for this- if the matter had played out as I had expected, it would have been a plus for those groups, and it was certainly not unreasonable to expect that it would play out in that way.  On the other hand, it’s hard to see how we can find fault with Mr Eich and the company for declining to play the game according to the usual rules.

Of course, political quietism is not the only possible way for ambitious corporate executives to respond to the Eich affair.  If most companies are staffed by executives who decline to support any cause other than their own profits, then there will be passionate sections of the public who will be attracted to firms that aggressively identify themselves with one side or another of a particular issue.  We see that already, of course.  I suspect the Eich affair means that we’ll see a lot more of it.  If you live in a part of the country where same-sex rights are popular, then you’ll see companies that either go out of their way to identify themselves with those rights or are walls of silence regarding them.  If you live in a more conservative part of the country, you’ll find the opposite.  And that, I suspect, may reverse the tide of public opinion that has carried same-sex rights so far in recent years by creating social spaces in which the cost of supporting those rights increases.  I don’t know how best to respond to this, but I very much suspect that Mozilla’s response shows that threats of boycott will not help.

Worlds in Collision

There have been several interesting items in recent issues of The Nation.

Reviewing John Judis’ Truman, American Jews, and the Origins of the Arab/Israeli Conflict, Bernard Avishai argues that President Harry S Truman had far fewer options in formulating policy towards events in and around Mandatory Palestine than Mr Judis claims.  Mr Avishai’s closing sentences are worth quoting:

Understanding Israel’s founding in 1948 as a necessary event with tragic consequences, and not as a presidential mistake forced by political pressure, will not make Obama less wary of AIPAC or his relationship with Netanyahu less tortured. But it could make his tact more obviously noble.

“Tact” may itself be an extraordinarily tactful choice of words to characterize Mr O’s relationship with Israel and the Americans who support the Israeli right-wing, but I would say that “necessary event with tragic consequences” is usually an accurate description of major occurrences in world history.  There may be some agent or other who was at some point in a position to alter the course of events, but that point may have passed long before anyone realized the significance of what was going on.  Certainly by the time President Truman took office, the establishment of a Jewish state in Palestine was beyond the power of any US president to prevent, even assuming any US president were to be so heedless of public opinion as to want to prevent it.  The fact that President Truman so thoroughly convinced himself of the contrary as to announce to the faculty of the Jewish Theological Seminary in 1953 that “I am Cyrus” serves to remind us that the extreme self-confidence that men need if they are to rise to high political office often leaves them vulnerable to the most absurd self-deceptions.  Not that politicians have a monopoly on self-deception; Mr Avishai mentions Wolfgang Schivelbusch’s The Culture of Defeat, a book which shows how little relationship the commonly accepted opinions on all sides in the USA have to any facts concerning their country’s Civil War of 1861-1865.

A book about Immanuel Velikovsky prompts Paula Findlen to write an engaging essay about Velikovsky’s career and her own youthful enthusiasm for his work.  For my part, I wonder if Velikovsky’s eccentric theories about comets and colliding heavenly bodies set science back significantly.  Scientists are now comfortable talking about impacts that led to the formation of the Moon, triggered mass extinctions, etc, but in the 1970s, when Velikovsky’s work was in vogue, they were noticeably reluctant to consider such theories, perhaps for fear of being mistaken for Velikovskyans.

In September 2000, Kurt Vonnegut gave a speech in which he spoke ill of Thomas Jefferson, and explained why he had the right to do so.  I speak ill of Thomas Jefferson myself quite frequently.  I often read Jefferson’s deplorable works and study his deplorable acts, the better to deplore them, and my education advances in proportion to the amount of time I spend in his deplorable presence in this way.

In a recent issue, Richard Kim expressed exasperation with social conservatives concerned that the declining popularity of their views on sex in general and on gender neutral marriage in particular has destined them for marginalization.   Mr Kim points out that social conservatives still wield a great deal of power in the USA and that American courts have been quite deferential to religious liberty concerns.  The magazine rather undercuts Mr Kim’s point by running his piece under the headline “The Bigot’s Lament” and giving it a subhed saying that “the religious right nurses its persecution complex.”  If people are going to label you a bigot and dismiss your concerns as symptoms of a “persecution complex,” you are probably right to worry that you are being pushed to the margins.  Rod Dreher wrote a series of posts on his blog at The American Conservative a few weeks ago in which he speculated that in the future, people who share his belief that homosexual relationships are not the same kind of thing as heterosexual relationships may have to keep that belief a secret or face loss of employment and public humiliation, even as same-sexers have long had to keep their sexuality secret in order to avoid the same penalties.  Responding to a critique from Andrew Sullivan, Mr Dreher wrote:

This line from Andrew is particularly rich:

In the end, one begins to wonder about the strength of these people’s religious convictions if they are so afraid to voice them, and need the state to reinforce them.

This is the crux of the problem. Let’s restate this: “One begins to wonder about the strength of the love of gay couples if they are so afraid to come out of the closet, and need the state to protect them.”

How does that sound? To me, it sounds smug and naive and unfeeling, even cruel, about the reality of gay people’s lives. If they aren’t willing to martyr themselves, then they must not really love each other, right? And hey, if they need the state to protect them from a wedding photographer who won’t take their photos, how much do they really love each other?

You see my point.

I am glad we don’t live in that world anymore. We don’t live in that world anymore because people like Andrew insisted that gay lives had more dignity than the majority of Americans believed. Again, they did us all a favor by awakening us morally to what it is like to live in a country where what matters the most to you is treated in custom and in law as anathema.

I do think there is a realistic chance that in a decade or two it will be a career-killer virtually everywhere in the USA to profess religious beliefs that disapprove of same-sex sex and elevate opposite-sex sex to privileged status in the moral order.  I’m not entirely opposed to this happening; I think such beliefs are wrong, and the sooner they are consigned to the status of exhibits in a museum of discredited ideas the better off everyone will be.  On the other hand, while antigay beliefs may be losing popularity in the USA and other rich countries, and also in regions like Latin America that make a point of reminding the world of their affinities with the rich countries, they are far from dying out altogether.  That means that we can expect a sizable minority of closeted antigays to persist in the USA for quite some time to come.  And outside the rich countries, especially in Africa and the Muslim world, hostility to same-sexers is certainly not fading.  If immigration from these regions to the USA rises in the years to come, as it seems likely to do, a strong stigma against beliefs that oppose same-sex sex may lead to bitter confrontations and harsh stands on both sides.  An American counterpart to the late Pim Fortuyn may not be an impossibility for long.

These are concerns for tomorrow. The day after tomorrow, it is possible that a new stigma may attach itself to same-sexers, the stigma of membership in a genetically unmodified lower class.  In that case, it might be desirable that the period leading up to the shift should reinforce norms of mutual respect and fair play, rather than aggression and triumphalism.  Or it might not be; perhaps the collision with the new world will blot out whatever habits we may have cultivated in the old one.  Assuming, of course, that there is enough of a genetic contribution to the physical basis of homosexual attraction for genetic modification to bring this particular collision about in the first place.

WEIRD laughter

Recently, several websites I follow have posted remarks about theories that are meant to explain why some things strike people as funny.

Zach Weinersmith, creator of Saturday Morning Breakfast Cereal, wrote an essay called “An artificial One-Liner Generator” in which he advanced a tentative theory of humor as problem-solving.

Slate is running a series of articles on theoretical questions regarding things that make people laugh.  The first piece, called “What Makes Something Funny,” gives a lot of credit to a researcher named Peter McGraw, who is among the pioneers of “Benign Violation Theory.”  This is perhaps unsurprising, since Professor McGraw and his collaborator Joel Warner are credited as the authors of the piece.  Professor McGraw and Mr Warner summarize earlier theories of humor thus:

Plato and Aristotle introduced the superiority theory, the idea that people laugh at the misfortune of others. Their premise seems to explain teasing and slapstick, but it doesn’t work well for knock-knock jokes. Sigmund Freud argued for his relief theory, the concept that humor is a way for people to release psychological tension, overcome their inhibitions, and reveal their suppressed fears and desires. His theory works well for dirty jokes, less well for (most) puns.

The majority of humor experts today subscribe to some variation of the incongruity theory, the idea that humor arises when there’s an inconsistency between what people expect to happen and what actually happens.

Professor McGraw and Mr Warner claim that incongruity theory does not stand up well to empirical testing:

Incongruity has a lot going for it—jokes with punch lines, for example, fit well. But scientists have found that in comedy, unexpectedness is overrated. In 1974, two University of Tennessee professors had undergraduates listen to a variety of Bill Cosby and Phyllis Diller routines. Before each punch line, the researchers stopped the tape and asked the students to predict what was coming next, as a measure of the jokes’ predictability. Then another group of students was asked to rate the funniness of each of the comedians’ jokes. The predictable punch lines turned out to be rated considerably funnier than those that were unexpected—the opposite of what you’d expect to happen according to incongruity theory.

To which one might reply that when Mr Cosby and Ms Diller actually performed their routines, they didn’t stop after the setup and ask the audience to predict the punchline.  Nor would any audience member who wanted to enjoy the show be likely to try to predict the punchline.  Doing so would make for an entirely different experience than the one the audience had paid for.

Be that as it may, Professor McGraw and Mr Warner go on to claim that their theory of “benign violation” is supported by empirical evidence:

Working with his collaborator Caleb Warren and building from a 1998 HUMOR article published by a linguist named Thomas Veatch, he hit upon the benign violation theory, the idea that humor arises when something seems wrong or threatening, but is simultaneously OK or safe.

After extolling some of the theory’s strengths, the authors go on:

Naturally, almost as soon as McGraw unveiled the benign violation theory, people began to challenge it, trying to come up with some zinger, gag, or “yo momma” joke that doesn’t fit the theory. But McGraw believes humor theorists have engaged in such thought experiments and rhetorical debates for too long. Instead, he’s turned to science, running his theory through the rigors of lab experimentation.

The results have been encouraging. In one [Humor Research Laboratory] experiment, a researcher approached subjects on campus and asked them to read a scenario based on a rumor about legendarily depraved Rolling Stones guitarist Keith Richards. In the story—which might or might not be true—Keith’s father tells his son to do whatever he wishes with his cremated remains—so when his father dies, Keith decides to snort them. Meanwhile the researcher (who didn’t know what the participants were reading) gauged their facial expressions as they perused the story. The subjects were then asked about their reactions to the stories. Did they find the story wrong, not wrong at all, a bit of both, or neither? As it turned out, those who found the tale simultaneously “wrong” (a violation) and “not wrong” (benign) were three times more likely to smile or laugh than either those who deemed the story either completely OK or utterly unacceptable.

In a related experiment, participants read a story about a church that was giving away a Hummer H2 to a lucky member of its congregation, and were then asked if they found it funny. Participants who were regular churchgoers found the idea of mixing the sanctity of Christianity with a four-wheeled symbol of secular excess significantly less humorous than people who rarely go to church. Those less committed to Christianity, in other words, were more likely to find a holy Hummer benign and therefore funnier.

Lately, social scientists in general have been more mindful than usual of the ways in which North American undergraduates are something other than a perfectly representative sample of the human race.  Joseph Henrich, Steven Heine, and Ara Norenzayan have gone so far as to ask in the title of a widely cited paper whether the populations most readily available for study by psychologists and other social scientists are in fact “The weirdest people in the world?”  In that paper, Professors Henrich, Heine, and Norenzayan use the acronym “WEIRD,” meaning Western, Educated, Industrialized, Rich, Democratic.  Their abstract:

Behavioral scientists routinely publish broad claims about human psychology and behavior in the world’s top journals based on samples drawn entirely from Western, Educated, Industrialized, Rich, and Democratic (WEIRD) societies. Researchers – often implicitly – assume that either there is little variation across human populations, or that these “standard subjects” are as representative of the species as any other population. Are these assumptions justified? Here, our review of the comparative database from across the behavioral sciences suggests both that there is substantial variability in experimental results across populations and that WEIRD subjects are particularly unusual compared with the rest of the species – frequent outliers. The domains reviewed include visual perception, fairness, cooperation, spatial reasoning, categorization and inferential induction, moral reasoning, reasoning styles, self-concepts and related motivations, and the heritability of IQ. The findings suggest that members of WEIRD societies, including young children, are among the least representative populations one could find for generalizing about humans. Many of these findings involve domains that are associated with fundamental aspects of psychology, motivation, and behavior – hence, there are no obvious a priori grounds for claiming that a particular behavioral phenomenon is universal based on sampling from a single subpopulation. Overall, these empirical patterns suggests that we need to be less cavalier in addressing questions of human nature on the basis of data drawn from this particularly thin, and rather unusual, slice of humanity. We close by proposing ways to structurally re-organize the behavioral sciences to best tackle these challenges.

It would be particularly easy to see why a theory like Benign Violation would have a special appeal to undergraduates.  Undergraduate students are rewarded for learning to follow sets of rules, both the rules of academic disciplines which their teachers expect them to internalize and the rules of social behavior appropriate to people who, like most undergraduates, are living independent adult lives for the first time.  So, I suppose if one wanted to defend Superiority Theory (as for example mentioned by Aristotle in his Poetics, 1449a 34-35), one would be able to use the same results, saying that students simultaneously saw themselves as superior both to the characters in the jokes who did not follow the usual rules and to those who would enforce those rules in too narrowly literalistic a fashion to fit with the overall approach of higher education, where innovation and flexibility are highly valued.  Here the WEIRD phenomenon comes into play as well, since cultures vary in their ideas of what rules are and what relationship they have to qualities like innovation and flexibility.  Moreover, one could also say that the judgment that a particular violation is or is not benign itself implies superiority over those involved in the violation, and that this implication of superiority is what generates laughter.

Also, because undergraduates are continually under pressure to internalize one set of rules after another, they often show anxiety related to sets of rules.  This may not be the sort of thing Sigmund Freud had in mind when he talked about Oedipal anxiety, but it certainly does drive undergraduates to seek relief.  Examples of actions that are at once quite all right and by no means in accordance with the rules may well provide that relief.

Incongruity theorists may find comfort in Professor McGraw’s results, as well.  The very name “Benign Violation” and experimental rubrics such as “wrong” and “not wrong” are incongruous combinations by any definition.  So a defender of Incongruity Theory may claim Benign Violation as a subcategory of Incongruity Theory, and cite these results in support of that classification.

Professor McGraw is evidently aware of these limitations.  He and Mr Warner explain what they did to rise above them:

[T]hree years ago, he set off on an international exploration of the wide world of humor—with me, a Denver-based journalist, along for the ride to chronicle exactly what transpired. Our journey took us from Japan to the West Bank to the heart of the Amazon, in search of various zingers, wisecracks and punch lines that would help explain humor once and for all. The result is The Humor Code: A Global Search for What Makes Things Funny, to be published next week—on April Fool’s Day, naturally. As is often the case with good experiments—not to mention many of the funniest gags—not everything went exactly as planned, but we learned a lot about what makes the world laugh.

It isn’t April First yet, so I don’t know how well they have done in their efforts to expand their scope.

One sentence that struck me wrong in Professor McGraw and Mr Warner’s piece was this one, about Superiority Theory, that it “seems to explain teasing and slapstick, but it doesn’t work well for knock-knock jokes.”  I’m not at all sure about that one.  In a knock-knock joke, there are two hypothetical characters who take turns delivering five lines of dialogue.  The first character to speak is the Knocker (whose first line is always “Knock-knock!”).  The second character to speak is the Interlocutor (whose first line is always “Who’s there?”).  The Knocker’s second line is an unsatisfactory answer to this question.  The Interlocutor’s second line begins by repeating this incomplete answer, then adds the question word “who?”  The Knocker’s third line then delivers the punchline in the form of a repetition of the unsatisfactory answer followed by one or more additional syllables that change the apparent meaning of the initial unsatisfactory answer.
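Since I have already confessed to being the world’s nerdiest nerd, here is a minimal sketch of that five-line structure in Python.  The knock_knock function and the lettuce example are my own illustrations for this post, not anything drawn from Professor McGraw and Mr Warner’s piece or from the humor literature.

# A sketch of the five-line knock-knock structure described above:
# the Knocker speaks lines 1, 3, and 5, the Interlocutor lines 2 and 4,
# and the punchline repeats the unsatisfactory answer plus extra syllables.

def knock_knock(answer, extra):
    """Assemble the five canonical lines from an unsatisfactory answer
    and the additional syllables that complete the punchline."""
    return [
        "K: Knock-knock!",              # Knocker's first line, always the same
        "I: Who's there?",              # Interlocutor's first line, always the same
        "K: " + answer + ".",           # the unsatisfactory answer
        "I: " + answer + " who?",       # the answer repeated, plus "who?"
        "K: " + answer + " " + extra,   # punchline: answer + additional syllables
    ]

# A stock example, not one discussed in this post:
for line in knock_knock("Lettuce", "in, it's cold out here!"):
    print(line)

Run as written, it prints the Knocker’s and Interlocutor’s lines in the order described above.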

Knock-knock jokes became popular in the USA in the 1950s, as part of a national craze.  The first joke recorded in this mid-twentieth-century craze, I have read, is the following:

K: Knock-knock!

I: Who’s there?

K: Sam and Janet.

I: Sam and Janet who?

K: Sam and Janet evening! (sung to the tune of “Some Enchanted Evening”)

Apparently all of the jokes that brought the form into such prominence in the 1950s that they are still beloved today by seven-year-olds of all ages took this form, in which the punchline involved the Knocker bursting into song with a popular Broadway tune of the period.

I think the jokes from this original craze almost have to be examples of superiority.  The Knocker is confident that the Interlocutor will be surprised when the punchline is presented under the usual conditions of the joke.  This is not to deny that, if the joke were interrupted and the Interlocutor were asked to predict the punchline after the manner of Professor McGraw’s students, the Interlocutor might be able to do so.  When the punchline is presented, the Interlocutor will join the Knocker in his or her satisfaction at being part of the relatively elite portion of the population who recognize current Broadway hits when they hear them.

As knock-knock jokes have become more familiar over the decades, meta-knock-knock jokes have gained a following.  For example, a person named Alice might play the Knocker in this joke:

K: Knock knock!

I: Who’s there?

K: Alice.

I: Alice who?

K: Alice (in a tone suggesting that she is wounded that the Interlocutor doesn’t recognize her)

The meta-knock-knock joke suggests superiority to the genre of knock-knock jokes.  If first-order knock-knock jokes are popular among seven-year-olds of all ages, meta-knock-knock jokes are popular among eight-year-olds of all ages, suggesting superiority to those who still persist in telling first-order knock-knock jokes.

The world’s most hated knock-knock joke is this meta-knock-knock:

K: Knock, knock.
I: Who’s there?
K: Banana.
I: Banana who?
K: Knock, knock.
I: Who’s there?
K: Banana.
I: Banana who?
K: Knock, knock.
I: Who’s there?
K: Orange.
I: Orange who?
K: ORANGE YOU GLAD I DIDN’T SAY BANANA!

This joke attacks several parts of the shared understanding between Knocker and Interlocutor.  The joke is more than five lines long; the fifth line does not take the form of the original unsatisfactory response plus one or more additional syllables; the Knocker expects the Interlocutor to repeat his or her two lines multiple times; and the punchline does not include a repetition of the original unsatisfactory response.  For the experienced Interlocutor, these attacks are an undue imposition on the Knocker-Interlocutor relationship.  For anyone else, the whole thing would be utterly pointless.

Hated as the joke is, Knockers of a particular sort, mostly eight-year-old boys, seem unable to resist it.  Willing Interlocutors can rely on these boys to laugh uproariously every time they drag them through the ritual responses.  Here too, Superiority Theory seems to be the only explanation both for the boys’ laughter and for the strain that tolerating the joke puts on the Interlocutors.  The Knockers who enjoy the joke laugh at their own power to inflict it on their Interlocutors.

Each time a potential Interlocutor is confronted with “Orange you glad I didn’t say banana,” the joke gets a bit more annoying.  Perhaps this is because of an aspect of politeness recently referenced on yet another of my favorite sites, Language Log.  There it was mentioned that Penelope Brown and Stephen Levinson, founders of “Politeness Theory,” have provided conceptual tools for distinguishing between two kinds of situations: those in which a statement offering information the hearer should already have suggests that the hearer does not know it, and thereby offends the hearer, and those in which the statement carries no such suggestion and therefore gives no offense.  A joke with a painfully obvious punchline may fall in the first category, as do the reiterated responses in “Orange you glad I didn’t say banana.”  Casual remarks about the weather and other forms of small talk usually fall in the second category, as do formalized utterances generally.

Meeting Alison

I wish I’d been in New Zealand to see brilliant young cartoonist Sarah E. Laing meet her hero, comics titan Alison Bechdel.  Extraordinary to compare the photo below with the comic Ms Laing is giving Ms Bechdel; powers of prophecy, that one has.

LET ME BE FRANK

This is the trailer for The Curioseum – what do you think? I was pretty excited to see all my illustrations animated, albeit in a low-fi way.

Also, I gave Alison Bechdel my Metro comic, the one with her in it.


She seemed pretty excited to see herself there, and she accepted my bundle of Let Me Be Frank comics, but I think I scared her with my enthusiasm. When I saw her at the NZ comics panel the next day she looked a little guarded and apologised for not reading my comics yet. I told her it was ok – she could throw them in the recycling if she liked – but I hope she doesn’t. I felt a bit sorry for her – she probably has half-crazed cartoonists foisting comics on her all the time and she’s too nice a person to tell us to piss off. We…


The Atlantic, April 2014

In her cover story about trends in parenting styles in the US and Britain, Hanna Rosin tells several charming anecdotes contrasting her mother’s approach to raising her some years ago with her own approach to raising her daughter today.  Ms Rosin follows up with data showing that her mother’s relatively laissez-faire methods were typical of Americans in the 1970s and 1980s, while her own much more intensive style of supervision is typical of the early 21st century.  Statistics do not show that the newer approach has led to any improvement in the safety of children, and in fact they support claims that such close supervision harms children in a number of ways.  Here are a couple of paragraphs from the heart of Ms Rosin’s article:

I used to puzzle over a particular statistic that routinely comes up in articles about time use: even though women work vastly more hours now than they did in the 1970s, mothers—and fathers—of all income levels spend much more time with their children than they used to. This seemed impossible to me until recently, when I began to think about my own life. My mother didn’t work all that much when I was younger, but she didn’t spend vast amounts of time with me, either. She didn’t arrange my playdates or drive me to swimming lessons or introduce me to cool music she liked. On weekdays after school she just expected me to show up for dinner; on weekends I barely saw her at all. I, on the other hand, might easily spend every waking Saturday hour with one if not all three of my children, taking one to a soccer game, the second to a theater program, the third to a friend’s house, or just hanging out with them at home. When my daughter was about 10, my husband suddenly realized that in her whole life, she had probably not spent more than 10 minutes unsupervised by an adult. Not 10 minutes in 10 years.

It’s hard to absorb how much childhood norms have shifted in just one generation. Actions that would have been considered paranoid in the ’70s—walking third-graders to school, forbidding your kid to play ball in the street, going down the slide with your child in your lap—are now routine. In fact, they are the markers of good, responsible parenting. One very thorough study of “children’s independent mobility,” conducted in urban, suburban, and rural neighborhoods in the U.K., shows that in 1971, 80 percent of third-graders walked to school alone. By 1990, that measure had dropped to 9 percent, and now it’s even lower. When you ask parents why they are more protective than their parents were, they might answer that the world is more dangerous than it was when they were growing up. But this isn’t true, or at least not in the way that we think. For example, parents now routinely tell their children never to talk to strangers, even though all available evidence suggests that children have about the same (very slim) chance of being abducted by a stranger as they did a generation ago. Maybe the real question is, how did these fears come to have such a hold over us? And what have our children lost—and gained—as we’ve succumbed to them?

Also in this issue, several authors are asked to name the best fictional character of all time.  Children’s author R. L. Stine convinced me:

Aside from being amiable, Mickey Mouse has no discernible personality of any kind, yet he has captivated the world, appeared in hundreds of films, and sold billions of dollars’ worth of merchandise. Has any other fictional character held sway over so many countries for so long?

To build an empire like that of Disney on the basis of “no discernible personality of any kind” is indeed an achievement I would have thought impossible had it not actually been done.

Michael O’Donnell reviews some recent work on the passage of the 1964 Civil Rights Act, and seems mystified at the reluctance of some writers to give President Lyndon Johnson his due in that process.

Robert D. Kaplan seems to be less prominent than he was before the 2003 Iraq War; he may be the only person in the USA whose career took a hit for supporting the war.  Not that he is backing down; his piece in this issue is called “In Defense of Empire.”  I suppose we have to salute him for his willingness to stick by his principles.

At any rate, Mr Kaplan’s argument exhibits some of the same bizarre weaknesses in reasoning that underpinned so much of the rhetoric he and his fellow warhawks deployed in favor of invading Iraq to topple Saddam Hussein.  As he and others habitually did in those days, Mr Kaplan makes a generalization and flatly refuses to analyze it, insisting on applying his glossy abstractions in several senses at once.  So, Mr Kaplan tells us in this piece that empires are more likely than homogeneous nation-states or loose confederations to “protect minorities,” but that dysfunctional empires sometimes fail in their mission to “protect minorities.”

Now one need not be an expert in such things to realize that a statement like “empires protect minorities” needs some unpacking.  Sometimes an imperial power will align itself with an unpopular minority group, promoting the interests of that group and to some extent governing through it.  The minority’s unpopularity makes it dependent on the imperial power for protection, and therefore more likely than the majority to collaborate with whatever schemes that power may put forward.  That very collaboration exacerbates the minority’s unpopularity and vulnerability.  And of course there are many other ways in which imperial powers divide and rule their subjects, many of which involve favoring minorities as against majorities.  A sober examination of these methods might leave some people willing to tolerate imperialism from time to time, but it would hardly be likely by itself to constitute a case “In Defense of Empire.”

Derek Thompson explains “How National Basketball Association Teams Fool Themselves Into Betting Too Much on the Draft.”  Mr Thompson’s explanation identifies fallacies that distort decision-making in non-sports-related organizations as well:

In most professional sports leagues, including the NBA, the worst teams are first in line to snag the most-promising amateur players in the next draft. When the ripening crop of amateurs looks especially tantalizing (this year’s is projected to be historically good), multiple teams will suddenly compete to be so uncompetitive that, through sheer awfulness, they will be blessed to inherit the top pick. One anonymous general manager told ESPN the Magazine earlier this season, “Our team isn’t good enough to win,” so the best thing is “to lose a lot.”

In a way, there is a dark genius behind the tanking epidemic. In what other industry could you persuade your customers to root for the worst possible product? But tanking puzzles academics like David Berri, the author of the 2006 book The Wages of Wins and a widely read commentator on sports economics. “Tanking simply does not work,” he told me. Nearly 30 years of data tell a crystal-clear story: a truly awful team has never once metamorphosed into a championship squad through the draft. The last team to draft No. 1 and then win a championship (at any point thereafter) was the San Antonio Spurs, which lucked into the pick (Tim Duncan) back in 1997 when the team’s star center, David Robinson, missed all but six games the previous season because of injuries. The teams with the top three picks in any given draft are almost twice as likely to never make the playoffs within four years—the term of an NBA rookie contract, before the player reaches free agency—as they are to make it past the second round.

Why are teams and their fans drawn to a strategy that reliably leads to even deeper failure? The gospel of tanking is born from three big assumptions: that mediocrity is a trap; that scouting is a science; and that bad organizations are one savior away from being great. All three assumptions are common, not only to sports, but also to business and to life. And all three assumptions are typically wrong.

All three of these ideas seem to spring from an addiction to a messianic view of life, in which the best things can come only to those who have suffered the worst things (so, never to the merely mediocre, but perhaps to those who lose every game for months); in which there exists a true path to greatness that will be revealed to those who seek it by the right means (so, the fetishization of science, including the anointing of such obviously non-scientific pursuits as basketball scouting as sciences); and in which a charismatic figure is destined to come to the lowly in their darkest hour and to lead them on that true path (so, sacrificing a whole season of potentially competitive play in the hopes of attracting such a savior).  For all I know, messianism may reflect a cosmic truth, as Christians and others say that it does, but it certainly does seem misplaced in the world of professional basketball.

Jenny Xie writes about a graphic designer named Nikki Sylianteng, who received many parking tickets because she was confused by the famously complex street signs that are supposed to tell New York City’s residents where they may and may not leave their cars.  Ms Sylianteng designed some street signs according to a simpler scheme.  She tacked her signs up next to city signs giving the same information and invited the public to tell her what they thought of them.  Here’s Ms Sylianteng’s website.

Barbara Ehrenreich has written a book called Living With a Wild God.  In it, Ms Ehrenreich mentions a strange psychological break she experienced in her youth.  She was walking by herself in a desert town when all of a sudden she was transported by a wave of ecstasy and the world seemed to be a radically different place.  Ms Ehrenreich has no idea what that was all about.  Though she recognizes the feeling in descriptions that talented religious persons give of their mystical experiences, Ms Ehrenreich is herself quite sure that whatever happened to her was entirely of this world.  In a brief notice of the book in this issue, Ann Hulbert summarizes this story and quotes a remark of Ms Ehrenreich’s:

The young Barbara had been keeping a hyper-articulate journal as she puzzled over the meaning of life, but she found no coherent words for the predawn blazing onrush of … what? Was she crazy? God wasn’t in her vocabulary. In the years that followed, Ehrenreich the biology grad student, social activist, journalist, and brilliant cultural critic and historian was struck dumb, too.

Now she has come up with the words, and I’m tempted to credit Ehrenreich with managing a miracle. But she resolutely avoids rhetoric in that “blubbery vein”—which is why her book is such a rare feat. “As a rationalist, an atheist, a scientist by training,” she struggles to make sense of the epiphany without recourse to the “verbal hand-wavings about ‘mystery’ and ‘transcendence’ ” that go with the territory. There was nothing peaceful or passive about the ecstatic state that seized her: “It was a furious encounter with a living substance that was coming at me through all things at once.” There is nothing pious about her reckoning with her past self, and with “a palpable Other, or Others.” Ehrenreich has no interest in conversion: “I believe nothing. Belief is intellectual surrender.” She wants, and inspires, open minds.

I don’t know whether Ms Hulbert has quoted Ms Ehrenreich fairly, but if she has, I am surprised.  “Belief is intellectual surrender.”  So it is.  That’s the point: believers call for surrendering oneself altogether to the supernatural, which in the case of monotheistic religions means surrender to God.  Therefore, the challenge is to prove that intellectual surrender is bad, not to prove that belief is intellectual surrender.  Ms Ehrenreich is one of America’s foremost public intellectuals, and so I suspect she knows that, and that Ms Hulbert’s quotation was cut short by limitations of space.