Worlds in Collision

There have been several interesting items in recent issues of The Nation.

Reviewing John Judis’ Truman, American Jews, and the Origins of the Arab/Israeli Conflict, Bernard Avishai argues that President Harry S Truman had far fewer options in formulating policy towards events in and around Mandatory Palestine than Mr Judis claims.  Mr Avishai’s closing sentences are worth quoting:

Understanding Israel’s founding in 1948 as a necessary event with tragic consequences, and not as a presidential mistake forced by political pressure, will not make Obama less wary of AIPAC or his relationship with Netanyahu less tortured. But it could make his tact more obviously noble.

“Tact” may itself be an extraordinarily tactful choice of words to characterize Mr O’s relationship with Israel and the Americans who support the Israeli right-wing, but I would say that “necessary event with tragic consequences” is usually an accurate description of major occurrences in world history.  There may be some agent or other who was at some point in a position to alter the course of events, but that point may have passed long before anyone realized the significance of what was going on.  Certainly by the time President Truman took office, the establishment of a Jewish state in Palestine was beyond the power of any US president to prevent, even assuming any US president were to be so heedless of public opinion as to want to prevent it.  The fact that President Truman so thoroughly convinced himself of the contrary as to announce to the faculty of the Jewish Theological Seminary in 1953 that “I am Cyrus” serves to remind us that the extreme self-confidence that men need if they are to rise to high political office often leaves them vulnerable to the most absurd self-deceptions.  Not that politicians have a monopoly on self-deception; Mr Avishai mentions Wolfgang Schivelbusch’s The Culture of Defeat, a book which shows how little relationship the commonly accepted opinions on all sides in the USA have to any facts concerning their country’s Civil War of 1861-1865.

A book about Immanuel Velikovsky prompts Paula Findlen to write an engaging essay about Velikovsky’s career and her own youthful enthusiasm for his work.  For my part, I wonder whether Velikovsky’s eccentric theories about comets and colliding heavenly bodies set science back significantly.  Scientists are now comfortable talking about impacts that led to the formation of the Moon, triggered mass extinctions, and so on, but in the 1970s, when Velikovsky’s work was in vogue, they were noticeably reluctant to consider such theories, perhaps for fear of being mistaken for Velikovskyans.

In September 2000, Kurt Vonnegut gave a speech in which he spoke ill of Thomas Jefferson, and explained why he had the right to do so.  I speak ill of Thomas Jefferson myself quite frequently.  I often read Jefferson’s deplorable works and study his deplorable acts, the better to deplore them, and my education advances in proportion to the amount of time I spend in his deplorable presence in this way.

In a recent issue, Richard Kim expressed exasperation with social conservatives who are concerned that the declining popularity of their views on sex in general and on gender-neutral marriage in particular has destined them for marginalization.  Mr Kim points out that social conservatives still wield a great deal of power in the USA and that American courts have been quite deferential to religious liberty concerns.  The magazine rather undercuts Mr Kim’s point by running his piece under the headline “The Bigot’s Lament” and giving it a subhed saying that “the religious right nurses its persecution complex.”  If people are going to label you a bigot and dismiss your concerns as symptoms of a “persecution complex,” you are probably right to worry that you are being pushed to the margins.  Rod Dreher wrote a series of posts on his blog at The American Conservative a few weeks ago in which he speculated that in the future, people who share his belief that homosexual relationships are not the same kind of thing as heterosexual relationships may have to keep that belief a secret or face loss of employment and public humiliation, much as same-sexers have long had to keep their sexuality secret in order to avoid the same penalties.  Responding to a critique from Andrew Sullivan, Mr Dreher wrote:

This line from Andrew is particularly rich:

In the end, one begins to wonder about the strength of these people’s religious convictions if they are so afraid to voice them, and need the state to reinforce them.

This is the crux of the problem. Let’s restate this: “One begins to wonder about the strength of the love of gay couples if they are so afraid to come out of the closet, and need the state to protect them.”

How does that sound? To me, it sounds smug and naive and unfeeling, even cruel, about the reality of gay people’s lives. If they aren’t willing to martyr themselves, then they must not really love each other, right? And hey, if they need the state to protect them from a wedding photographer who won’t take their photos, how much do they really love each other?

You see my point.

I am glad we don’t live in that world anymore. We don’t live in that world anymore because people like Andrew insisted that gay lives had more dignity than the majority of Americans believed. Again, they did us all a favor by awakening us morally to what it is like to live in a country where what matters the most to you is treated in custom and in law as anathema.

I do think there is a realistic chance that in a decade or two it will be a career-killer virtually everywhere in the USA to profess religious beliefs that disapprove of same-sex sex and elevate opposite-sex sex to privileged status in the moral order.  I’m not entirely opposed to this happening; I think such beliefs are wrong, and the sooner they are consigned to the status of exhibits in a museum of discredited ideas the better off everyone will be.  On the other hand, while antigay beliefs may be losing popularity in the USA and other rich countries, and also in regions like Latin America that make a point of reminding the world of their affinities with the rich countries, they are far from dying out altogether.  That means that we can expect a sizable minority of closeted antigays to persist in the USA for quite some time to come.  And outside the rich countries, especially in Africa and the Muslim world, hostility to same-sexers is certainly not fading.  If immigration from these regions to the USA rises in the years to come, as it seems likely to do, a strong stigma against beliefs that oppose same-sex sex may lead to bitter confrontations and harsh stands on both sides.  An American counterpart to the late Pim Fortuyn may not be an impossibility for long.

These are concerns for tomorrow. The day after tomorrow, it is possible that a new stigma may attach itself to same-sexers, the stigma of membership in a genetically unmodified lower class.  In that case, it might be desirable that the period leading up to the shift should reinforce norms of mutual respect and fair play, rather than aggression and triumphalism.  Or it might not be; perhaps the collision with the new world will blot out whatever habits we may have  cultivated in the old one.  Assuming, of course, that there is enough of a genetic contribution to the physical basis of homosexual attraction for genetic modification to bring this particular collision about in the first place.

WEIRD laughter

Recently, several websites I follow have posted remarks about theories that are meant to explain why some things strike people as funny.

Zach Weinersmith, creator of Saturday Morning Breakfast Cereal, wrote an essay called “An artificial One-Liner Generator” in which he advanced a tentative theory of humor as problem-solving.

Slate is running a series of articles on theoretical questions regarding things that make people laugh.  The first piece, called “What Makes Something Funny,” gives a lot of credit to a researcher named Peter McGraw, who is among the pioneers of “Benign Violation Theory.”  This is perhaps unsurprising, since Professor McGraw and his collaborator Joel Warner are credited as the authors of the piece.  Professor McGraw and Mr Warner summarize earlier theories of humor thus:

Plato and Aristotle introduced the superiority theory, the idea that people laugh at the misfortune of others. Their premise seems to explain teasing and slapstick, but it doesn’t work well for knock-knock jokes. Sigmund Freud argued for his relief theory, the concept that humor is a way for people to release psychological tension, overcome their inhibitions, and reveal their suppressed fears and desires. His theory works well for dirty jokes, less well for (most) puns.

The majority of humor experts today subscribe to some variation of the incongruity theory, the idea that humor arises when there’s an inconsistency between what people expect to happen and what actually happens.

Professor McGraw and Mr Warner claim that incongruity theory does not stand up well to empirical testing:

Incongruity has a lot going for it—jokes with punch lines, for example, fit well. But scientists have found that in comedy, unexpectedness is overrated. In 1974, two University of Tennessee professors had undergraduates listen to a variety of Bill Cosby and Phyllis Diller routines. Before each punch line, the researchers stopped the tape and asked the students to predict what was coming next, as a measure of the jokes’ predictability. Then another group of students was asked to rate the funniness of each of the comedians’ jokes. The predictable punch lines turned out to be rated considerably funnier than those that were unexpected—the opposite of what you’d expect to happen according to incongruity theory.

To which one might reply that when Mr Cosby and Ms Diller actually performed their routines, they didn’t stop after the setup and ask the audience to predict the punchline.  Nor would any audience member who wanted to enjoy the show be likely to try to predict the punchline.  Doing so would make for an entirely different experience than the one the audience had paid for.

Be that as it may, Professor McGraw and Mr Warner go on to claim that their theory of “benign violation” is supported by empirical evidence:

Working with his collaborator Caleb Warren and building from a 1998 HUMOR article published by a linguist named Thomas Veatch, he hit upon the benign violation theory, the idea that humor arises when something seems wrong or threatening, but is simultaneously OK or safe.

After extolling some of the theory’s strengths, the authors go on:

Naturally, almost as soon as McGraw unveiled the benign violation theory, people began to challenge it, trying to come up with some zinger, gag, or “yo momma” joke that doesn’t fit the theory. But McGraw believes humor theorists have engaged in such thought experiments and rhetorical debates for too long. Instead, he’s turned to science, running his theory through the rigors of lab experimentation.

The results have been encouraging. In one [Humor Research Laboratory] experiment, a researcher approached subjects on campus and asked them to read a scenario based on a rumor about legendarily depraved Rolling Stones guitarist Keith Richards. In the story—which might or might not be true—Keith’s father tells his son to do whatever he wishes with his cremated remains—so when his father dies, Keith decides to snort them. Meanwhile the researcher (who didn’t know what the participants were reading) gauged their facial expressions as they perused the story. The subjects were then asked about their reactions to the stories. Did they find the story wrong, not wrong at all, a bit of both, or neither? As it turned out, those who found the tale simultaneously “wrong” (a violation) and “not wrong” (benign) were three times more likely to smile or laugh than either those who deemed the story either completely OK or utterly unacceptable.

In a related experiment, participants read a story about a church that was giving away a Hummer H2 to a lucky member of its congregation, and were then asked if they found it funny. Participants who were regular churchgoers found the idea of mixing the sanctity of Christianity with a four-wheeled symbol of secular excess significantly less humorous than people who rarely go to church. Those less committed to Christianity, in other words, were more likely to find a holy Hummer benign and therefore funnier.

Lately, social scientists in general have been more mindful than usual of the ways in which North American undergraduates are something other than a perfectly representative sample of the human race.  Joseph Henrich, Steven Heine, and Ara Norenzayan have gone so far as to ask, in the title of a widely cited paper, whether the populations most readily available for study by psychologists and other social scientists are in fact “The weirdest people in the world?”  In that paper, Professors Henrich, Heine, and Norenzayan use the acronym “WEIRD,” meaning Western, Educated, Industrialized, Rich, and Democratic.  Their abstract:

Behavioral scientists routinely publish broad claims about human psychology and behavior in the world’s top journals based on samples drawn entirely from Western, Educated, Industrialized, Rich, and Democratic (WEIRD) societies. Researchers – often implicitly – assume that either there is little variation across human populations, or that these “standard subjects” are as representative of the species as any other population. Are these assumptions justified? Here, our review of the comparative database from across the behavioral sciences suggests both that there is substantial variability in experimental results across populations and that WEIRD subjects are particularly unusual compared with the rest of the species – frequent outliers. The domains reviewed include visual perception, fairness, cooperation, spatial reasoning, categorization and inferential induction, moral reasoning, reasoning styles, self-concepts and related motivations, and the heritability of IQ. The findings suggest that members of WEIRD societies, including young children, are among the least representative populations one could find for generalizing about humans. Many of these findings involve domains that are associated with fundamental aspects of psychology, motivation, and behavior – hence, there are no obvious a priori grounds for claiming that a particular behavioral phenomenon is universal based on sampling from a single subpopulation. Overall, these empirical patterns suggests that we need to be less cavalier in addressing questions of human nature on the basis of data drawn from this particularly thin, and rather unusual, slice of humanity. We close by proposing ways to structurally re-organize the behavioral sciences to best tackle these challenges.

It would be particularly easy to see why a theory like Benign Violation would have a special appeal to undergraduates.  Undergraduate students are rewarded for learning to follow sets of rules, both the rules of the academic disciplines their teachers expect them to internalize and the rules of social behavior appropriate to people who, like most undergraduates, are living independent adult lives for the first time.  So, I suppose that if one wanted to defend Superiority Theory (as mentioned, for example, by Aristotle in his Poetics, 1449a 34-35), one would be able to use the same results, saying that students simultaneously saw themselves as superior both to the characters in the jokes who did not follow the usual rules and to those who would enforce those rules in too narrowly literalistic a fashion to fit the overall ethos of higher education, where innovation and flexibility are highly valued.  Here the WEIRD phenomenon comes into play as well, since cultures vary in their ideas of what rules are and of what relationship rules have to qualities like innovation and flexibility.  Moreover, one could also say that the judgment that a particular violation is or is not benign itself implies superiority over those involved in the violation, and that this implication of superiority is what generates laughter.

Also, because undergraduates are continually under pressure to internalize one set of rules after another, they often show anxiety related to sets of rules.  This may not be the sort of thing Sigmund Freud had in mind when he talked about Oedipal anxiety, but it certainly does drive undergraduates to seek relief.  Examples of action that is at once quite all right and by no means in accordance with the rules may well provide that relief.

Incongruity theorists may find comfort in Professor McGraw’s results, as well.  The very name “Benign Violation,” like experimental rubrics such as “wrong” and “not wrong,” is an incongruous combination by any definition.  So a defender of Incongruity Theory may claim Benign Violation as a subcategory of Incongruity Theory, and cite these results in support of that classification.

Professor McGraw is evidently aware of these limitations.  He and Mr Warner explain what they did to rise above them:

[T]hree years ago, he set off on an international exploration of the wide world of humor—with me, a Denver-based journalist, along for the ride to chronicle exactly what transpired. Our journey took us from Japan to the West Bank to the heart of the Amazon, in search of various zingers, wisecracks and punch lines that would help explain humor once and for all. The result is The Humor Code: A Global Search for What Makes Things Funny, to be published next week—on April Fool’s Day, naturally. As is often the case with good experiments—not to mention many of the funniest gags—not everything went exactly as planned, but we learned a lot about what makes the world laugh.

It isn’t April First yet, so I don’t know how well they have done in their efforts to expand their scope.

One sentence that struck me as wrong in Professor McGraw and Mr Warner’s piece was the one about Superiority Theory, that it “seems to explain teasing and slapstick, but it doesn’t work well for knock-knock jokes.”  I’m not at all sure about that one.  In a knock-knock joke, two hypothetical characters take turns delivering five lines of dialogue.  The first character to speak is the Knocker (whose first line is always “Knock-knock!”).  The second character to speak is the Interlocutor (whose first line is always “Who’s there?”).  The Knocker’s second line is an unsatisfactory answer to this question.  The Interlocutor’s second line begins by repeating this incomplete answer, then adds the question word “who?”  The Knocker’s third line then delivers the punchline, in the form of a repetition of the unsatisfactory answer followed by one or more additional syllables that change the apparent meaning of that answer.
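For readers who find a template easier to follow than a paragraph of description, here is a minimal sketch of that five-line form in Python.  The function and its names are my own illustration, not anything drawn from Professor McGraw’s work or from the folklore of the genre:

```python
def knock_knock(answer, punchline):
    """Print the five-line Knocker/Interlocutor exchange described above.

    `answer` is the Knocker's deliberately unsatisfactory reply to
    "Who's there?"; `punchline` repeats that reply and adds the extra
    syllables that change its apparent meaning.
    """
    exchange = [
        ("K", "Knock-knock!"),
        ("I", "Who's there?"),
        ("K", answer),
        ("I", answer + " who?"),
        ("K", punchline),
    ]
    for speaker, line in exchange:
        print(speaker + ": " + line)

# The first joke of the 1950s craze, as recounted below:
knock_knock("Sam and Janet", "Sam and Janet evening!")
```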

Knock-knock jokes became popular in the USA in the 1950s, as part of a national craze.  The first joke recorded in this mid-twentieth century craze, I have read, is the following:

K: Knock-knock!

I: Who’s there?

K: Sam and Janet.

I: Sam and Janet who?

K: Sam and Janet evening! (sung to the tune of “Some Enchanted Evening”)

Apparently all of the jokes that brought the form into such prominence in the 1950s that it is still beloved today by seven-year-olds of all ages took this shape, in which the punchline involved the Knocker bursting into song with a popular Broadway tune of the period.

I think the jokes from this original craze almost have to be examples of superiority.  The Knocker is confident that, under the usual conditions of the joke, the Interlocutor will be surprised when the punchline is presented.  This is not to deny that, if the joke were interrupted and the Interlocutor were asked to predict the punchline after the manner of Professor McGraw’s students, the Interlocutor might be able to do so.  When the punchline is presented, the Interlocutor joins the Knocker in the satisfaction of belonging to the relatively elite portion of the population who recognize current Broadway hits when they hear them.

As knock-knock jokes have become more familiar over the decades, meta-knock-knock jokes have gained a following.  For example, a person named Alice might play the Knocker in this joke:

K: Knock knock!

I: Who’s there?

K: Alice.

I: Alice who?

K: Alice (in a tone suggesting that she is wounded that the Interlocutor doesn’t recognize her)

The meta-knock-knock joke suggests superiority to the genre of knock-knock jokes.  If first-order knock-knock jokes are popular among seven-year-olds of all ages, meta-knock-knock jokes are popular among eight-year-olds of all ages, suggesting superiority to those who still persist in telling first-order knock-knock jokes.

The world’s most hated knock-knock joke is this meta-knock-knock:

K: Knock, knock.
I: Who’s there?
K: Banana.
I: Banana who?
K: Knock, knock.
I: Who’s there?
K: Banana.
I: Banana who?
K: Knock, knock.
I: Who’s there?
K: Orange.
I: Orange who?
K: ORANGE YOU GLAD I DIDN’T SAY BANANA!

This joke attacks several parts of the shared understanding between Knocker and Interlocutor.  The joke is more than five lines long; the fifth line does not take the form of the original unsatisfactory response plus an additional syllable or syllables; the Knocker expects the Interlocutor to repeat his or her two lines multiple times; and the punchline does not include a repetition of the original unsatisfactory response.  For the experienced Interlocutor, these attacks are an undue imposition on the Knocker-Interlocutor relationship.  For anyone else, the whole thing would be utterly pointless.

Hated as the joke is, Knockers of a particular sort, mostly eight-year-old boys, seem unable to resist it.  Willing Interlocutors can rely on these boys to laugh uproariously every time the boys drag them through the ritual responses.  Here too, Superiority Theory seems to be the only explanation both for the boys’ laughter and for the strain that tolerating the joke puts on the Interlocutors.  The Knockers who enjoy the joke laugh at their own power to inflict it on their Interlocutors.

Each time a potential Interlocutor is confronted with “Orange you glad I didn’t say banana,” the joke gets a bit more annoying.  Perhaps this is because of an aspect of politeness recently referenced on yet another of my favorite sites, Language Log.  There it was mentioned that Penelope Brown and Stephen Levinson, founders of “Politeness Theory,” have provided conceptual tools for distinguishing between two kinds of situation: those in which a statement offering information the hearer should already have implies that the hearer does not know it, and thereby offends the hearer, and those in which no such implication arises and no offense is given.  A joke with a painfully obvious punchline may fall in the first category, as do the reiterated responses in “Orange you glad I didn’t say banana.”  Casual remarks about the weather and other forms of small talk usually fall in the second category, as do formalized utterances generally.

Meeting Alison

I wish I’d been in New Zealand to see brilliant young cartoonist Sarah E. Laing meet her hero, comics titan Alison Bechdel.  Extraordinary to compare the photo below with the comic Ms Laing is giving Ms Bechdel; powers of prophecy, that one has.

LET ME BE FRANK

This is the trailer for The Curioseum – what do you think? I was pretty excited to see all my illustrations animated, albeit in a low-fi way.

Also, I gave Alison Bechdel my Metro comic, the one with her in it.

[photo]

She seemed pretty excited to see herself there, and she accepted my bundle of Let Me Be Frank comics, but I think I scared her with my enthusiasm. When I saw her at the NZ comics panel the next day she looked a little guarded and apologised for not reading my comics yet. I told her it was ok – she could throw them in the recycling if she liked – but I hope she doesn’t. I felt a bit sorry for her – she probably has half-crazed cartoonists foisting comics on her all the time and she’s too nice a person to tell us to piss off. We…


The Atlantic, April 2014

In her cover story about trends in parenting styles in the US and Britain, Hanna Rosin tells several charming anecdotes contrasting her mother’s approach to raising her some years ago with her own approach to raising her daughter today.  Ms Rosin follows up with data showing that her mother’s relatively laissez-faire methods were typical of Americans in the 1970s and 1980s, while her own much more intensive style of supervision is typical of the early 21st century.  Statistics do not show that the newer approach has led to any improvement in the safety of children, and in fact they support claims that such close supervision harms children in a number of ways.  Here are a couple of paragraphs from the heart of Ms Rosin’s article:

I used to puzzle over a particular statistic that routinely comes up in articles about time use: even though women work vastly more hours now than they did in the 1970s, mothers—and fathers—of all income levels spend much more time with their children than they used to. This seemed impossible to me until recently, when I began to think about my own life. My mother didn’t work all that much when I was younger, but she didn’t spend vast amounts of time with me, either. She didn’t arrange my playdates or drive me to swimming lessons or introduce me to cool music she liked. On weekdays after school she just expected me to show up for dinner; on weekends I barely saw her at all. I, on the other hand, might easily spend every waking Saturday hour with one if not all three of my children, taking one to a soccer game, the second to a theater program, the third to a friend’s house, or just hanging out with them at home. When my daughter was about 10, my husband suddenly realized that in her whole life, she had probably not spent more than 10 minutes unsupervised by an adult. Not 10 minutes in 10 years.

It’s hard to absorb how much childhood norms have shifted in just one generation. Actions that would have been considered paranoid in the ’70s—walking third-graders to school, forbidding your kid to play ball in the street, going down the slide with your child in your lap—are now routine. In fact, they are the markers of good, responsible parenting. One very thorough study of “children’s independent mobility,” conducted in urban, suburban, and rural neighborhoods in the U.K., shows that in 1971, 80 percent of third-graders walked to school alone. By 1990, that measure had dropped to 9 percent, and now it’s even lower. When you ask parents why they are more protective than their parents were, they might answer that the world is more dangerous than it was when they were growing up. But this isn’t true, or at least not in the way that we think. For example, parents now routinely tell their children never to talk to strangers, even though all available evidence suggests that children have about the same (very slim) chance of being abducted by a stranger as they did a generation ago. Maybe the real question is, how did these fears come to have such a hold over us? And what have our children lost—and gained—as we’ve succumbed to them?

Also in this issue, several authors are asked to name the best fictional character of all time.  Children’s author R. L. Stine convinced me:

Aside from being amiable, Mickey Mouse has no discernible personality of any kind, yet he has captivated the world, appeared in hundreds of films, and sold billions of dollars’ worth of merchandise. Has any other fictional character held sway over so many countries for so long?

To build an empire like that of Disney on the basis of “no discernible personality of any kind” is indeed an achievement I would have thought impossible had it not actually been done.

Michael O’Donnell reviews some recent work on the passage of the 1964 Civil Rights Act, and seems mystified at the reluctance of some writers to give President Lyndon Johnson his due in that process.

Robert D. Kaplan seems to be less prominent than he was before the 2003 Iraq War; he may be the only person in the USA whose career took a hit for supporting the war.  Not that he is backing down; his piece in this issue is called “In Defense of Empire.”  I suppose we have to salute him for his willingness to stick by his principles.

At any rate, Mr Kaplan’s argument exhibits some of the same bizarre weaknesses in reasoning that underpinned so much of the rhetoric he and his fellow warhawks deployed in favor of invading Iraq to topple Saddam Hussein.  As he and others habitually did in those days, Mr Kaplan makes a generalization and flatly refuses to analyze it, insisting on applying his glossy abstractions in several senses at once.  So, Mr Kaplan tells us in this piece that empires are more likely than homogeneous nation-states or loose confederations to “protect minorities,” but that dysfunctional empires sometimes fail in their mission to “protect minorities.”

Now one need not be an expert in such things to realize that a statement like “empires protect minorities” needs some unpacking.  Sometimes an imperial power will align itself with an unpopular minority group, promoting the interests of that group and to some extent governing through it.  The minority’s unpopularity makes it dependent on the imperial power for protection, and therefore more likely than the majority to collaborate with whatever schemes that power may put forward.  That very collaboration exacerbates the minority’s unpopularity and vulnerability.  And of course there are many other ways in which imperial powers divide and rule their subjects, many of which involve favoring minorities as against majorities.  A sober examination of these methods might leave some people willing to tolerate imperialism from time to time, but it would hardly be likely by itself to constitute a case “In Defense of Empire.”

Derek Thompson explains “How National Basketball Association Teams Fool Themselves Into Betting Too Much on the Draft.”  Mr Thompson’s explanation identifies fallacies that distort decision-making in non-sports-related organizations as well:

In most professional sports leagues, including the NBA, the worst teams are first in line to snag the most-promising amateur players in the next draft. When the ripening crop of amateurs looks especially tantalizing (this year’s is projected to be historically good), multiple teams will suddenly compete to be so uncompetitive that, through sheer awfulness, they will be blessed to inherit the top pick. One anonymous general manager told ESPN the Magazine earlier this season, “Our team isn’t good enough to win,” so the best thing is “to lose a lot.”

In a way, there is a dark genius behind the tanking epidemic. In what other industry could you persuade your customers to root for the worst possible product? But tanking puzzles academics like David Berri, the author of the 2006 book The Wages of Wins and a widely read commentator on sports economics. “Tanking simply does not work,” he told me. Nearly 30 years of data tell a crystal-clear story: a truly awful team has never once metamorphosed into a championship squad through the draft. The last team to draft No. 1 and then win a championship (at any point thereafter) was the San Antonio Spurs, which lucked into the pick (Tim Duncan) back in 1997 when the team’s star center, David Robinson, missed all but six games the previous season because of injuries. The teams with the top three picks in any given draft are almost twice as likely to never make the playoffs within four years—the term of an NBA rookie contract, before the player reaches free agency—as they are to make it past the second round.

Why are teams and their fans drawn to a strategy that reliably leads to even deeper failure? The gospel of tanking is born from three big assumptions: that mediocrity is a trap; that scouting is a science; and that bad organizations are one savior away from being great. All three assumptions are common, not only to sports, but also to business and to life. And all three assumptions are typically wrong.

All three of these ideas seem to spring from an addiction to a messianic view of life, in which the best things can come only to those who have suffered the worst things (so, never to the merely mediocre, but perhaps to those who lose every game for months); in which there exists a true path to greatness that will be revealed to those who seek it by the right means (so, the fetishization of science, including the anointing of such obviously non-scientific pursuits as basketball scouting as sciences); and in which a charismatic figure is destined to come to the lowly in their darkest hour and lead them on that true path (so, the sacrifice of a whole season of potentially competitive play in the hope of attracting such a savior).  For all I know, messianism may reflect a cosmic truth, as Christians and others say that it does, but it certainly does seem misplaced in the world of professional basketball.

Jenny Xie writes about a graphic designer named Nikki Sylianteng, who received many parking tickets because she was confused by the famously complex street signs that are supposed to tell New York City’s residents where they may and may not leave their cars.  Ms Sylianteng designed some street signs according to a simpler scheme.  She tacked her signs up next to city signs giving the same information and invited the public to tell her what they thought of them.  Here’s Ms Sylianteng’s website.

Barbara Ehrenreich has written a book called Living With a Wild God.  In it, Ms Ehrenreich mentions a strange psychological break she experienced in her youth.  She was walking by herself in a desert town when, all of a sudden, she was transported by a wave of ecstasy and the world seemed to become a radically different place.  Ms Ehrenreich has no idea what that was all about.  Though she recognizes the feeling in descriptions that talented religious persons give of their mystical experiences, Ms Ehrenreich is herself quite sure that whatever happened to her was entirely of this world.  In a brief notice of the book in this issue, Ann Hulbert summarizes this story and quotes a remark of Ms Ehrenreich’s:

The young Barbara had been keeping a hyper-articulate journal as she puzzled over the meaning of life, but she found no coherent words for the predawn blazing onrush of … what? Was she crazy? God wasn’t in her vocabulary. In the years that followed, Ehrenreich the biology grad student, social activist, journalist, and brilliant cultural critic and historian was struck dumb, too.

Now she has come up with the words, and I’m tempted to credit Ehrenreich with managing a miracle. But she resolutely avoids rhetoric in that “blubbery vein”—which is why her book is such a rare feat. “As a rationalist, an atheist, a scientist by training,” she struggles to make sense of the epiphany without recourse to the “verbal hand-wavings about ‘mystery’ and ‘transcendence’ ” that go with the territory. There was nothing peaceful or passive about the ecstatic state that seized her: “It was a furious encounter with a living substance that was coming at me through all things at once.” There is nothing pious about her reckoning with her past self, and with “a palpable Other, or Others.” Ehrenreich has no interest in conversion: “I believe nothing. Belief is intellectual surrender.” She wants, and inspires, open minds.

I don’t know whether Ms Hulbert has quoted Ms Ehrenreich fairly, but if she has, I am surprised.  “Belief is intellectual surrender.”  So it is.  That’s the point: believers call for surrendering oneself altogether to the supernatural, and in the case of the monotheistic religions to God.  Therefore, the challenge is to prove that intellectual surrender is bad, not to prove that belief is intellectual surrender.  Ms Ehrenreich is one of America’s foremost public intellectuals, and so I suspect she knows that, and that Ms Hulbert’s quotation was cut short by limitations of space.

Unreal city

The other day a man named Fred Phelps died.  He died alone, and no funeral is planned for him.  Unlike most who exit life under these circumstances, Mr Phelps was a well-known public figure.  A disbarred lawyer, Mr Phelps declared his family to be a church and himself to be its pastor.  For many years, the Phelpses have traveled the USA, forming picket lines outside funerals and other events, carrying signs with such lovely slogans as “God Hates America,” “God Hates Fags,” and “Thank God for Dead Soldiers.”  The Phelpses don’t conduct funerals for their own dead, and they had evidently disowned their patriarch before his death.

I saw the Phelpses in action twice.  Once, they were picketing a military funeral, celebrating the death of a 19-year-old soldier.  A few years later, they were picketing the funeral of a 16-year-old girl who had taken her own life.

So I have first-hand evidence that there was such a person as Mr Phelps.  But I can’t entirely get over my disbelief that he could possibly have existed.  What could possibly possess anyone to devote his life, and to influence his children to devote their lives, to such a perverse endeavor?  I’m not a psychologist, so I don’t know whether mental illness would be a viable explanation.  And I’m not a theologian, so I don’t know if it would be appropriate to attribute Mr Phelps’ behavior to demonic possession.

I can sympathize with those who suspect that Mr Phelps was some sort of agent provocateur serving the interests of those who wanted to discredit the causes he ostensibly supported.  Certainly the level and form of publicity he attracted show that he made himself remarkably useful to precisely the people he targeted with his message of hate, and a conspiracy in which he was deliberately acting to promote equal rights for same-sexers by tarring the opposition with the brush of sheer lunacy would at least be intelligible, in a way that a world in which someone could sincerely believe that what Mr Phelps was doing was a worthwhile way to spend a life would not be.  Having seen the Phelpses up close, though, I can’t accept so simple a theory.  They struck me as something radically alien and radically hostile to the life of the world.  Any explanation of them, I suspect, would have to be not only complex, but also a challenge to our usual assumptions.