Dave Brubeck died today, I’m sad to say. I’ve always had a soft spot for him. A few numbers I’d mention are his versions of “Linus and Lucy,” “These Foolish Things,” “The Way You Look Tonight,” and “Le Souk.” I shook his hand once, after a concert in 1987.
Posted by acilius on December 5, 2012
The ancient Roman calendar gave special names to two days in each month: the Kalendae (in English, the “Calends”), which was the first day of the month; and the Idus (in English, the “Ides”), which was the fifteenth day of March, May, July, and October, but the thirteenth day of every other month. Other days were specified by counting the days until the next Calends or Ides. So, the last day of April was pridie Kalendas Maias, the first day before the Calends of May. There was some special significance to what came to be called the Nonae (in English, the “Nones”), that is to say, the ninth day before the Ides. So, in March, May, July, and October, the Nones would fall on the seventh day of the month, and in other months they would fall on the fifth day. So today, being the fifth, is the Nonae Decembris. As far as the formal language of law and religion was concerned, this arrangement around the Calends and the Ides constituted the whole internal structure of the month. The Romans did experiment with various forms of the week, most notably an eight-day week that determined when markets would be held. Undoubtedly these sequences of days would also have influenced the Romans’ perceptions of time, even if they were not regularly integrated into the official calendar.
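The counting rules above are mechanical enough to sketch in a few lines of code. This is a minimal illustration rather than a scholarly tool: it ignores leap years and the intercalation complications of Februarius, uses Arabic rather than Roman numerals, and omits the month adjectives (the `roman_day` name is my own invention):

```python
def roman_day(month, day):
    """Name a day of the month in the Roman style.

    The Ides fall on the 15th in March, May, July, and October,
    and on the 13th elsewhere; the Nones are always eight days
    earlier (nine by inclusive Roman counting). Other days are
    counted, inclusively, down to the next named day.
    """
    ides = 15 if month in (3, 5, 7, 10) else 13
    nones = ides - 8
    lengths = [31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31]
    if day == 1:
        return "Kalendae"
    if day == nones:
        return "Nonae"
    if day == ides:
        return "Idus"
    if day < nones:
        count, anchor = nones - day + 1, "Non."
    elif day < ides:
        count, anchor = ides - day + 1, "Id."
    else:
        # Count forward to the Kalends of the next month; Roman
        # reckoning includes both endpoints, hence the + 2.
        count, anchor = lengths[month - 1] - day + 2, "Kal."
    return "pridie " + anchor if count == 2 else f"a.d. {count} {anchor}"
```

On this reckoning, `roman_day(12, 5)` gives `"Nonae"` for today, and `roman_day(4, 30)` gives `"pridie Kal."`, matching the pridie Kalendas Maias example above.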
I bring this up because of an xkcd strip that appeared a week ago today. Cartoonist Randall Munroe used Google’s Ngram search to tabulate the number of occurrences of each date by its name (ordinal number + month name) in English-language books since 2000.
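Mr Munroe worked from Google’s published Ngram data; as a rough illustration of the same kind of tally, here is a minimal Python sketch. Everything in it is my own invention for the example (the `count_dates` name, the sample text), and real books refer to dates in many more forms than this simple “month plus ordinal” pattern catches:

```python
import re
from collections import Counter

MONTHS = ["January", "February", "March", "April", "May", "June",
          "July", "August", "September", "October", "November", "December"]
ORDINALS = (["1st", "2nd", "3rd"] + [f"{n}th" for n in range(4, 21)] +
            ["21st", "22nd", "23rd", "24th", "25th", "26th", "27th",
             "28th", "29th", "30th", "31st"])

def count_dates(text):
    """Tally occurrences of each 'Month ordinal' date name in a text."""
    pattern = re.compile(r"\b(" + "|".join(MONTHS) + r") (" +
                         "|".join(ORDINALS) + r")\b")
    return Counter(m.group(0) for m in pattern.finditer(text))

sample = ("On September 11th we met; by September 11th of the next year "
          "everything had changed. April 15th looms.")
print(count_dates(sample))
```

Run over a large corpus instead of a toy sentence, a tally like this would produce the raw counts behind a chart of Mr Munroe’s kind.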
His results suggest that our months have an internal structure that our usual calendars, which simply display numbers in a grid of weeks, do not show. As the mouseover text points out, in eleven of the twelve months the eleventh is mentioned much less often than any other date. The exception is of course September, where references to the events of 11 September 2001 propel that date to the very top of the list of frequently named dates. Yet the obscurity of the eleventh was already well established before 2001, and there is no obvious explanation for it.
Some variations in frequency are relatively easy to explain. The first of the month is usually a day when many bills and reports are due, and so the first is among the most named dates of each month. Holidays are also prominent; notice, though, that the eleventh of November, Veterans’ Day in the USA and Remembrance Day in the countries of the Commonwealth, is no bigger on Mr Munroe’s chart than the little elevenths of the other months. The 15th of April is quite prominent; that has traditionally been the day when income taxes were due in the USA. But, in addition to the mystery of the obscured elevenths, we also notice that the fourth and nineteenth are bigger than average in most months. Why would that be? Perhaps it doesn’t mean anything, but perhaps there is some explanation that would become obvious if we were in the habit of thinking of calendars, not as the grids of weeks that are usually tacked on walls in the West, but as structures built around major days, structures like those the ancient Romans used. Too bad we can’t raise some ancient Romans from the dead and put them in charge of investigating the question; their perspective might result in a most fruitful study. I suppose the best substitute would be classical scholars who have spent time studying the ancient Roman calendar.
Posted by acilius on December 5, 2012
On Counterpunch, Jean Bricmont warns us to “Beware the Anti-Anti-War Left.” Professor Bricmont explains:
The anti-anti-war left has no influence on American policy, but that doesn’t mean that it has no effect. Its insidious rhetoric has served to neutralize any peace or anti-war movement. It has also made it impossible for any European country to take such an independent position as France took under De Gaulle, or even Chirac, or as Sweden did with Olof Palme. Today such a position would be instantly attacked by the anti-anti-war left, which is echoed by European media, as “support to dictators”, another “Munich”, or “the crime of indifference”.
What the anti-anti-war left has managed to accomplish is to destroy the sovereignty of Europeans in regard to the United States and to eliminate any independent left position concerning war and imperialism. It has also led most of the European left to adopt positions in total contradiction with those of the Latin American left and to consider as adversaries countries such as China and Russia which seek to defend international law, as indeed they should.
When the media announce that a massacre is imminent, we hear at times that action is “urgent” to save the alleged future victims, and time cannot be lost making sure of the facts. This may be true when a building is on fire in one’s own neighborhood, but such urgency regarding other countries ignores the manipulation of information and just plain error and confusion that dominate foreign news coverage. Whatever the political crisis abroad, the instant “we must do something” reflex brushes aside serious reflection on the left as to what might be done instead of military intervention. What sort of independent investigation could be carried out to understand the causes of conflict and potential solutions? What can be the role of diplomacy? The prevailing images of immaculate rebels, dear to the left from its romanticizing of past conflicts, especially the Spanish Civil War, blocks reflection. It blocks realistic assessment of the relationship of forces as well as the causes of armed rebellion in the world today, very different from the 1930s, favorite source of the cherished legends of the Western left.
Professor Bricmont traces the rise of the Anti-Anti-War Left to the end of the Cold War:
The demonization campaigns prevent peaceful relations between peoples, cultural exchanges between citizens and, indirectly, the flourishing of the very liberal ideas that the advocates of interference claim to be promoting. Once the anti-anti-war left abandoned any alternative program, it in fact gave up the possibility of having the slightest influence over world affairs. It does not in reality “help the victims” as it claims. Except for destroying all resistance here to imperialism and war, it does nothing. The only ones who are really doing anything are in fact the succeeding U.S. administrations. Counting on them to care for the well-being of the world’s peoples is an attitude of total hopelessness. This hopelessness is an aspect of the way most of the Left reacted to the “fall of communism”, by embracing the policies that were the exact opposite of those of the communists, particularly in international affairs, where opposition to imperialism and the defense of national sovereignty have increasingly been demonized as “leftovers from Stalinism”.
Interventionism and European construction are both right-wing policies. One of them is linked to the American drive for world hegemony. The other is the framework supporting neoliberal economic policies and destruction of social protection. Paradoxically, both have been largely justified by “left-wing” ideas: human rights, internationalism, anti-racism and anti-nationalism. In both cases, a left that lost its way after the fall of the Soviet bloc has grasped at salvation by clinging to a “generous, humanitarian” discourse, which totally lacks any realistic analysis of the relationship of forces in the world. With such a left, the right hardly needs any ideology of its own; it can make do with human rights.
Joan Walsh is in a position to examine the inner workings of the Anti-Anti-War Left. In a piece for Salon, Ms Walsh looks at what appear to be the first visible signs of Hillary Clinton’s campaign for the Democratic Party’s 2016 presidential nomination. Discussing a video that media mogul Haim Saban produced in tribute to Ms Clinton, Ms Walsh begins with a quote from David Remnick:
The film was like an international endorsement four years in advance of the Iowa caucus and the New Hampshire primary. The tone was so reverential that it resembled the sort of film that the Central Committee of the Communist Party might have produced for Leonid Brezhnev’s retirement party if Leonid Brezhnev would only have retired and the Soviets had been in possession of advanced video technology. After it was over there was a separate video from the President.
Comparisons between the Obama administration and the Brezhnev regime strike me as remarkably inapt on one level. Leonid Brezhnev pursued identifiable goals by rational, if brutal, means in his occupation of Afghanistan; Ms Clinton and her colleagues have approached the Soviets’ level of brutality in that country, though their motivations are entirely confined to the electoral politics of the USA. Of course, Mr Remnick’s statement has to do with the propaganda these two regimes produced, and there may be some superficial similarities there. Be that as it may, Ms Walsh writes:
If Clinton is serious about not running, she should keep copies of the video handy to cheer her up in case she ever doubts her legacy or gets bored. (I’d maybe edit out the Henry Kissinger parts, but that’s just me.) But if she’s serious about running, she should burn the video and never watch it again. It’s an artifact of our self-congratulatory global national security and finance elite, and it belongs in a time capsule. If it were shared widely, it could cost her as many votes as it wins her. And trust me: Bruno Mars’ “Just the Way You Are” is not going to wear well over the years to come.
Ms Walsh goes on:
Remnick’s reporting from the Saban Forum underscored the foreign policy challenges of a Clinton candidacy. Although the Obama administration certainly pushed the Middle East peace process harder than Bush officials did, the prospects for peace may be dimmer than ever. Clinton’s warm-up act at the forum was hawkish Foreign Minister Avigdor Lieberman, who argued (to little pushback) that “settlements are not an obstacle to peace. The opposite is true.” Clinton followed him, boasting of opposing last week’s symbolic U.N. vote for Palestinian statehood and supplying Israel with the “Iron Dome” weaponry that protects it from Hamas rockets, while asking Israel for “generosity” toward the Palestinians. She was rewarded Monday with the Netanyahu government announcing plans to expand its settlements on Palestinian land.
Even after she leaves as secretary of state, Clinton will continue to face tough questions about U.S. Middle East policy if she runs for president. As well she should. The GOP crusade against Susan Rice is personal and unfair, especially since questions about the State Department’s security situation in Benghazi, the role of the CIA at the consulate, as well as the administration’s ongoing Libya policy, are more appropriately asked of Clinton. And have no fear, they will be, should she run. The 2016 election will at least partly be about whether the Obama administration’s policies have made Americans safer and the world more just. The answer to both questions may turn out to be yes, at least within the confines of reasonable 21st-century political expectations (I recognize that’s kind of a cop-out qualifier, but the question deserves an article, or a book, or books, or a whole library, of its own). But it’s a debate worth having, and Clinton would be either blessed or cursed with having to defend the Obama side.
Ms Walsh thus takes her place on the Anti-Anti-Anti-War Left. Why not simply call this the Anti-War Left? Her closing paragraphs make clear that Ms Walsh’s political world is circumscribed by the boundaries set by the Anti-Anti-War Left:
In 2008, I fought hard against the ahistorical, inaccurate notion that the middle-class Clintons, a married couple, could be considered a political “dynasty” à la the Bush family dynasty. Still, I would wince at yet another Clinton-Bush contest. But if it came to that, I would, of course, enthusiastically support Hillary Clinton over Jeb Bush – and so would most of the country.
But it’s a long way from here to there, with a lot of domestic and international landmines that could make Clinton forgo the race or else doom her candidacy if she runs. I write as a Hillary admirer. But I think the fawning of her overclass admirers, as captured on the Saban video, could make her presidency not inevitable but impossible.
“I would, of course, enthusiastically support Hillary Clinton.” The Anti-Antis need take no notice of Ms Walsh; she will vote for them, she will of course vote for them, she will vote for them enthusiastically, no matter what they do. Her disagreements with them have no influence even over her own vote.
As in these last twenty years the Left in the USA and Europe has been largely co-opted by advocates of perpetual war for perpetual peace, so the Right has been co-opted by the War Party for sixty years. It was not always so. Until the United States entered the Second World War late in 1941, organized opposition to wars had usually expressed itself most effectively in American politics in the form of movements from the Right. It was the Federalists who led opposition to the War of 1812, the Whigs who led opposition to the Mexican War in the 1840s, conservatives of all stripes who scrambled for decades to prevent the Civil War and tried to broker a compromise peace during it, the arch-conservatives of the Anti-Imperialist League who raised the loudest voices against the war with Spain and the annexation of the Philippines at the end of the nineteenth century, and the America First Committee of 1939-1941 that still ranks as the largest antiwar organization in American history. In the aftermath of the Second World War, an international situation appeared in which it was difficult to imagine any sort of order emerging without a dramatic expansion of American power in the world, and the military establishment built during the war had become so prominent a part of the USA’s economic system that only the bravest politicians could imagine a return to the pre-war America in which military spending was a tiny percentage of Gross Domestic Product and the USA barely had a standing army. The leaders of the American Right therefore turned away from the anti-interventionist tradition to which they were rightful heirs. They had not developed a coherent militarist ideology, and had little of value to contribute to the formulation of an activist foreign policy. What they could and did do was devote themselves to attacking dissenters left and right.
Senator Joseph McCarthy is remembered in a harsh light because of his attacks on left-leaning figures who looked skeptically on Cold War policies; McCarthy’s sometime defender, the late William F. Buckley, Jr, is still lionized for his attacks on right-leaning figures who dared to doubt the same policies. This Anti-Anti-War Right reached its apotheosis in the Bush-Cheney administration, and its willful deafness to all who questioned its approach.
In a recent piece, economist Bruce Bartlett details how what he saw of the Anti-Anti-War Right in the Bush-Cheney phase led him to question his decades-long allegiance to the conservative movement, and shift markedly to the left. What most horrified him was the insularity of the Anti-Anti-War Right, its refusal to consider points of view that its leaders had not previously approved or to take notice of publications not on their recommended list:
In 2004 I got to know the journalist Ron Suskind, whose book The Price of Loyalty I had praised in a column. He and I shared an interest in trying to figure out what made Bush tick. Neither of us ever figured it out.
A couple of weeks before the 2004 election, Suskind wrote a long article for the New York Times Magazine that quoted some of my comments to him that were highly critical of Bush and the drift of Republican policy. The article is best remembered for his quote from an anonymous White House official dismissing critics like me for being “the reality-based community.”
The day after the article appeared, my boss called to chew me out, saying that Karl Rove had called him personally to complain about it. I promised to be more circumspect in the future.
Interestingly, a couple of days after the Suskind article appeared, I happened to be at a reception for some right-wing organization that many of my think tank friends were also attending. I assumed I would get a lot of grief for my comments in the Suskind article and was surprised when there was none at all.
Finally, I started asking people about it. Not one person had read it or cared in the slightest what the New York Times had to say about anything. They all viewed it as having as much credibility as Pravda and a similar political philosophy as well. Some were indignant that I would even suspect them of reading a left-wing rag such as the New York Times.
I was flabbergasted. Until that moment I had not realized how closed the right-wing mind had become. Even assuming that my friends’ view of the Times’ philosophy was correct, which it most certainly was not, why would they not want to know what their enemy was thinking? This was my first exposure to what has been called “epistemic closure” among conservatives—living in their own bubble where nonsensical ideas circulate with no contradiction.
My growing alienation from the right created problems for me and my employer. I was read the riot act and told to lay off Bush because my criticism was threatening contributions from right-wing millionaires in Dallas, many of whom were close personal friends of his. I decided to stick to writing columns on topics where I didn’t have to take issue with Republican policies and to channel my concerns into a book.
I naïvely thought that a conservative critique of Bush when he was unable to run for reelection would be welcomed on the right since it would do no electoral harm. I also thought that once past the election, conservatives would turn on Bush to ensure that the 2008 Republican nomination would go to someone who would not make his mistakes.
As I wrote the book, however, my utter disdain for Bush grew, as I recalled forgotten screw-ups and researched topics that hadn’t crossed my radar screen. I grew to totally despise the man for his stupidity, cockiness, arrogance, ignorance, and general cluelessness. I also lost any respect for conservatives who continued to glorify Bush as the second coming of Ronald Reagan and as a man they would gladly follow to the gates of hell. This was either gross, willful ignorance or total insanity, I thought.
My book, Impostor: How George W. Bush Bankrupted America and Betrayed the Reagan Legacy, was published in February 2006. I had been summarily fired by the think tank I worked for back in October 2005. Although the book was then only in manuscript, my boss falsely claimed that it was already costing the organization contributions. He never detailed, nor has anyone, any factual or analytical error in the book.
Among the interesting reactions to my book is that I was banned from Fox News. My publicist was told that orders had come down from on high that it was to receive no publicity whatsoever, not even attacks. Whoever gave that order was smart; attacks from the right would have sold books. Being ignored was poison for sales.
Some would argue that this “epistemic closure” takes more spectacular forms on the Anti-Anti-War Right than on the Anti-Anti-War Left. Even if that is so, it is likely because the Anti-Anti-War Left is newer. The Anti-Anti-War Left will likely continue to evolve in the same fashion as long as it continues to win elections.
In a recent Guardian column, Glenn Greenwald mentions left-of-center media figures who have, since last month’s election, been making remarks that mirror Mr Bartlett’s post-2004 assumption “that a conservative critique of Bush when he was unable to run for reelection would be welcomed on the right since it would do no electoral harm.” The reaction his former colleagues on the Right gave Mr Bartlett’s book showed him that such an assumption was “naive.” Mr Greenwald analyzes the excuses which Democratic-leaning media gave for whitewashing the Obama administration up to Election Day and concludes that the same excuses will apply equally well for all time to come:
Hendrik Hertzberg proclaims that they will now be even “more respectful” of Obama than they have been. Short of formally beatifying him, or perhaps transferring all their worldly possessions to him, is that even physically possible? Is there a reverence ritual that has been left unperformed, swooning praise left to be lavished upon him, heinous acts by him that have not yet been acquiesced to if not affirmatively sanctioned in the name of keeping him empowered? That media progressives will try to find ways to be even “more respectful” to the president is nothing short of scary.
As for the vow that media progressives will now criticize Obama more and hold him more accountable, permit me to say that I simply do not believe this will happen. This is not because I think those who are taking this vow are being dishonest – they may very well have convinced themselves that they mean it – but because the rationalization they have explicitly adopted and vigorously advocated precludes any change in behavior.
Over the past four years, they have justified their supine, obsequious posture toward the nation’s most powerful political official by appealing to the imperatives of electoral politics: namely, it’s vital to support rather than undermine Obama so as to not help Republicans win elections. Why won’t that same mindset operate now to suppress criticisms of the Democratic leader?
It’s true that Obama himself will no longer run in an election. But any minute now, we’re going to be hearing that the 2014 midterm elections are right around the corner and are of Crucial Significance. Using their reasoning, won’t it be the case that those who devote their efforts to criticizing Obama and “holding accountable” the Democrats will be effectively helping the Republicans win that election? Won’t Obama critics stand accused of trying to keep the Speaker’s gavel in the hands of the Tea Party rather than returning it to Nancy Pelosi, or of trying to hand Senate control over to Mitch McConnell (or, soon enough, of trying to give the White House to Marco Rubio instead of Hillary Clinton)?
Once one decides in the name of electoral expediency to abdicate their primary duty as a citizen and especially as a journalist – namely, to hold accountable those who wield the greatest political power – then this becomes a permanent abdication. That’s because US politics is essentially one permanent, never-ending election. The 2012 votes were barely counted before the political media began chattering about 2016, and MSNBC is already – as one of its prime time hosts put it – “gearing up” for the 2014 midterm.
I’ve described before how the permanent election cycle is the most potent weapon for keeping the citizenry (and media) distracted by reality-TV-show-type trivialities and horse-race excitement in lieu of focus on what the government is actually doing. But the other significant benefit of having all political disputes viewed through a partisan electoral prism is that it keeps partisans focused only on the evils of the other party and steadfastly loyal to their own. The desire to influence election outcomes in favor of one’s own party subsumes any sense that political officials from one’s own party should be checked in how they exercise their power.
Can the USA break its cycle of ever-more warlike politics? I would say that the cycle will be broken, and rather soon, but probably not by the electoral process. The USA is running short of funds and short of allies; its recent military campaigns have ended disastrously, most obviously in Iraq where thousands of American lives and hundreds of billions of American dollars have turned the country into a satellite of Iran. Eventually a foreign policy so idiotically mismanaged will have to exhaust the ability of the country that perpetrates it to project its power in the world. Perhaps the prediction William Graham Sumner made in his 1899 speech “The Conquest of the United States by Spain” will be fulfilled in this century:
We have beaten Spain in a military conflict, but we are submitting to be conquered by her on the field of ideas and policies. Expansionism and imperialism are nothing but the old philosophies of national prosperity which have brought Spain to where she now is. Those philosophies appeal to national vanity and national cupidity. They are seductive, especially upon the first view and the most superficial judgment, and therefore it cannot be denied that they are very strong for popular effect. They are delusions, and they will lead us to ruin unless we are hardheaded enough to resist them…
The perpetuity of self-government depends on the sound political sense of the people, and sound political sense is a matter of habit and practice. We can give it up and we can take instead pomp and glory. That is what Spain did. She had as much self-government as any country in Europe at the beginning of the sixteenth century. The union of the smaller states into one big one gave an impulse to her national feeling and national development. The discovery of America put into her hands the control of immense territories. National pride and ambition were stimulated. Then came the struggle with France for world-dominion, which resulted in absolute monarchy and bankruptcy for Spain. She lost self-government and saw her resources spent on interests which were foreign to her, but she could talk about an empire on which the sun never set and boast of her colonies, her gold-mines, her fleets and armies and debts. She had glory and pride, mixed, of course, with defeat and disaster, such as must be experienced by any nation on that course of policy; and she grew weaker in her industry and commerce and poorer in the status of the population all the time. She has never been able to recover real self-government yet. If we Americans believe in self-government, why do we let it slip away from us? Why do we barter it away for military glory as Spain did?
Like Sumner, I consider myself a patriotic American. At the conclusion of Sumner’s speech, he harks back to the political ideals of the founders of the United States:
No adventurous policies of conquest or ambition, such as, in the belief of our fathers, kings and nobles had forced, for their own advantage, on European states, would ever be undertaken by a free democratic republic. Therefore the citizen here would never be forced to leave his family or to give his sons to shed blood for glory and to leave widows and orphans in misery for nothing. Justice and law were to reign in the midst of simplicity, and a government which had little to do was to offer little field for ambition. In a society where industry, frugality, and prudence were honored, it was believed that the vices of wealth would never flourish.
We know that these beliefs, hopes, and intentions have been only partially fulfilled. We know that, as time has gone on and we have grown numerous and rich, some of these things have proved impossible ideals, incompatible with a large and flourishing society, but it is by virtue of this conception of a commonwealth that the United States has stood for something unique and grand in the history of mankind and that its people have been happy. It is by virtue of these ideals that we have been “isolated,” isolated in a position which the other nations of the earth have observed in silent envy; and yet there are people who are boasting of their patriotism, because they say that we have taken our place now amongst the nations of the earth by virtue of this war. My patriotism is of the kind which is outraged by the notion that the United States never was a great nation until in a petty three months’ campaign it knocked to pieces a poor, decrepit, bankrupt old state like Spain. To hold such an opinion as that is to abandon all American standards, to put shame and scorn on all that our ancestors tried to build up here, and to go over to the standards of which Spain is a representative.
As a patriot, I do not wish to see my country reduced to a “poor, decrepit, bankrupt old state.” I would much prefer to see an aroused citizenry, indignant at the crimes which, committed in its name, defile its honor in the eyes of the world, rise up and put a stop to those crimes. It would be worth a thousand Fourth of July celebrations to see Mr Obama and his living predecessors in the office of US president tried, convicted, and punished for the myriad atrocities they have ordered, beneath which the proud boasts of our Constitution lie buried. But I have little hope that such will happen. The Anti-Antis, left and right, will likely retain their monopoly on political power in the USA until the regime collapses under the weight of their misgovernment. What follows will be shabbier, meaner, and less menacing to the world at large; but the promise that this country made to its citizens and to all those who believed in its founding ideals in the days when there was real debate here will never be kept. Perhaps, after our decline is complete, the USA will reemerge as a socially progressive liberal democracy, as Spain has done. Considering what Spain went through on the path to its present condition, that prospect offers the patriotic American cold comfort indeed.
Posted by acilius on December 4, 2012
Bruce Schneier declares:
It’s a feudal world out there.
Some of us have pledged our allegiance to Google: We have Gmail accounts, we use Google Calendar and Google Docs, and we have Android phones. Others have pledged allegiance to Apple: We have Macintosh laptops, iPhones, and iPads; and we let iCloud automatically synchronize and back up everything. Still others of us let Microsoft do it all. Or we buy our music and e-books from Amazon, which keeps records of what we own and allows downloading to a Kindle, computer, or phone. Some of us have pretty much abandoned e-mail altogether … for Facebook.
These vendors are becoming our feudal lords, and we are becoming their vassals. We might refuse to pledge allegiance to all of them — or to a particular one we don’t like. Or we can spread our allegiance around. But either way, it’s becoming increasingly difficult to not pledge allegiance to at least one of them.
The whole piece is worth reading. For my part, I’ve often wondered whether the Internet doesn’t fit Max Weber’s conception of a bureaucracy. Weber described six major characteristics of bureaucracy (here’s a handy summary of his views). First, and most familiar in the popular use of the word, a bureaucracy has a formal hierarchical structure. While there is no group of people who are the president and board of directors of the Internet, the machines that make up the Internet do in fact relate to each other according to set routines. Weber described bureaucracies staffed by human officials, but parts of his description still apply where, as in the functioning of the Internet, the officials are replaced by machines.
The second characteristic of bureaucracy in Weber’s description is a set of rules that consistently transform particular decisions made in one part of the structure into particular actions taken in other parts of the structure. In this regard every bureaucracy aspires to the condition of a machine; as a bureaucracy composed of machines, the Internet would in a sense represent the ultimate bureaucracy. Along with these rules comes a heavy emphasis on written documents and permanent records, to ensure that decisions are communicated from one part of the structure to another accurately and that they are converted into action appropriately. Here again, the Internet’s tendency to preserve data makes it the ideal form of bureaucracy.
Third, Weber says that bureaucracies are organized by functional specialty. Here we see two levels of organization taking place independently of each other. Of course, the machines are sorted together by their functions. At the same time, the people who use the Internet develop specializations in their ways of relating to it. Those who resist specialization remain on the fringes of the Internet. So, a general-interest blog like this one toddles along for years with a handful of readers; start a tumblr site devoted entirely to eighteenth-century cocktail recipes, and you might draw a thousand followers in a week. Through them, you can learn more about your topic than you had imagined possible. Because of the efficiency that results from the Internet’s specialization and consistency, users have strong incentives to specialize their own use of the system and to respect its rules. Thus, the Internet’s human users behave as they would if they were clients of a bureaucracy staffed by human officials.
Fourth, Weber’s bureaucracies have missions. These missions are not simply tasks for which groups might be established ad hoc, but are the overarching goals that justify the organization’s continued existence. Because so many people have stakes in the continued existence of large bureaucracies, their missions tend to become rather broad and ill-focused over time; the last thing anyone wants is for the bureaucracy that provides his or her livelihood to have completed its mission. A phrase like “the distribution of information,” precisely because it is so vague, is therefore a perfectly apt mission statement for a major bureaucracy.
Fifth, bureaucracies are impersonal structures, in which the relationship of one person to another is restricted to the roles that those people are playing. So, if Alice is a sales agent for her company and Bob is a purchasing agent for his, their business discussions are between vendor and client, not between Alice and Bob. When Internet cafes first appeared, nearly twenty years ago, a huge percentage of them had Peter Steiner’s cartoon from the 5 July 1993 issue of The New Yorker taped to the wall:
Now we’re living in the age of Facebook, and on the Internet everyone knows that you’re a dog, what you had for breakfast, where you like to do your business, etc. Still, there is an element of impersonality built into online interactions. So online political discussions, even on Facebook itself, quickly become interactions between supporter of Party X and supporter of Party Y, even when those supporters are close friends in other settings. Obviously people can turn each other into symbols of opinions they dislike in any social environment, but I don’t think it’s controversial to say that online discussions are particularly prone to this sort of reduction. Moreover, the most pleasant online relationships tend to be the simplest, those in which participants change their personas least often. If Alice and Bob meet at a site devoted to eighteenth-century cocktail recipes and interact simply as devotees of those recipes, I suspect they are likelier to look forward to hearing from each other than they will be if they start talking about other topics and expecting other kinds of emotional and intellectual support from each other. Offline, I would think it would be the opposite, that people who discuss only one topic and present themselves to each other in only one way are unlikely to become close. I’d be interested to see studies on this hypothesis; a quick Google Scholar search hasn’t turned up any, but if you know of some, please enlighten me.
Sixth, employment in a bureaucracy is based on technical qualifications. Civil service exams, educational requirements, efficiency ratings, and other devices for measuring competence are not necessary if the best person for the job is the person who has inherited it as a matter of right. They are necessary if the best person is the ablest. Of course, every human bureaucracy exists within a society where there are laws, institutions, and ethical ideas that predate the rise of bureaucracy and survive independently of it. So one does not expect a certifying authority to require the person who owns a business to prove that s/he is the ablest person to oversee its operations. Nor does one expect anyone to require potential parents to demonstrate any particular abilities in order to earn a license authorizing them to produce children, or to raise the children they have produced. If all social life were subject to the demands of a single bureaucracy, we would expect to see such requirements. Indeed, as bureaucratization proceeds apace, we see ever more footprints of bureaucracy in areas which were once matters of right. In many parts of the USA, for example, voters are routinely required to produce identification before they are allowed to take ballots, even though there is no evidence that anyone has ever impersonated a voter, and absolutely no way to affect the outcome of an election by impersonating voters. These laws are accepted, not because they serve any legitimate purpose, but simply because it seems natural to the residents of a social world dominated by bureaucracy to be called on to produce one’s papers.
As for the Internet, there are technical specifications devices must meet in order to be connected. This automated bureaucracy rarely sorts its human users by technical qualifications, though those users do sort themselves in much the way that the clients of bureaucracies staffed by humans sort themselves. And, as they do when interacting with bureaucracies staffed by humans, Internet users do tend to see themselves as clients receiving services rather than as citizens asserting their rights. Zach Weiner expressed that point very effectively in February, with his now-classic cartoon about the so-called “Stop Online Piracy Act” that was then before the US Congress:
So you can see why I have thought it made sense to look at the Internet as a bureaucracy in Max Weber’s sense. Perhaps, though, it makes more sense to follow Mr Schneier and look at it as a feudal realm. While every element of a bureaucracy is, at least in theory, accountable to some overall authority that regulates that bureaucracy, the elements to which we trust our online security are accountable to no one. As Mr Schneier writes:
In this new world of computing, we give up a certain amount of control, and in exchange we trust that our lords will both treat us well and protect us from harm. Not only will our software be continually updated with the newest and coolest functionality, but we trust it will happen without our being overtaxed by fees and required upgrades. We trust that our data and devices won’t be exposed to hackers, criminals, and malware. We trust that governments won’t be allowed to illegally spy on us.
Trust is our only option. In this system, we have no control over the security provided by our feudal lords. We don’t know what sort of security methods they’re using, or how they’re configured. We mostly can’t install our own security products on iPhones or Android phones; we certainly can’t install them on Facebook, Gmail, or Twitter. Sometimes we have control over whether or not to accept the automatically flagged updates — iPhone, for example — but we rarely know what they’re about or whether they’ll break anything else. (On the Kindle, we don’t even have that freedom.)
The Good, the Bad, and the Ugly
I’m not saying that feudal security is all bad. For the average user, giving up control is largely a good thing. These software vendors and cloud providers do a lot better job of security than the average computer user would. Automatic cloud backup saves a lot of data; automatic updates prevent a lot of malware. The network security at any of these providers is better than that of most home users.
Feudalism is good for the individual, for small startups, and for medium-sized businesses that can’t afford to hire their own in-house or specialized expertise. Being a vassal has its advantages, after all.
For large organizations, however, it’s more of a mixed bag. These organizations are used to trusting other companies with critical corporate functions: They’ve been outsourcing their payroll, tax preparation, and legal services for decades. But IT regulations often require audits. Our lords don’t allow vassals to audit them, even if those vassals are themselves large and powerful.
In some of my darker moments, I’ve wondered if the USA is undergoing a revival of feudalism. Mr Schneier makes a strong case that it is, at least in this area.
Posted by acilius on December 3, 2012