Wednesday, March 28, 2012

Great leaders are not often good leaders

I have been extremely busy with college projects recently, and unable to update The Harvest - sorry about that.

Now, just a quick point. I remember asking on an Orkut discussion forum years ago whether 'great' leaders were very often bad leaders.

Public polls on the greatest ever citizens of some country often choose mighty leaders who won glorious victories. For a while one public vote in Russia saw Josef Stalin in the lead, although some people blamed this on computer hackers inflating his votes; the Russians finally settled on the brilliant medieval prince Alexander Nevsky, who won decisive victories against Catholic European invaders when Russia was at its weakest. A British equivalent saw Winston Churchill take first place. Stalin, Nevsky, Churchill - all were powerful military leaders who marched their nations to crucial victories in war. Stalin was also a terrible tyrant who had millions of people murdered. Churchill was an eager imperialist who recommended using chemical weapons 'against recalcitrant Arabs as an experiment'. He oversaw the use of paramilitary 'Black and Tans' to put down the Irish War of Independence, a force which became notorious for its indiscriminate violence:
They were in any case explicitly instructed to step outside the law, one police divisional commander instructing his men in a speech: "If a police barracks is burnt then the best house in the locality is to be commandeered, the occupants thrown into the gutter. Let them die there; the more the merrier."

He instructed them to shout "Hands up" at civilians, and to shoot anyone who did not immediately obey. He added: "Innocent persons may be shot, but that cannot be helped, and you are bound to get the right parties some time. The more you shoot, the better I will like you, and I assure you no policeman will get into trouble for shooting any man."
This is not to say that Churchill was an especially wicked leader by any means. I wondered, though, where were the peacetime leaders? Why did people not get excited by those competent rulers who kept their countries peaceful and prosperous and did not suggest gassing civilians?

Today I read this from David Henderson at Econlog:
Historians and journalists commonly survey other historians on the relative "greatness" of American presidents, and these rankings show remarkable consistency between surveys. In this paper we consider commonalities between highly ranked presidents and compare plausible determinants of greatness according to historians. We find that a strong predictor of greatness is the fraction of American lives lost in war during a president's tenure. We find this predictor to be robust and compare favorably to other predictors used in previous historical research. We discuss potential reasons for this correlation and conclude with a discussion of how historians' views might affect policy....

Beyond that, we should stop celebrating, and try to persuade historians to stop celebrating, presidents who made unnecessary wars. One way to do so is to remember the unseen: the war that didn't happen, the war that was avoided, and the peace and prosperity that resulted. If we applied this standard, then presidents Martin van Buren, John Tyler, Warren G. Harding, and Calvin Coolidge, to name four, would get a substantially higher rating than they are usually given.
This is a good point, I think. Henderson says more about the incentives modern leaders may face, knowing that only wartime leaders get a legacy of greatness, but I don't have time to read further just now. I am reminded, though, of Mike Duncan's History of Rome podcasts, which occasionally described emperors who were fortunate to rule in a time of peace, yet felt they needed to wage and win some war to gain legitimacy. Much provoking of Germans and Persians was done by these leaders eager for glory.

All this makes me grateful to live in a tiny country with no imperial history, no memories of glorious battles, no experience of international war since independence in 1922, and apparently no appetite for adventuring abroad today. I'm content with good over great.

Tuesday, March 20, 2012

"Not really" trends

Over the St Patrick's Day weekend Irish television featured a number of programmes about Irish national identity and changing Irish attitudes. A common theme here was the division of Irish history into three broad periods:

1) Early independence, when Irish people were emerging from many centuries of Protestant British domination. The narrative here is that the Irish Catholic majority were suspicious of government because it had been so long associated with the hated British. Economic stagnation, cultural repression under a Catholic conservatism, and mass emigration led to a deep national insecurity. Pessimism and cynicism became the norm.

2) The Celtic Tiger period, when the economy boomed, Irish cultural exports became popular abroad, and migrants began to pour into the country. Here the narrative is that there was a new sense of economic and cultural confidence among the Irish. Riverdance made Irish dancing, which many remembered from stiff and sexless schoolday jigs, assertive and sexy. Overcome with excitement over the boom and easy credit, people spent wildly and grew uncharacteristically optimistic, even arrogant.

3) The economic crisis since 2007, which reset the country to its default Irish settings of emigration, cultural and economic shame, and contented pessimism!

Discussion around this tried to relate specific cultural products to the eras from which they emerged. Riverdance, as I mentioned, seemed symbolic of the new assertive Ireland. The assumption underlying all this seemed to be that most Irish people were deeply responsive to large-scale economic changes, and that personal opinions, attitudes and interests moved with these macro-level societal shifts.

I was full of doubts about this because my own tastes don't seem to have changed at all to keep up with society's shocks. As a child I loved old books like Treasure Island (1883), The Hobbit (1937) and The Hounds of the Morrigan (1985). Back then Ireland had yet to experience its big economic boom. As I grew older I read other things but right through my teens I still loved such unfashionable classics, while Ireland's economy grew and grew. Now, at 29, with the economy beginning to climb out of a major crisis, I still love this stuff.

I started listening to heavy metal in my late teens, around 1999. I listened to it through the September 11 attacks, the Afghanistan and Iraq Wars, the housing bubble, the Lisbon Treaty, the economic crisis, the collapse of Fianna Fáil and rise of Barack Obama. None of these big events made a damn bit of difference to my tastes in music. So why do these commentators assume that populations are moved more by macro-level economic and cultural changes than by the dramas and insights of their daily lives? Why interpret books and music and cinema as being inspired by, and representational of, their narrow period?

Mulling all this over, and thinking especially of my continued appreciation of JRR Tolkien's wonderful The Hobbit, I remembered that Tolkien himself was bothered by similar interpretations. With The Lord of the Rings, many critics thought they saw a grand analogy for World War II: Adolf Hitler as the terrible Sauron stretching a dark hand out from the Mordor of Germany. This seemed a perfect allegory for the time.

Yet Tolkien utterly rejected this interpretation, insisting in the foreword to the second edition of LOTR that he disliked allegory and pointing out that the First World War had been the more traumatic event in his life. He added that The Lord of the Rings was clearly not an apt allegory for World War II:
If it had inspired or directed the development of the legend, then certainly the Ring would have been seized and used against Sauron; he would not have been annihilated but enslaved, and Barad-dûr would not have been destroyed but occupied. Saruman, failing to get possession of the Ring, would in the confusion and treacheries of the time have found in Mordor the missing links in his own researches into Ring-lore, and before long he would have made a Great Ring of his own with which to challenge the self-styled Ruler of Middle-earth. In that conflict both sides would have held hobbits in hatred and contempt: they would not long have survived even as slaves.
I have written here before that generalisations about groups can be useful without being applicable to every single individual. But these vague claims about generations shifting in attitude and interest in response to macro-level trends are often not backed by any evidence at all. Have most people really gained and lost a sense of cultural confidence since 1990? With no evidence either way, there is little point in even guessing.

Monday, March 19, 2012

What's the abortion debate about?

Opinions on abortion tend to coincide with views on other issues so much that abortion becomes deeply tied to personal and political identity. For pro-choice advocates, the pro-lifers are easily depicted as backward religious conservatives: sexist relics of a patriarchal past who still wish to force state power onto the bodies of women. For pro-life advocates, the pro-choicers are easily depicted as irresponsible and immoral seculars: representatives of a generation who would rather murder innocent infants than take responsibility for their own selfish behaviour.

All this is silly, and confuses the debate. At its heart the abortion issue is very simple.

When does life begin?
Both sides agree that newborn infants are human beings with the human right to life. Neither side accepts that a baby could be killed for any reason. The pro-life people simply extend that right back to conception. Pro-choice people see the right to life beginning either at birth or at some stage during the pregnancy.

That's what the abortion debate is about. When does an individual develop the unassailable right to life? All this other stuff about religion and identity, about patriarchy and women's rights over their own bodies, is irrelevant. Figure out when the human right to life begins and the debate is over.

The only alternative framing I can think of is to argue that a foetus does have a right to life, but that the mother's rights somehow supersede it. In the end there may be no clear-cut answer, and governments may settle on awkward compromises that satisfy neither lobby with its moralistic and identity-driven rhetoric.

Saturday, March 17, 2012

St Patrick's Day, religious taboos, and food


Twenty years ago today I was standing in the rain, shivering and drenched as wild gusts of wind blew drizzle up into my face, across my numbed fingers, and down inside the tin whistle clamped to my mouth. I was ten years old, dressed in my primary school marching band's green uniform and standing in formation with the rest of the children, about to march throughout my home town playing folk tunes as part of our St Patrick's Day parade. Parents and locals gamely lined the streets in scarves and raincoats to cheer us on through the squalls.

Yet we children did look forward to the day, largely because of a curious Irish convention that the Christian fast of Lent didn't count on St Patrick's Day. For that one happy day in the middle of Lent we were allowed to relapse and scoff sweets and biscuits and sugary drinks.

For non-Christian readers: Lent is a religious period of 40 days preceding Easter Sunday, commemorating Christ's 40 days of fasting in the desert, over which many modern Christians hold some kind of fast.

In the past (and, I gather, in some Christian communities still) this fast was pretty severe, but by the time I was growing up in the 1980s and 90s it really just meant not eating sweets or biscuits - hard enough as a young lad that the St Patrick's Day dispensation was dearly welcomed.

I've kept up the Lenten tradition and am eating noticeably more healthily today (resisting a St Patrick's Day relapse) than I was a few weeks ago. The fast, so hard to handle as a child, is now actually pretty easy and I feel no temptation to break it. I go through my days eating fairly well and finding this easy. Why?

We are, I think, back to taboos. The struggle for self-control really happened when I was a child, when the issue was framed in a religious context. Fasting at Lent was depicted as a religious duty and, in my monoreligious area, everyone seemed to be doing it, so there was no reason to question it. Sanctifying the fast reduced the element of personal choice: it must be done, therefore it is done. Now I feel no compulsion when I pass the confectionery I would normally be drawn towards, because it is simply out, taboo until Easter Sunday.

I have touched on taboos a few times recently with the idea that some taboos might actually be healthy, stabilising forces for society. The longer democracies go without coups and civil wars, for example, the more entrenched they become and the stronger the taboo on seizing power through violence grows. The abandonment of norms and taboos accompanied the collapse of republicanism in Ancient Rome.

I see the difficulty many people have in regulating their diets, and compare this with the relative ease with which I stay off sugary snacks for Lent. Of course I will return to bad habits come Easter Sunday, but perhaps there is a way for people to regulate their own diets and lifestyles through similar taboos, though I'm not sure how it would be applied. The taboo needs to be unconditional and somehow natural, inevitable, with clearly defined borders. Once there is a kind of absolute commitment that the individual doesn't even think of questioning, it may be a lot easier to control one's desires.

Wednesday, March 14, 2012

The job-seekers' arms race

I have sometimes thought while preparing for job interviews that I was building a set of skills that might become permanently redundant if I actually got the job. That is, the things I have learned about writing a decent CV and presenting myself in interviews could get me a position in which I might remain for the rest of my working life, negating any need to write good CVs or do good interviews ever again.

Over the years I was exposed to endless advice about CV-writing and interviews, much of which was useful and reasonable, yet it also seemed oddly futile since everyone else was getting the same advice. There were people whose job it was to travel around giving advice to large groups of job-seekers. Surely if everyone is getting the same advice and all are improving their CV and interview skills, then nobody is getting any relative advantage at all.

Today I realised that what I was thinking about is Robert Frank's arms race. Here, as with the process that produced the over-sized antlers of bull elk, we are talking about mostly relative improvements for individuals over their competitors: a zero-sum game where some must lose for others to win. Perhaps employers would be encouraged to take on some extra staff if they found exceptional job-seekers, granted, but for the most part I imagine that employers' hiring plans will not be influenced much by the collective interview skills of job-seekers. The competition is therefore a relative one between the people looking for work.

Individuals get a brief relative advantage over their rivals for jobs following such training, but if that pushes others to boost their own training then the advantage is wiped out. Perhaps there is no real benefit to job-seekers as a group, while there is a real expense in time and (sometimes) money spent on training. Job-seekers competing with one another are locked in a kind of pointless arms race to boost their relative standing without actually becoming good at anything useful.
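To make the zero-sum logic concrete, here is a toy sketch in Python; the names, scores and costs are all invented for illustration. With a fixed number of vacancies going to the best-presented candidates, a uniform boost in everyone's interview skill changes nothing about who is hired, while the whole group pays for the training:

```python
# Toy model of the job-seekers' arms race (all names and numbers are
# invented). A fixed number of vacancies goes to the best-presented
# candidates; training boosts everyone's presentation skill equally.
VACANCIES = 3
TRAINING_COST = 200   # assumed cost per person of CV/interview coaching

skills = {"ann": 72, "bob": 65, "cara": 80, "dan": 58, "eve": 69}

def hired(skill_table, n=VACANCIES):
    """The n best-presented candidates get the jobs."""
    return set(sorted(skill_table, key=skill_table.get, reverse=True)[:n])

before = hired(skills)
after = hired({name: s + 15 for name, s in skills.items()})  # all train

print(before == after)              # True: the same people are hired
print(len(skills) * TRAINING_COST)  # 1000: resources sunk, nothing gained
```

Real hiring is messier, of course - some training raises genuine productivity rather than mere presentation - but where the gains are purely relative, the sketch captures the waste.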

Or am I wrong? Feel free to share your thoughts!

Delicious: keeping The Harvest links somewhere handy

I refer to a lot of really interesting and useful online sources in this blog, so I have set up a Delicious account to save such resources. I am still adding links, and readers curious about where to find some of the information I've discussed here may find it a useful way to keep track. Here it is, and I will be adding more with time.

Saturday, March 10, 2012

Evolution and Sexism

Social psychologist Dr Roy Baumeister of Florida State University argues that differences in the behaviour and wealth of men and women today are heavily determined by differences in male and female biology, a theme explored in his book Is There Anything Good About Men? In this podcast he discusses the book with economist Russ Roberts. I disagree with a few points Baumeister makes, especially his belief that society has become feminised and boys are being encouraged to behave like girls - I certainly don't see it here in Ireland. But his is an intriguing, challenging approach to explaining gender disparities.

For example, Baumeister argues that in primitive times most women would have children sooner or later, while a much smaller proportion of men would father children. Why? Here is Roberts discussing the book:
You refer to a DNA study on that, and the proportions are rather striking about how many of our ancestors are women, versus men. You assume it's 50-50--we each had a father and a mother, as you point out in the book.... The DNA research came out a few years ago and said: Well, no, it's twice as many women as men.

Now, laypersons have been surprised because they thought it should have been 50-50 and have a hard time understanding how it could be unequal. But when you talk to biologists and people like that, they are surprised the difference isn't bigger. Because in many species, 20% of the males but 90% of the females will reproduce.
For females reproduction was highly likely; they would sooner or later have sex and get pregnant. Males, though, were in a winner-takes-all situation where a wealthy elite could keep great harems and father vast numbers of children while many more men never had children at all. For Roberts and Baumeister, this contributed to the spread of male genes associated with extreme aggression and competitiveness: getting to the top meant securing mass reproductive rights. Less competitive traits would disappear from the gene pool as those men failed to secure the wealth and status that boosted sexual access to women. One intriguing implication is thrown up in an off-hand comment by Roberts:
Polygamy has been the norm more often than not; it's only the last couple of centuries we've begun to insist on monogamy; which in my understanding is a way of spreading the women around so that every man could have a woman. That equalizes things much, so that will drive in the long run the number of our ancestors closer toward sort of 50-50.
He says polygamy but means specifically polygyny (men having more than one wife). Baumeister points out that while modern discourse sometimes describes polygyny as sexist against women - what woman wants to be the fifth wife of some grizzled old man? - really it was much more sexist against most men, because it created big winners with multiple wives and losers who would never marry at all. That is an intriguing viewpoint: that monogamy was a way for men to reduce their risk of total failure. (I wonder whether monogamous cultures could have had a competitive advantage over polygynous cultures through greater internal stability, because the men weren't busy hacking one another to bits to grab more women?)
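Returning to the ancestor-ratio claim, the direction of the effect can be checked with a toy simulation. Everything below is an invented illustration of the mechanism Roberts describes, not the DNA study itself: when only a minority of men reproduce while nearly all women do, the pool of distinct female ancestors ends up larger than the pool of distinct male ancestors.

```python
import random

# Toy simulation of reproductive skew (all parameters invented): every
# woman can reproduce, but only a minority of men do, so the ancestor
# pool ends up containing more distinct women than men.
random.seed(1)
N = 10_000                 # men and women in the parental generation
MALE_SHARE = 0.4           # assumed fraction of men who father children

fathers_pool = random.sample(range(N), int(N * MALE_SHARE))
mothers_pool = list(range(N))   # essentially all women reproduce

# Each child draws one father and one mother from the eligible pools.
children = [(random.choice(fathers_pool), random.choice(mothers_pool))
            for _ in range(N)]

distinct_fathers = len({f for f, _ in children})
distinct_mothers = len({m for _, m in children})
print(distinct_mothers / distinct_fathers)   # comfortably above 1
```

The 2:1 figure from the DNA research would correspond to one particular degree of skew; the sketch just shows which way the imbalance runs.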

He points to the fact that when the Titanic sank, women were much more likely to survive than men - the very poorest women still survived at greater rates than the very richest men:
So, anyone who talks about patriarchy--those guys were the patriarchs. The rich, upper-crust males. And yet they couldn't even get seats in the lifeboats as long as there were poor, impecunious women in line. The women all had to go first.... I think most men know that in a pinch they will be expected to sacrifice their lives in order to let the women and children survive. Society values the lives of women and children better.
I had interpreted that kind of chivalry - 'women and children first' - as a sign of the cultural contempt for women in 1912, which saw them as being so pathetic that they needed to be protected like children. Baumeister's view is different. Women were more important for the survival of cultures and populations than men because a small number of men can have sex with lots of women and repopulate a depopulated region, while a large number of men are demographically useless if they lack women to give birth to the new generation. Chivalry probably granted cultures competitive demographic advantages compared with cultures which neglected to protect the child-bearers.

Baumeister makes a number of strong generalisations about male and female behaviour, some of which I'm doubtful of, but he adds this wonderful and important reminder:
Remember too we're not talking about: men are totally one way and women are totally a different way, we're talking about overlapping distributions.
Yes! This comes back to a point I was making a few days back:
When, for example, wage differences between men and women are discussed, it is the mean wages that are considered.... That the male mean is higher than the female mean does not mean that 'men earn more than women' because there will be high-income women and low-income men. But, knowing nothing more than a person's sex, our guess about their likely income is slightly improved.
That is what Baumeister is talking about here too: not all men differ from all women, rather the means and, especially, the distributions differ. Here is an example from UseableStats.com showing the distributions of male and female heights. Yes the means are different, yes there are far fewer very tall women, but there is also lots of overlap.
Oddly enough, that graph has a characteristic that Baumeister says is also true for IQ: male results are more widely distributed than female, with more men at the extremes of both high and low height.
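Both points - heavy overlap despite different means, and a wider distribution putting more mass at both extremes - can be illustrated with a short Python sketch. All the parameters below are invented for illustration; they are not the UseableStats figures:

```python
import math
from scipy.stats import norm

# Assumed height distributions in cm (illustrative numbers only,
# not the UseableStats data).
men_h = norm(loc=178, scale=7.6)
women_h = norm(loc=165, scale=6.8)

# Overlap: the difference W - M of two independent normals is itself
# normal, so P(a random woman is taller than a random man) is easy.
diff = norm(loc=165 - 178, scale=math.sqrt(7.6**2 + 6.8**2))
print(f"P(woman taller than man) = {diff.sf(0):.2f}")   # about 0.10

# The variance point, as with IQ: equal means, but a wider male
# spread puts more men at BOTH extremes. (Parameters invented.)
men_iq = norm(loc=100, scale=15)
women_iq = norm(loc=100, scale=14)
for cutoff in (70, 130):
    pm = men_iq.cdf(cutoff) if cutoff < 100 else men_iq.sf(cutoff)
    pw = women_iq.cdf(cutoff) if cutoff < 100 else women_iq.sf(cutoff)
    print(f"beyond {cutoff}: men {pm:.4f} vs women {pw:.4f}")
```

The exact numbers don't matter; the shape of the argument does. Even a 13 cm gap in means leaves roughly one woman in ten taller than a randomly chosen man.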

Anyway, simply knowing an individual's sex should not make us confidently predict their behaviour, since the distributions overlap. I imagine that discrimination based on sex, or on some other characteristic we can determine by appearance, is only really smart when we have no time to make a considered decision: if we have to decide in a second whether or not a stranger on a dark street is a criminal, for example.

A final point. Much of what Baumeister argues is bound to be attacked as sexist or heteronormative. I too am uncertain about his arguments, but I am grateful to hear them, and glad that he is free to make them. Sometimes in debates over biology and gender I have found ideas like his shouted down and derided, dismissed offhand. Baumeister:
There is plenty of good social science, but I would say in the fields of gender that there are more people with axes to grind and more bias, and more political correctness, so that lowers the quality of stuff. You are not free to just follow your ideas or thoughts or your data, wherever they may lead.

Friday, March 9, 2012

African poverty and colonisation

Last year I argued that historical colonisation does not explain modern poverty very well:
Liberia was not a colony in the usual sense. Its population came from the US, but came to found a new state, not to serve the economic interests of the Americans. Its ruling elite were mixed race, Euro-Africans from the US, and they occasionally fought native African tribes to establish domination. Nonetheless Liberia ruled itself and none of the 19th century great powers controlled it.

Ethiopian history is simpler: it was conquered by Italy in 1935 and liberated in 1941. Of course Ethiopian leaders fought neighbouring African civilisations and internal rebels, but outside those brief years of Italian imperialism Ethiopia was never colonised.

If colonisation is the cause of modern poverty in developing countries, these two should be wealthy and content.

Today Liberia is the fourth poorest country in the world. Ethiopia is the 15th poorest. Liberia’s next door neighbour Cote d’Ivoire has a GDP per capita over three times higher, despite a century of French colonisation. Liberia also has the third highest infant mortality rate in the world, Ethiopia the 18th highest.
Since then I have had some criticism for this argument, with some complaining that colonisation was not a blanket experience that was the same in every location, so it was unfair to compare the success of countries like Ireland and Singapore since independence with the failures of Zimbabwe or Sudan.

Today I found a fascinating paper from Daron Acemoglu and James A Robinson at Massachusetts Institute of Technology, simply called Why is Africa Poor? They give a more subtle argument than either mine or the populist notion that European colonial abuses were entirely to blame for modern African poverty. Instead they focus on a number of factors:

1) Failure to form states
They say that sub-Saharan African regions tended to centralise power into state structures very late, many centuries after the same process had happened in much of Europe and Asia. Without a centralised state they had difficulty organising large cooperative activities. Instead raiding and abduction by young men were rife, and property rights were not protected.

2) Absolute monarchs
Those regions which did form into states usually ended up with absolutist monarchs of the kind that much of medieval Europe also endured. The problem here was that there was no distinction between the king and the state: the entire territory of the state belonged to the king, who could arbitrarily seize and redistribute land at any moment. The 16th-century Portuguese visitor Francisco Alvares described a country where an individual's territory would be seized and redistributed by the king every few years. With no property rights, individuals would not bother to look after soil they worked but would shortly lose:
there is not even anyone to plant a tree because he knows that he who plants it rarely gathers the fruit.
Under absolute kings, a tiny elite of the king's cronies benefitted but the nation as a whole stagnated and failed to invest in its future.

3) Slavery
Slavery pre-dated the Atlantic trade (it had existed in Eurasia too for many centuries), but the arrival of Europeans willing to buy slaves pushed West African economies in a perverse direction. Most slaves were prisoners of war, so the benefits of war soared as Europeans exchanged slaves for firearms and wealth. Lawlessness increased generally as small-scale slave raids spread, and greedy rulers were incentivised to find people guilty of absurd and petty crimes for which they could be sold into slavery.

They argue that the slave boom largely distorted institutional development, though in some places it led to powerful states based entirely on slave-raiding. Between 1690 and 1710, they say, the state of Oyo was responsible for 80-90% of the slaves sold from the "Slave Coast". As well as the millions actually exported into slavery, many millions more likely died in the inflated carnage of the slave wars. When Europeans abandoned the slave trade, African slave kingdoms simply employed their slaves within Africa, selling raw materials to the West. During the gap between the end of European slave-purchasing and formal European colonisation, slavery in Africa may even have intensified, and it persisted in many places into the 20th century. In response, some communities retreated from roads to avoid the raiding, cutting themselves off from wider markets.

4) Technological backwardness
Sub-Saharan Africa was very slow to embrace basic technologies like the wheel and the plough. The authors admit that they 'lack satisfactory answers' for the slow uptake of such technologies, but suggest that the lack of property rights meant that nobody would invest in a new technology that could immediately be grabbed by the king.

5) Colonisation
For starters, foreign rule of Africa until the mid-20th century prevented the kind of institutional development that helped Latin American countries slowly grow out of poverty.

More specifically, there were 'dual economies': an urban, developed, industrial society in the cities alongside a rural, undeveloped, agricultural life with communal ownership of land in the countryside. The former was relatively prosperous, the latter very unproductive. In places like South Africa these divisions were enforced by the European colonists, fearful of competition from thriving African rivals and looking for cheap labour.

The British policy of ruling through local chiefs inhibited the development of property rights, and deepened the absolutist tendencies among African leaders, while the setting up of government monopolies undermined indigenous capitalism. Infrastructure was designed to facilitate extraction, not development or public goods. Colonisation intensified notions of ethnic differences, while the new African states had arbitrary borders that cut across ethnic lines, fuelling future conflicts.

6) Post-independence 'neopatrimonial' leadership
Since independence, many sub-Saharan African countries have continued with the absolutist order that wrecked investment and innovation by ignoring property rights and vesting authority in individual leaders instead of institutions. The governments were characterised by 'relationships of loyalty and dependence', with corrupt officials looking to fill their pockets.

They give the fascinating example of Sierra Leone where the British conquered a multiethnic region, directing a railway into the south to guarantee rapid access to the particularly rebellious Mende ethnic group. By the 1960s, after independence, the railway was the channel of exports for Sierra Leone, especially from the Mende. When a new government dominated by northern ethnic groups took power, prime minister Siaka Stevens had the railway to Mendeland ripped up, and the land sold off, because the Mende had overwhelmingly voted for the opposition. This was deeply destructive to Sierra Leone's economy, but the insecure government would rather wreck the country than risk overthrow at the hands of the hated Mende.

Virtually no public goods were developed - television broadcasts ceased when the Minister of Information sold the transmitter - and Sierra Leone sank into desperate poverty. By 2002 the GDP had declined by about 40% from its level at independence.

Similar scenarios plagued other African countries. Few countries, the authors claim, could agree on political institutions to solve conflicts peacefully. (This reminded me of a point I was making just a few weeks ago about the power of modern liberal democracies to create political stability and to win wars. The regular elections and peaceful exchange of power we take for granted in Western Europe are very unusual by historical standards, and desirable! Instead of the winner-takes-all civil wars of the Roman Empire, modern politicians know that their shot at power may be only a few years of patient politics away.)

In much of sub-Saharan Africa the corruption that rewarded cronies and ethnic insiders with loot taxed from the rest, along with the brutal suppression of ethnic outsiders and political enemies, caused deep instability. Every side was desperate to seize power at all costs, because failure meant dreadful treatment. The result was chronic civil war.

Colonial chaos
The authors acknowledge that some of these harmful tendencies existed in Africa before colonisation, but argue that colonisation exacerbated them. The relative success of Botswana, which was largely neglected by the British colonial power, is evidence for this point. Elsewhere weak states, absolutist institutions perpetuating the power of corrupt elites, insecure property rights, and a failure to provide public goods all led to deeply dysfunctional economies.

The good news now, not discussed in this paper, is that sub-Saharan Africa seems to be finally clicking into gear with economic growth, democratic growth, and a modest reduction of poverty:
For the first time since 1981, less than half of its population (47 percent) lived below $1.25 a day. The rate was 51 percent in 1981. The $1.25-a-day poverty rate in SSA has fallen 10 percentage points since 1999. 9 million fewer people living below $1.25 a day in 2008 than 2005.
What has happened? I'd like to know if Acemoglu and Robinson's hypotheses still stand, and if sub-Saharan Africa is seeing some kind of shift away from the weak and perverted institutions of old. In any case, if correct, they do challenge my view that colonisation 'doesn't matter much' for modern wealth or poverty. But the simplistic blame-heaping on European colonists for every African ill seems also unfair.

Thursday, March 8, 2012

The Darwin Economy and school uniforms, or Arms Races Everywhere

Just a very short post to add to my last one about Robert Frank's The Darwin Economy. It occurred to me yesterday that another example of a rule to prevent an arms race is the school uniform.

Even at the time, from the ages of 12 to 18, I was aware of and generally grateful for the fact that the mandatory uniform freed us from pressure to wear fashionable and expensive clothes. My small-town school was no centre of fashion in any case, but the ordinary peer pressures of adolescence might have made us feel we had to spend money on expensive clothes to remain cool. Because the uniform placed us all on a level playing field, we didn't need to worry about it. Any competition shifted to the PE class, in which some boys would have expensive new football jerseys, and there I did sometimes feel embarrassed by my humbler gear.

This is interesting as a common-sense example of an intervention to prevent a kind of arms race, albeit at a modest level. I was content with the uniform for those vulnerable years of early adolescence, but I wouldn't have worn it voluntarily if it hadn't been mandatory, because that would have drawn derision from my peers.

Wednesday, March 7, 2012

The Darwin Economy and Islamic niqabs

I am reading Robert H Frank's The Darwin Economy, which argues that Charles Darwin's insights on competition between individuals for mates have deep significance for modern economics.

Briefly, Frank distinguishes between objective improvements and relative improvements. For example, an athlete might run very fast indeed, but it is his or her relative speed compared with a rival athlete that determines whether or not a medal is won.
In nature an example is the strangely huge antlers of bull elk, which compete with one another for female mates. Those with larger antlers, he says, tend to win these contests and mate more widely, passing their genes for big antlers down to their offspring. As time passes the antlers get bigger and bigger, because the small-antler genes disappear. So having huge antlers is a reproductive advantage for the individual male elk.

Frank points out, though, that huge antlers are also an impediment to movement, and increase the risk of an elk being caught by predators. Suppose all the male elk could agree to halve the size of their antlers overnight. The species would benefit, since mobility would recover and predators would be easier to escape, and the same individuals, with their relatively bigger antlers, would still win mates. Nothing would change in the contest, but the population as a whole would become much healthier.
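A minimal sketch of this thought experiment, with invented antler sizes: in a purely relative contest, halving every competitor's trait leaves the ranking, and so the mating outcome, untouched, while the collective burden falls.

```python
# Sketch of Frank's elk thought experiment (antler sizes invented).
# In a purely relative contest, scaling every trait by the same factor
# changes nothing about who wins, while the collective burden falls.
antlers = {"bull_a": 120, "bull_b": 95, "bull_c": 140, "bull_d": 88}  # cm

def mating_order(sizes):
    """Relative contest: bigger antlers beat smaller ones."""
    return sorted(sizes, key=sizes.get, reverse=True)

halved = {bull: size / 2 for bull, size in antlers.items()}

assert mating_order(antlers) == mating_order(halved)      # same winners
print(sum(antlers.values()), "->", sum(halved.values()))  # 443 -> 221.5
```

This is the sense in which the arms race is pure waste: the ranking is invariant under a collective cut, but any individual cut is punished.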

Frank gives lots of examples of relative well-being in society, where individuals cannot all improve because there is always some kind of hierarchy. When houses are tiny and cramped, a bigger house is a real benefit to the occupant. But beyond a certain level, bigger and bigger mansions (Frank says) add little more satisfaction for their owners. They keep building bigger houses only because they want to out-compete the other rich people. They want not only the material benefits of wealth, but also the status benefits of relative wealth, of being richer than everyone else.

In Frank's view, this often creates a kind of arms race as individuals compete for status or mates by pouring resources into something which gives no tangible material benefit, only status and relative benefits.

For Frank one solution is simply a collective rule (a law) which covers all individuals and prevents the arms race. For example, in ice hockey one study found that most players supported a rule making helmets mandatory, yet none of them actually wore helmets. Why the discrepancy? Because a helmet puts a player at a slight disadvantage to bare-headed players, whose vision is less obstructed. So long as any player goes without a helmet, all the others are at a relative disadvantage to him. As individuals they reject that disadvantage and go without helmets, but as a collective they want to keep their heads safe and so favour a universal helmet rule.

There is a lot to think about here, and I will probably follow other avenues on this book in later posts. Just one simple thought for now:

Might the Islamic niqab play a similar role to the helmet rule? That is, I wonder if women are in a kind of arms race to look as attractive as possible in competing for good male mates. Any individual woman who refuses to join in, who will not dress to flatter her body, may be out-competed by her rivals. But a universal niqab rule, forcing all women to cover up and hide their bodies, might level the playing field. Perhaps - I'm not sure yet - it is relative beauty that matters here too. By concealing their appearance, all women would be freed from the pressure to compete against one another for a mate.
Of course I don't support any law that forces people to dress in a particular way. But I'm also aware that despite complaints that the niqab is a sign of the oppression of women by Islam, some Muslim women do choose to wear it and some support laws to make it mandatory. If Frank is right, it seems plausible that some might welcome the field-levelling offered by a universal niqab law.

A final weird thought. Frank adds that sometimes actors will reject laws and agreements that try to regulate arms race situations, because they think the race is one they can win. Ronald Reagan pushed for military developments in the 1980s in the hope that forcing the USSR into an arms race would cause its economic collapse. So I wonder if some especially attractive women, who have been able to use their beauty to attract good mates, might resist niqab laws for the same reason. While less attractive women might favour a universal niqab law to level the field, the more attractive might not want to lose their competitive advantage. So in countries like Saudi Arabia, do more attractive women tend to prefer liberalisation while less attractive women tend to support universal niqab laws? Now that would be an interesting research project indeed.

Niqab image from Walter Callens at Wikipedia, elk image from קלאופטרה at Wikipedia.

Sunday, March 4, 2012

Building a wooden jewellery box

A friend, partly joking, asked me to make her a wooden jewellery box. On the back of my successful attempts to make wooden combs from yew and teak, I decided to give this a shot.

A box. This seemed a simple enough task, but I was limited by the wood available to me, most of which was either very plain softwood or large unshaped chunks of hardwood. This meant that instead of beginning my project with nice, tidy, machine-squared planks I had to saw short planks of my own, which turned out to be far from perfectly square. This would cause problems later.

I started, anyway, with elm (always a favourite wood to work with - very hard and very beautiful) and some unknown hardwood, perhaps beech.

One strange effect of the elm's toughness is that the butt end of the wood was blackened by frictional heat as I ran it through the bandsaw:
In school we learned to join two pieces of wood at 90° with various kinds of joints for added strength. I was wary of attempting these because they would be too much work! I hadn't cut such joints since I was 15 and didn't trust my ability to do them in rock-hard elm. Scared off the proper way of doing things, I was eventually inspired by the blackened, burnt butt ends to settle on simple nailed joints that would make a feature of this contrast in colours.
The immediate problem, seen in the first photograph, is that these nails sometimes chipped little bits off, causing frustration and angst galore on my part.

Meanwhile, simply trying to square these clumsily-sawn pieces of wood became a nightmare of sanding, filing and rasping. The slightest unevenness meant that the wood simply did not join together in a square shape. In the end I abandoned perfect squareness: the box could only fit together in one way, slightly askew.

I decided to build two layers: one smaller square box sitting atop a larger square box, each subdivided into four smaller sections. I took a shortcut by using hardboard (a thin board formed of compressed wood fibres) to make these compartments, and to form the base of the upper square. Because hardboard has an unremarkable grey-brown surface, I painted all of this black. I also used a dark wood stain to blacken the butt ends of the beech lower layer, mimicking the heat-blackened ends of the elm. This meant that throughout the piece I now had a nice tension between the lighter coloured wood and the dark sections.
I wanted to use a proper wood for the base of the lower layer. After a long time trying different kinds of wood, I discovered a chunk of beautiful bog oak and sawed two sections of this, carefully planing and sanding them smooth so that they could fit together to form the base. Bog oak is regular oak which has been submerged in a peat bog for hundreds or thousands of years. Considering oak itself can live for many centuries, the likelihood is that this wood is very ancient indeed, formed when Ireland was covered in vast oak forests. Perhaps no Normans had arrived yet, or Vikings. Perhaps the people who lived around that tree knew of Romans, or it may have predated even the arrival of Caesar to northern Europe - which is wonderful indeed. It also happens to be beautiful.
Finally there was the lid. Much more searching and head-scratching ensued until I found a piece of laburnum, a gorgeous hardwood with rich contrasts between light and dark. From this I sawed two pieces of wood and glued them together. For added strength I cut two pieces of bog oak and glued them across the grain. They immediately broke off, so I glued them again and nailed them through. I then had to file off the pointed ends of the nails to leave the underside flat.

The knob on the top is made of two small squares of laburnum (I think), glued together and screwed in from beneath.
The whole thing was very vigorously sanded smooth. I finally added a layer of sanding sealer and clear satin varnish. Finished, after several months of on-off labour:
Rustic stuff and, with neither hinges nor locks, not suitable for more expensive jewellery! But I find the woods and contrasts in light and dark pleasing so I am pretty content with this.

Friday, March 2, 2012

Cat piss, Kenny, and not thinking of mischief

I forgot to mention in my last post that if I'm right that it had simply never occurred to my Japanese students to vandalise their school - and that's why they never did it - it could challenge the modern liberal consensus on how to deal with other issues like underage sex or drugs. That consensus says that teenagers are going to be exposed to sex and drugs sooner or later, so it is best to educate them early by giving them honest information, and trusting them to make the wise decision.

That may be the best option of a bad lot, but I wonder if such education could actually give children and adolescents ideas about harmful things they could do, things which would never have occurred to them otherwise. The classic fictional example comes from an episode of South Park in which counselor Mr Mackey warns the children against doing drugs:
Mr Mackey: Schoolchildren are often experimenting with dangerous ways to get high, hm'kay, like sniffin' glue, guzzlin' cough medicine, huffin' paint, hm'kay? But they're all bad. M'kay?
Butters: Mm-my cousin's in Florida, and said kids in their school get high off of cat pee.
Cartman: Cat pee?
Stan: That's not true. You can't get high off of cat urine, can you?
Mr. Mackey: Well, it's a it's not actually cat urine, but male cats, when they're marking their territory, uh spread concentrated urine to fend off other male cats and... a-and that could get you really high. M'kay? Re-really reeeally high. Okay? Probably shou-shouldn't have told you that just now. Hm'kay? Tha, that was probably bad.
Later that day, the boys try inhaling cat urine, leading to Kenny developing an addiction:

Because of course many adolescents defiantly resist making wise decisions, even when given full information about consequences. Prof Laurence Steinberg of Temple University, Philadelphia, wrote in 2007:
According to this view, the temporal gap between puberty, which impels adolescents toward thrill seeking, and the slow maturation of the cognitive-control system, which regulates these impulses, makes adolescence a time of heightened vulnerability for risky behavior....

Given extant research suggesting that it is not the way adolescents think or what they don’t know or understand that is the problem, a more profitable strategy than attempting to change how adolescents view risky activities might be to focus on limiting opportunities for immature judgment to have harmful consequences. More than 90% of all American high-school students have had sex, drug, and driver education in their schools, yet large proportions of them still have unsafe sex, binge drink, smoke cigarettes, and drive recklessly (often more than one of these at the same time; Steinberg, 2004). Strategies such as raising the price of cigarettes, more vigilantly enforcing laws governing the sale of alcohol, expanding adolescents’ access to mental-health and contraceptive services, and raising the driving age would likely be more effective in limiting adolescent smoking, substance abuse, pregnancy, and automobile fatalities than strategies aimed at making adolescents wiser, less impulsive, or less shortsighted. Some things just take time to develop, and, like it or not, mature judgment is probably one of them.
Steinberg distrusts adolescents' ability to make good choices even with perfect information. But beyond limiting opportunities to engage in risky behaviour, it might - I'm really not sure - be useful simply to avoid discussing some risky behaviours at all. Supposing I had said to my disruptive Japanese students who never vandalised anything: "Right, boys, I'll be back in a few minutes. Just one thing. Absolutely DO NOT VANDALISE this room. That's the only thing I'm saying to you: DO NOT BREAK OR DESTROY ANYTHING here. Understand?"

I can imagine a light going on in their minds. Ding! Let the vandalism begin.

Thursday, March 1, 2012

Good taboos and the preservation of order

I was listening to a BBC Analysis podcast from Radio 4 today which discussed a strange experiment, explained below by Ramsay Raafat, from University College London (my emphasis added):
In the first study, an area where bikes were parked, the order condition had a prominent ‘no graffiti’ sign and a flyer was attached to their bikes. When the riders or owners returned to their bike and they had a decision as to what to do with the flyer essentially, only 33 per cent of the people chucked the flyer and got rid of it. Littered; broke a norm. Now when there was a slight manipulation. Everything’s the same - we have our bike shed, bikes, prominent ‘no graffiti’ sign - but now there’s graffiti in the area, so a norm has been violated. Now, interestingly, in this situation a whopping 69 per cent of the riders when they returned chucked the flyer. And so in this instance when one norm’s violated - the graffiti violation - there’s a massive effect on another norm of littering....

They had a post-box. Sticking out of it was an envelope with a five euro note attached. Now in the ordered condition - no litter and no graffiti - only 13 per cent of people stole, took the envelope. However, when there was graffiti or litter, a whopping 25 per cent or 27 per cent of people stole. That’s more than a doubling of norm violation.
The indication that one norm had already been violated not only encouraged people to join in with that violation, it also weakened other norms. I was immediately reminded of the collapse of Ancient Roman taboos, which saw the sanctity of legitimate republican government replaced by a rush of ambitious generals and corrupt plutocrats rebelling or murdering their way to the throne. 'If Sulla could do it, why can't I?' In this case the BBC narrator jokes that the loss of legitimacy by British members of parliament following the expenses scandal might weaken other social norms: 'Today expenses; tomorrow your hubcaps.'

Less dramatically, though, it touches on a constant problem here in Ireland, where littering is very widespread. Walk tiny, one-lane roads in the heart of the countryside and there are still drink cans and crisp packets flung into the grassy verge. Any taboo on littering is weak, long-ignored. Vandalism is also fairly widespread; just the other day I spied the skeletons of two utterly ruined bicycles still chained to a bike rack outside my local train station.

This experiment has a strange and fascinating implication: if the authorities very quickly sweep away litter and paint over graffiti, people will tend to become more law-abiding and respect these norms. The defence of minor norms and taboos could reduce the violation of more serious ones: clean the streets to cut crime.

Just anecdotally - I don't have any data for this - I do think that Dublin City has become tidier in recent years as street-cleaning machines have been employed to keep the litter down. How I'd love to see if this defence of public cleanliness has had any impact on other crime rates.

Good idea, bad idea
It might have another implication, however. When I taught English in the high school of a small Japanese town I had some disruptive and aggressive teenage students. The worst of these were quick to interrupt classes, pace about the room, intimidate other students, shout insults at new teachers and snigger jokes at my expense in Japanese. These were troublesome boys indeed.

But one thing they never did, though they had ample opportunity, was vandalise school property. Every morning the entire school staff held a meeting in a building separate from the classrooms. The students, for nearly half an hour, roamed the classroom building by themselves without supervision. When we teachers arrived there would be graffiti all over the desks, but in pencil: I occasionally saw students erase their own graffiti, leaving the desks immaculate.
Knowing that we had some quite disobedient and disruptive students, I was always amazed by this respect for school property. There could be a host of reasons for it, including the fact that the students themselves helped to tidy the school every evening. I thought, though, that the real reason students did not vandalise the school was that it had never occurred to them. There was no existing culture of vandalism and littering to give them the idea.

The taboo there was really just innocence, born of the absence of this dysfunction, and perhaps things would have degenerated if my students had been exposed to a culture of vandalism and recreational property crime. Just as in the experiment above, signs of property abuse might have normalised the abusive behaviour. In my little Japanese town, the ubiquity of tidy order in the streets and in the school meant that disorderly vandalism simply didn't occur to my disruptive students as a behaviour worth considering.