Monday, April 26, 2010

"When we grew up and went to school..."

Roger Waters described brutal, abusive teachers hammering their students in Pink Floyd's The Wall:

When we grew up and went to school,
There were certain teachers
Who would hurt the children any way they could.
By pouring their derision upon anything we did,
Exposing every weakness, however carefully hidden by the kids.

But in the town it was well known
When they got home at night
Their fat and psychopathic wives
Would thrash them within inches of their lives

Perhaps when Waters was in school back in the 1950s this was how it was: abusive teachers battering the children. By the time I was growing up things were shifting back the other way. The teachers were being rendered impotent as their ability to discipline children was gradually removed by frantically protective parents.

When I taught teenagers in Japan I was amazed at the weakness of the teachers' arsenal in dealing with student disruption. I had classes where we teachers stood by with fixed smiles, practically waiting for the disruptive students to decide to stop messing about before we started to teach. The class was dictated by the bullies, not the teachers. Meanwhile the disciplined students were sitting there, doing nothing, morale and interest slipping away.

My Japanese students were mostly harmless enough and the disruption tended to be pretty innocent. How such a tolerant system would work in a rougher school I can't imagine. But I understand that teachers are sometimes placed under immense strain to somehow control the disruptive students without resorting to abuse, while still engaging the brightest. I also know that some students delight in tormenting teachers, safe in the knowledge that their constant, low-level abuse isn't enough to provoke any serious response.

And then, this happened...

Nottingham Crown Court was told students were filmed calling Mr Harvey a "psycho" moments before the attack... after a girl with behaviour difficulties started being disruptive it was alleged he kicked her, Mr Rafferty said.

He added: "She left the classroom in a state of tears and some of the class took exception to the way she had been treated and started calling him a psycho.

"He didn't seem to respond to that and told the class to get on with their work."

The schoolboy victim then started waving a wooden metre rule and a metal Bunsen burner about in "high jinks" before he was attacked, the court heard.

Mr Harvey chased him round the classroom and the boy swore at him, the jury was told.

Mr Rafferty said: "That seems to have lit the blue touch paper because Mr Harvey grabbed him by his collar and started dragging him out of the classroom.

"He threw him to the ground and armed himself with a 3kg dumbbell and began to hit the boy about the head with it.

"He struck at least two blows to the head which caused serious injury, really serious injury.

"At the time the blows were being struck Mr Harvey was only heard to say one thing.

"What he was saying was 'die, die, die'."

I guess this teacher was emotionally unstable to start with. But his mad burst of violence is easy to understand as the final revenge of a bullied victim. The tables are turned in many schools now, and when teachers are denied legitimate, safe ways to discipline students some kind of unhealthy reaction (confrontation with or capitulation to the disruptive) may ensue.

Blame the Jews. (They're probably guilty.)

I'm used to popular conspiracy theories blaming Israel and "Zionist" Americans for terrorist attacks in Iraq, Pakistan, Palestine, Iran and so on. Only this January the Iranian government blamed Israel and the US for a bomb attack that killed a physicist in Tehran.

This excuse for internal violence is trotted out a suspiciously large number of times, which throws into doubt how often, if ever, it reflects reality. Still, countries have often supported internal dissidents in enemy nations, so it may sometimes be plausible.

But I have just seen it pushed further than ever before, on a discussion forum where a Pakistani member had this to say:

"Israel is the cause and reason for 95% of violence and terror in the world."

Ninety-five percent! Israel has a population of around 7 million. The single Pakistani city of Karachi has a population of around 12 million. Yet the titchy little Israeli nation is supposedly behind the crimes of almost the entire combined human population (6,697,254,041 at the latest estimate), its evil extending all around the world!

In the same post the member remarked that anyone who supports Israel must be "brainwashed". Eh...

Tuesday, April 20, 2010

Adolescent Alienation, Modernity, and Revolution

Modern life can seem vulgar and purposeless, and violent radical movements use this to draw disaffected youngsters to their causes.

It can be easy to feel isolated from modern life, unmoved or confused by the race to accumulate wealth and property. There is a sad vulgarity to the supermarkets glowing with pointless plastic products, the constant bombardment of advertising companies selling unhappiness while simultaneously inflating a sense of unearned privilege.

Modern life can appear threatening and meaningless. Old communities dwindle and disappear. Old languages, religions and cultures collapse while the young turn to a global consumerist consensus. Security declines; brutal teenage gangsters fill the newspapers, stabbing and bludgeoning the weakest in society to satisfy their desires for the things the advertisers say they need. Urban society desperately separates itself from nature, from traditional life and the ideas that sustained millennia of human civilisation. The stupidest and weakest seem to dominate while the honourable are despised. Life gets worse.

Yet this sense – of imminent collapse and the triumph of evil in modern times – seems to affect every generation. Here Biblical King David talks of the success of wicked men he saw around him, 3,000 years ago:

His ways are always prosperous;
he is haughty and your laws are far from him;
he sneers at all his enemies.
He says to himself, "Nothing will shake me;
I'll always be happy and never have trouble."
His mouth is full of curses and lies and threats;
trouble and evil are under his tongue.
He lies in wait near the villages;
from ambush he murders the innocent,
watching in secret for his victims.
- Psalm 10:5-8

Pope Pius XII thought urbanisation was the corruption of his generation, writing in 1946:

It holds up before the dazzled eyes of the country worker the bait of money and of a life of pleasure, in order to induce him to abandon the land and waste in the city, which mostly brings him merely deception in not only whatever savings he had laboriously accumulated, but also frequently health, strength, joy, honor, and life itself.

And Robert Pirsig thought he saw the rot setting in back in 1974, describing a worthless and shallow culture praying to the empty altar of advertisements:

Along the streets that lead away from the apartment he can never see anything through the concrete and brick and neon but he knows that buried within it are grotesque, twisted souls forever trying the manners that will convince themselves they possess Quality, learning strange poses of style and glamour vended by dream magazines and other mass media, and paid for by the vendors of substance. He thinks of them at night alone with their advertised glamorous shoes and stockings and underclothes off, staring through the sooty windows at the grotesque shells revealed beyond them, when the poses weaken and the truth creeps in, the only truth that exists here, crying to heaven, God, there is nothing here but dead neon and cement and brick.
- Zen and the Art of Motorcycle Maintenance, Robert Pirsig

When I was younger I also felt some of this contempt for and disconnection from modernity, but over the years I slowly realised that for all its spiritless materialism, modernity is actually a miracle of peace and prosperity. We are a privileged generation, living without fear of smallpox, starvation or foreign oppression. Technology has liberated us from basic survival concerns: we can fly to the far ends of the earth in hours and communicate with them in seconds. Racism, sexism and homophobia have become publicly unacceptable, and anti-war marches hint at a growing sense of common humanity. Worth defending, all that.

But for a while I sensed decline and decay, and dealt with it quietly by myself. Since I never became part of any wider reactionary movement I was surprised when I later discovered many political and religious ideologies trying to tap into this youthful arrogance, this naïve dismissal of modernity. Jon Ronson's documentary Them: Adventures with Extremists showed violent Islamist preacher Omar Bakri Muhammad explaining how different Britain will be once the Islamists take over.

So I drove Omar into town by a route that avoided Soho. We passed a poster advertising the Spice Girls' debut album.
"Such a very stupid thing," mumbled Omar. "Spicy Girls."
"What will become of the Spice Girls when Britain is transformed into an Islamic nation?" I asked.
"They will be arrested immediately," he replied. "They will not even be existing in an Islamic state. OK. We can go on. Turn right at the lights."

I watched this and thought: ‘Ah!’ Here was my own adolescent contempt for popular culture in all its shallow catchiness, harnessed by a vicious fundamentalist politics.

It was a revelation, and I wondered how many of the young Islamist bombers, and the nationalist assassins and socialist rioters around the world were just fairly intelligent and alienated young men reacting against the dumbest edge of modernity they saw growing around them.

For this revulsion is voiced by all kinds of extremist movements, be they Pagan neo-Nazis, radical feminists, eco-terrorists, conservative Christians, Communists or Satanists. The radicals all see the rot, and they offer wild solutions, often bloody solutions.

Meanwhile modernity chugs along in all its mundane immorality, still producing wealth and health, still rejecting discrimination against minorities, still, in fact, improving most of the world.

Modernity is unheroic and unromantic, but I see a noble role there, in defending the ignoble absurdity of modern life. Unexciting too, but it may be worthwhile to stand up for normality with all its bullshit plastic ads, shallow sex-obsessed pop videos and Pirsig's hellish 'dream magazines'. It's not worthless, all this – it just sometimes feels it.

Thursday, April 15, 2010

The media is biased, but not how you think

News media can twist the truth by accident, simply as a consequence of its structural limitations. Here are a few ways that news can lie:

1) Sensationalism, and the need to fill space
News editors want stories, and journalists are under pressure to supply them. Sometimes, when the stories are not forthcoming, journalists may feel pressured to exaggerate them, or even to manufacture conflicts that did not really exist.

In 2007 I was freelancing at a Dublin newspaper when news arrived that small boats carrying 110 children had capsized in a sudden squall at sea. I was rushed out to the harbour to find out what had happened.

The harbour was crawling with media, all interviewing the perfectly safe and happy children. Some children explained that capsizing is extremely common, though usually it doesn't happen to everyone at the same time. The event had been carefully monitored by rescue teams so nobody was seriously hurt.

When I got back to the office I found my editor expecting a major story; with no other stories to take its place, I had to tease out an uncomfortable article describing what was essentially a non-event. So many other journalists had invested an afternoon investigating this dead end that the capsizing gained totally disproportionate attention. The Irish Independent called it a 'near-disaster' and wrote:

A 'SMALL craft' warning from Met Eireann was ignored by race organisers who allowed more than 100 children to take part in a regatta which could easily have ended in tragedy.

Looking desperately for a news hook, they settled on blaming the race organisers for risking the children. Of course the event did not end in tragedy, and the children we interviewed were cheerful and amused at the attention - but some kind of angle was needed to justify the time spent reporting on it.

Another example is this astonishing RTÉ coverage of new Irish crime statistics:

There has been an increase in the number of murders in the first three months of this year when compared to last year, according to CSO crime figures.

A total of 16 people were murdered between January and March, one more than for the same period last year, while murder threats were up by over 60%.

Robbery, hijacking and extortion offences rose by a quarter, while there were also increases in burglary and theft offences. There were decreases in fraud and sex offences.

They lead with the worst of the bad news, leaving the good news until the end. Yet the official statistics give a more complex picture, of both rising and falling crime rates:

Controlled Drug Offences decreased by 17.2%. There are also notable decreases in Group 03 Attempts or Threats to Murder, Assaults, Harassments and Related Offences (-17.7%), Group 04 Dangerous or Negligent Acts (-29.2%) and Group 15 Offences against Government, Justice Procedures and Organisation of Crime (-32.0%).

The official statistics include a graph showing that homicide rates were quite a bit lower in spring 2010 than in spring 2008, down from around 122 to 78 - an impressive improvement. The real problem area is burglary, which has dramatically increased; RTÉ buries this news at the end of a sentence.

By going for the most sensational, negative figures, RTÉ gives an inaccurate view of crime trends in Ireland.

2) Sub-editors
Journalists don't usually write their headlines. The piece readers see is edited by sub-editors, to make it fit the limited space in the newspaper. The sub-editors also write the headlines, based on their interpretation of the story. Sometimes sub-editors might misunderstand the piece or, by choosing the most sensationalist and striking aspects of it, change its direction.

(When an article does begin with a sensationalist headline, it is sometimes worth checking the first sentence or two before dismissing it, as the article itself may be more measured than its headline.)

Kevin Myers had this controversial article in the Irish Independent, entitled Africa is giving nothing to anyone - apart from AIDS.

Days later Myers wrote again:

I was sure that my column would arouse some hostility: my concerns were intensified when I saw the headline: "Africa has given the world nothing but AIDS." Which was not quite what I said -- the missing "almost" goes a long way; and anyway, my article was about aid, not AIDS.

So some sub-editor had altered the tone of his piece to make it stronger. The reader, however, only sees the name of the journalist and blames him for the exaggeration.

3) Consensus, Friendship and Community
When journalists mix only with people of similar political views they may begin to identify with a particular political stance. This can create complacency: dissenting views are dismissed out of hand, and those making alternative arguments are assumed to be brainwashed and ignorant.

This political identity can influence what a journalist believes is a good news story, and also what questions the journalist asks interviewees.

I sometimes found myself asking one question in interview after interview: "do you think the government should do more" to combat some social ill? To me, and presumably to the large number of journalists I see still using it, it seemed innocuous and obvious. But it was indicative of my assumptions on how society and politics work. Only later, when I began to read right-wing and libertarian perspectives on politics, did I begin to ask another question: "do you think the government should do less?"

The political consensus among journalists is easiest to spot when it differs from one's own views. This probably explains why so many radicals complain about the media. To the BNP, the British media is liberal and politically correct. To communists, it is neo-liberal and xenophobic. To Islamists, it is Zionist and anti-Muslim.

4) Topicality sparks panics
Major news stories seem to spawn more stories about similar topics. In 2002 two girls were abducted and murdered in England, a huge story at the time. Around that period more faintly connected stories began to pop up: a stranger in a car approached some children and, even though nothing happened, it made the news. It felt for a while as if the threat of child abduction had soared.

Of course it had not. The Holly and Jessica case put child abduction into the limelight. Editors looked for stories that were "topical". Perhaps ordinary people also reported suspicious events more, running on the briefly increased fear created by the Holly and Jessica case.

By focusing on topical, related stories, news media exaggerated the topical threat.

Coverage of terrorism may have exaggerated the threat and the sense of worsening terrorism. There were far, far more terrorist attacks around the world in 1992 than in any year since. This is not the impression one gets from reading newspapers in the post-9/11 age.

Sometimes the debate about such panics can cause them to worsen. Recently, for example, there has been debate over the causes of child sexual abuse by Catholic clergy. Some liberal commentators blamed Catholic celibacy. Some conservative commentators blamed Catholic tolerance for homosexuals in seminaries after the liberal 1960s.

Both arguments take for granted the idea that Catholic priests abuse more children than non-priests, which is completely unproven. The debate promoted the idea that Catholic clergy are more inclined to abuse children than others; a 2002 poll in the US found that 64 percent of those queried thought Catholic priests "frequently" abused children. Those people were deeply misled.

Today the "paedophile priest" has become a stock character, turning up even in the great Irish comedy Father Ted ("Fup off, ya pedrophile!"), though a priest is no more likely to abuse children than a dentist, farmer or teacher.

5) Convenient sources
The journalist working to a tight deadline may tend to search for the easiest sources he or she can find. Some politicians are good at giving concise quotes and are usually available to talk. Others ramble on or are difficult to track down.

Even if the inconvenient source is the more relevant one to a story, journalists sometimes end up contacting the more convenient one instead. I used to find myself constantly phoning one particular Opposition spokesman who was always friendly and helpful. It was just convenience for me - to the reader it may have smacked of bias.

Sunday, April 11, 2010

My very own bullshit

There is a problem in news media, a problem that this blog made clear, quite by accident, over the last few months: the production of credibility where none is deserved.

Back in March I wrote this piece, a serious list of suggestions for readers who want to avoid being misled by writers with an agenda to push.

Of all the blog posts on The Harvest, this is the one that provoked the most interest and support. Previous posts had attracted comments from friends; this one expanded beyond that group – I was beginning to convince strangers.

This was the result I’d hoped for, and set out to achieve by deliberately using authoritative language. I avoided the first person and made each point with straight-faced confidence, as though each claim was objective fact and not just an opinion of my own. Previous blog posts where I had written in the first person seemed to attract less interest and, to me, lacked authority.

Hmm. My very first warning on how to spot bullshit urges readers to be wary of ‘claims made with confidence but without evidence’, yet here I was phrasing opinion in the third person, implying utter confidence, as a way to sway readers.

Still, at least these opinions on how to spot bullshit are a bit obvious and self-evident. Two days ago I pushed things forward quite a bit with a dismissive rant about the decline of ‘high art’. I gave examples of beautiful early 20th century art, followed by horrible mid-late 20th century art that seemed designed merely to shock and annoy. I concluded that modern artists had lost their way, and that ‘low’ artists had compensated for this decline by producing powerful and ambitious pieces.

Check rule number two on how to spot bullshit: ‘Anecdotal evidence can disprove a generalisation, but cannot prove it’. Right, yet here I was, using a tiny number of examples of art belonging to different eras to imply massive-scale shifts in how art is produced. I was breaking my own rules, flagrantly.

In part, this was again a deliberate choice. The post on modern art was inspired by a trip to the Irish Museum of Modern Art a few weeks ago. I entered, looked around for ten baffled and bored minutes, and left. For years I have been going to modern art galleries, hoping to be surprised and inspired, and again and again walking away in disgust. I’d always assumed that there was something there that I was simply too ignorant to grasp – something that would reveal itself to me with time. This assumption was beginning to slide, though, and I increasingly wondered if modern art was just the Emperor’s New Clothes: vacuous, but supported by people pretending to be intelligent enough to enjoy it.

I thought of writing this in my blog, but realised again that the first person opinion piece would lack authority. The internet is full of people writing about their feelings and opinions. The first person would (correctly) render my piece no more authoritative than any other blog on the net.

So, partly as an experiment, I decided to write with absolute confidence, write as though I had decades of art history behind me, as though the few examples I showed were just a few out of thousands more I could reference.

It was bullshit: tidy, compelling bullshit.

Yet there was nothing unique about my approach. On the contrary, a great many commentators use this confident third person approach to writing what are in fact their own opinions; great, well-respected journalists and editors use it all the time.

To use an example of a commentator I respect and enjoy, let's look at British reviewer Charlie Brooker's show Screenwipe. Brooker's look at how television functions is witty (I think) and insightful, but the absolute confidence he applies when interpreting television makes his opinions appear to be fact. In this episode he explores the purpose of television advertising, and how it strives to manipulate viewers.

The language of advertising is an art in itself, it has to walk a fine line between exaggeration and fact, implying here, suggesting there, and leaving you with a sense that more has been said than has actually been said.

Brooker then goes on to suggest a few ‘key phrases’ which advertisers use to exaggerate or mislead viewers, like ‘help’, e.g. a particular product helps to reduce wrinkles. This is useful, he says, because it is non-specific. It might not help very much, but it will help… a bit.

What Brooker does here, in providing viewers with ways that advertisers may try to mislead them, is close to what I was trying to do with my detecting bullshit post. It is fascinating and everything he says sounds immensely convincing to me.

None of it is backed up by statistical evidence, though. I think Screenwipe is wonderful, entertaining and useful. But I think one should be careful even in watching this great show; Brooker is giving his opinion, which rocks, I reckon, but it is still mere opinion, delivered in the third person as fact.

All this leaves journalists in a bit of a fix. The most honest thing is to be constantly open about what is opinion and what is fact. This might mean using ‘probably’, ‘maybe,’ and, worse, ‘I think’.

Yet even comparing this blog post with previous ones written in the third person, I find this one irritating and weak. This constant use of ‘I’, the repeated caveat of ‘I think’ and ‘maybe’, throws my arguments into doubt and irrelevance. I have become another complaining blogger… instead of an authoritative writer! Imagine how the great quotes of history would be written today if the authors were careful to indicate the difference between opinion and fact, and checked every claim with disclaimers:

‘Let the ruling classes tremble at a Communistic revolution. The proletarians have nothing to lose but their chains. They have a world to win.’
- Karl Marx

This becomes:

‘Let the ruling classes tremble at a possible Communistic revolution. The proletarians have nothing except recent advances in technology and medicine, growth in income, the end of slavery and increased individual freedom to lose. Oh, and their metaphorical chains. They have a world to win, assuming it all works out and doesn’t backfire in some way I don’t yet foresee.’

‘Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety.’
- Ben Franklin

‘Those who would give up essential liberty, to purchase a little temporary safety, possibly deserve neither liberty nor safety. I mean, I suppose there is a reasonable middle ground here. I’m not saying we should leave the front door unlocked just because we like the freedom to wander out without a key. But you get me.’

‘You ask, what is our policy? I can say: It is to wage war, by sea, land and air, with all our might and with all the strength that God can give us; to wage war against a monstrous tyranny, never surpassed in the dark, lamentable catalogue of human crime.’
- Winston Churchill

‘Some of you might be asking, what is our policy? I can say: It is to wage war, by sea, land and air – if necessary – with all our might and with all the strength that God (assuming there is one) can give us (assuming God is on our side, which is anyone’s guess); to wage war against a monstrous tyranny (one of many, but we’ll ignore the others for now), never surpassed in the dark, lamentable catalogue of human crime – except maybe by the Mongols, Aztecs, the present Soviet Union, quite possibly some of our own colonial states in the recent past, and so on. But we’ll fight them anyway. Are you with me?’

It’s horrible stuff. Yet this method is the more honest of the two (I think), and the more accurate.

For me, I think I’ll try to be clearer about what is fact and what is opinion: reserve this authoritative voice for statements of fact, and lapse back into the first person when I can’t. This is the better path. (Maybe. I think.)

I, Hobbit

JRR Tolkien's brilliant The Hobbit describes an important scene where Bilbo Baggins, fat and complacent middle-aged hobbit, speaks with Smaug, ancient and murderous dragon. Smaug was an old fighter and confident against dwarves, elves and men - and mocked Bilbo's attempts to debate. There was one gap in Smaug's knowledge, however:

But he did not tell Bilbo that there was one smell he could not make out at all, hobbit-smell; it was quite outside his experience and puzzled him mightily.

Since 2005 I have been debating on online international relations discussion forums, and often I have felt like Bilbo. I find myself surrounded by Pakistani nationalists on occasions, men and women long accustomed to arguing with Westerners and Indians, yet sometimes they seem puzzled by me.

I'm not a nationalist, but my Irish citizenship rewards me here, for it shoves me (as being a hobbit shoved Bilbo) outside their experience, outside a simple categorisation as "Western". When they complain of Western imperialism I can remind them that Ireland was colonised for centuries longer than India-Pakistan, and had been butchering (and mainly being butchered by) our British brethren since the 12th century.

I laugh a little saying this, because it is also clear that Ireland and the UK today have generally very good relations, and the two are culturally very similar. Ireland is another liberal democracy in Western Europe and a member of the EU, but its avoidance of war also helps me avoid the anger directed at the invaders of Iraq and Afghanistan. I find myself vastly outnumbered on some of these forums but, if not quite welcomed, tolerated.

Saturday, April 10, 2010

Modern art is BULLSHIT

That's not accurate; actually, this modern art is human shit: tinned human excrement produced in 1961 by artist Piero Manzoni. A single can was sold in 2007 for 120,000 euro. By the mid-20th century not all art was shit, but shit had become art.

The 20th century started off so well, not least here in Ireland with John Lavery's many paintings of his beautiful wife:

Or the fantastical frenzy of Harry Clarke's illustrations:

Clarke's images betray an obsession with story-telling; each image is bubbling with tension and motion. Things happen in Clarke's works, stories unfold.

In the early 20th century artists were pushing the envelope, testing the limits of art by expanding into new styles. We end up with gorgeous architecture like the Empire State Building and the Chrysler Building. The French fashion magazine La Gazette du Bon Ton published exquisite images of modern fashions, heavily influenced by classical styles.

Classical composers began feeling for the edges too, creating challenging works like Bartók's Sonata for Solo Violin from 1944 or this menacing piece from Shostakovich's Eighth Symphony, written at the height of World War II's carnage - Russia, 1943.

Something happened later, as artists began to run out of new spaces to expand into. By the end of the century extremely experimental works had challenged the definition and purpose of art. Criteria for judging art were whittled away: it no longer needed to be beautiful, moving or entertaining. It need not even be recognisable. This piece, for example...

...Gustav Metzger's Recreation of the First Public Demonstration of Auto-Destructive Art, featured a bag of rubbish which a cleaner at Tate Britain assumed really was rubbish, and dumped.

For many people, this new art was alienating and ugly. Some of it is deliberately so, pushing to arouse disgust or annoyance in its viewers. Andres Serrano's Piss Christ is a photograph of a crucifix submerged in a glass of the artist's urine.

Anything could be art in this new age, but when so much tried to shock and disturb it grew wearying, because repeated shocks simply desensitise the viewer. Art involving corpses, faeces, urine and blood can only shock for a few years - and with more people growing up on violent action and horror films the strength of that shock is ever-decreasing, rendering offensive art ever more boring and worthless.

So throughout the 20th century high art pushed away from beauty and excitement, ceding this ground as it drifted into absurd, boring irrelevance. But the demand for beauty and purpose had not disappeared. The late 20th century saw it being satisfied, though not by the great highbrow artists. Instead lowbrow genres became ever more ambitious, expanding into the territory abandoned by the elites.

Rock music
It started as dance music, rocking around the Christmas tree in blue suede shoes. By the 1970s Pink Floyd were taking rock incredibly seriously, using it to look deeply at human life. Like the high artists, Floyd dealt with alienation and madness, but Floyd's music was beautiful, moving, exciting - they offered narrative instead of noise and, in the roared chant 'TEAR DOWN THE WALL', a solution.

With much classical music drifting off into untouchable academic dullness, bands like Pink Floyd filled the space by creating intelligent, ambitious music that was still accessible to millions of ordinary people. Populist film composers like John Williams also satisfied, by creating memorable and thrilling themes to Jaws, Star Wars, Superman and so on.

Superhero Comics
Derided as childish for their bright colours and simplistic plots, superhero comics had by the 21st century drifted into much deeper territory. Alan Moore's 1986-87 Watchmen explored fascism, Cold War paranoia and moral relativism - yet it did so through gorgeous, compelling artwork and a driving plot. Watchmen is fascinating, entertaining and serious. Works like this in the 1980s led to the rebirth of Batman in the 2000s as a serious character in two ambitious, exciting but also thoughtful films.

In Japan astonishing animated films by directors like Hayao Miyazaki (creator of Spirited Away, Princess Mononoke and My Neighbour Totoro) united beauty and narrative with deeper themes of environmentalism, religion and feminism. Katsuhiro Otomo's 1988 film Akira gave a dark, sumptuous depiction of a futuristic, corrupted Tokyo.

The Matrix Trilogy
pushed a new kind of superhero. These films made viewers question their assumptions about reality, while entertaining them with stunning action, a mind-bending plot and flawless cinematography. This, apparently, is "low art", while crapping into a can is "high art".

Computer Games
Always growing more realistic and more fantastic, a single game can give hundreds of hours of pleasure, while Piss Christ gives a few seconds of bored disgust. But the game - right at the cutting edge of technology and human innovation - is low art, remember, while a photo of a crucifix in urine is high art...

The growth of great popular arts over the last few decades undoes a lot of the damage done by the decline of high art. If the elites wish to continue pissing people off, they may. The better popular artists will fill their place with accessible, meaningful and beautiful music, cinema, literature and illustration. Art is not lost; only the artists are.

This post has attracted far and away the greatest number of page views on The Harvest blog. So it might be useful to point out that this aggressive and dismissive style was a bit of an experiment. Read here for an explanation!

Friday, April 9, 2010

Generalising about Europe hides the truth again

I blogged before about the danger of generalising about "Europe" or "the West", since these collections of states hide major internal diversity. I am currently reading a report by the professor of economics Glen Whitman on the role of the free market in health innovation. Whitman was concerned that reform of the American health system would damage the market's ability to develop innovative drugs and medical devices.

He may be dead right - I'm not sure yet - but I was drawn to one of his pieces of evidence, which seemed weak. Whitman, who believes that the stronger government control of health systems in European countries blocks innovation, pointed out that the entire EU plus Switzerland (total population 499 million people) produced fewer of the 30 leading medical innovations since 1975 than the US (total population 307 million).

This looks convincing at first. But why does he lump all these countries together? The report lists every one of these 30 great innovations. Some are listed as having more than one country of origin, so I take the simplest method of calculation: counting every reference to a country as one innovation for that country, even when the innovation was partly developed elsewhere.

The results show that the US was at least one of the source countries for a huge number of innovations: 22 altogether.

But there is something strange about the result. Whitman includes all the EU and Switzerland, but the only EU countries on the list are Germany, UK, France and Sweden. So when he judges the entire EU (plus Switzerland), he's including huge populations in eastern Europe, Italy, Spain, Ireland, etc. which aren't on the list at all.

If we look at the specific countries that are on it, in terms of great innovations per 100 million population, our understanding completely changes.

US: 7.16
UK: 13.11
Switzerland: 26.3
Sweden: 21.7

So while the EU as a whole appears to score poorly, specific countries within it far outperform the US.
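The per-population rates above can be recomputed in a few lines of Python. The US count of 22 is stated in the post; the other counts and the populations (approximate late-2000s figures, in millions) are my own back-calculations and assumptions, not data taken directly from Whitman's report:

```python
# Recompute the innovation rates quoted in the post.
# Counts: US is stated in the post; UK, Switzerland and Sweden are
# inferred from the quoted rates. Populations are rough late-2000s
# figures in millions - all assumptions, not Whitman's own tables.
innovations = {"US": 22, "UK": 8, "Switzerland": 2, "Sweden": 2}
population_millions = {"US": 307, "UK": 61, "Switzerland": 7.6, "Sweden": 9.2}

def rate_per_100m(country):
    """Innovations per 100 million inhabitants."""
    return innovations[country] / population_millions[country] * 100

for country in innovations:
    print(f"{country}: {rate_per_100m(country):.2f}")
```

Run with these assumed figures, the function reproduces the rates quoted above to within rounding.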

Why include these other countries at all, then? Much of the present EU lay east of the Iron Curtain, stagnating under Communism, for most of the period covered by this research. This is a pity: the paper otherwise seems well-written and insightful, but its vague generalisation about the EU disguises the truth.

Tuesday, April 6, 2010

Demand examples from political radicals

Political radicals sometimes insist that their ideal political system has never been successfully implemented. This writer, for example, argues that the USSR was never communist:

Whatever view you may have of the USSR (and there are quite a few supporters of Stalinism out there), it was not Communism....

Well, by the way it actually worked, the most fitting description for it is State Capitalism. Simply, the state took on the role of the ultimate Capitalist and set about exploiting the workers....

Another common opinion on this Communism = USSR misunderstanding is the claim that Communism has proven to be a failure. This attempts to show that the path Russia took in the early 20th century is the only possible result any attempt for Communism can achieve and thus it is not worth struggling towards it. But this is not simply wrong, it is intellectually dishonest.

Here, by denying that the USSR was communist, the writer seeks to revive interest in communism as a practical economic system. Criticisms of communism are dismissed because in a real communist society, these problems would not arise.

Anarcho-capitalists sometimes say similar things about capitalism: the US is corporatist or socialist, not truly capitalist, and problems in right-leaning countries are the problems of whatever government intervention they do have, not problems of capitalism.

The latest example is from the Muslim Public Affairs Council in Ireland, who say that Ireland should adopt Islamic Sharia law. Asked which countries are already applying the law, they told Metro Éireann:

“To my knowledge there is no Islamic state that implements Sharia in its entirety,” he said, “and this in part explains why there are instances of injustice in Muslim lands.

“Partial implementation only brings part of the benefit,” he added.

So here again, by denying that their dream ideology exists anywhere in the real world, Islamists dismiss examples of real-world violence and discrimination connected with Sharia. Saudi Arabia and Iran are troubled, not because they apply Sharia, but because they do not apply it enough.

Communists, anarchists, Islamists and other radicals may in fact be correct when they claim that their ideas have not been implemented in reality. But let's not take their word for it. It could be that their radical ideas have been applied already, and resulted in such poor outcomes that they were abandoned.

So it might be useful to demand real world examples when debating with political radicals of any persuasion.

Saturday, April 3, 2010

Colonisation does not explain modern poverty

Poverty in the developing world is often blamed on colonisation by the great powers of Europe. There is an easy way to test this assumption: compare former colonies with their neighbours that were not colonised.

In Africa this is difficult, as every nation was colonised for at least a brief period. The most briefly colonised, however, were Liberia and Ethiopia.

Liberia had a strange background, arising from the desire of abolitionists, African-Americans and white American racists to move some of the USA’s growing black population back to Africa. In 1820 the first ship of 88 black emigrants left the US and attempted to found a settlement in Liberia. Over the following decade another 2,638 black Americans made the voyage and, despite clashes with the native Africans, the colony grew and stabilised. From 1842 it was governed by black leaders and in 1847 it became an independent state, winning recognition from numerous Western nations over the next twenty years.

Liberia was not a colony in the usual sense. Its population came from the US, but came to found a new state, not to serve the economic interests of the Americans. Its ruling elite were mixed-race Euro-Africans from the US, who occasionally fought native African tribes to establish domination. Nonetheless Liberia ruled itself, and none of the 19th century great powers controlled it.

Ethiopian history is simpler: it was conquered by Italy in 1935 and liberated in 1941. Of course Ethiopian leaders fought neighbouring African civilisations and internal rebels, but outside those brief years of Italian imperialism Ethiopia was never colonised.

If colonisation is the cause of modern poverty in developing countries, these two should be wealthy and content.

Today Liberia is the fourth poorest country in the world. Ethiopia is the 15th poorest. Liberia’s next door neighbour Cote d’Ivoire has a GDP per capita over three times higher, despite a century of French colonisation. Liberia also has the third highest infant mortality rate in the world, Ethiopia the 18th highest.

In other parts of the world the picture is the same. Ireland, long a colony of Britain, developed wealth greater than most of the European former colonial powers. In Asia, Hong Kong and Singapore became some of the richest regions on earth, while Taiwan and South Korea – victims of brutal Japanese colonialism – have almost caught up with their old imperial enemy. Thailand, never colonised, is now poorer than its southern neighbour Malaysia, which endured over a hundred years of British rule.

A nation’s status as a former colony is an extremely poor indicator of wealth or health today. Perhaps the best way to demonstrate this is by comparing former colonies in the same region and under the same colonial power that have seen massive economic divergence since independence.

The first example is North and South Korea. Both were colonised by Japan, both emerged from a bloody civil war as very poor countries, with life expectancies of 47 (South Korea) and 49 (North Korea) in 1950: around the same as Latin American countries. Their shattered economies put their wealth per person around the same as Chad, Sri Lanka and Tunisia.

By 2007 South Korea had a life expectancy of 79, higher than the US, and wealth almost the same as New Zealand. North Korea, after decades of growth, had stagnated and then collapsed: life expectancy of 67 and wealth somewhere between Bangladesh and Senegal.

This divergence cannot be blamed on colonialism.

Another example compares Botswana with neighbouring Zimbabwe. Both were colonised by the British: Botswana gained independence in 1966, while Rhodesia (later Zimbabwe) unilaterally declared independence in 1965. In 1965 they both had life expectancies of 53 years. Zimbabwe was, however, the poorer of the two colonies, with a GDP per capita about half that of Botswana.

Over the next few decades AIDS caused havoc in both countries. Botswana’s life expectancy dropped to 51 by 2007; Zimbabwe’s was down to 43. Their economic destinies, however, show rapid divergence.

1965 GDP per capita (in inflation-adjusted US dollars)
Zimbabwe: $552
Botswana: $1,034

2007 GDP per capita (in inflation-adjusted US dollars)
Zimbabwe: $479
Botswana: $12,401

Zimbabwe collapsed and is poorer today than it was when it gained independence. Botswana, on the other hand, developed a stable democratic government and a highly successful economy. It is today about as wealthy as Malaysia, Russia, Mexico or Romania.
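The divergence in the table can be expressed as growth multiples. A few lines of Python, using only the figures quoted above, make the contrast stark:

```python
# Growth multiples from the GDP-per-capita table above
# (inflation-adjusted US dollars; figures copied from the post).
gdp_1965 = {"Zimbabwe": 552, "Botswana": 1034}
gdp_2007 = {"Zimbabwe": 479, "Botswana": 12401}

def growth_multiple(country):
    """2007 GDP per capita as a multiple of the 1965 figure."""
    return gdp_2007[country] / gdp_1965[country]

for country in gdp_1965:
    print(f"{country}: {growth_multiple(country):.2f}x its 1965 level")
```

A multiple below 1 means the country is poorer than at independence: Zimbabwe comes out around 0.87x, Botswana around 12x.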

That different kinds of colonisation had different impacts on the colonised countries is natural, but those impacts do not determine outcomes today. Historical colonisation cannot be used as an excuse for the poor performance of modern economies – it simply doesn’t matter much.

Thursday, April 1, 2010

I scare people

I sometimes wonder if vulnerable people think I’m going to attack them when I pass them by on a street.

I have mused about the Youth Bulge effect in different countries with high populations of young men and corresponding high rates of political violence. Statistically speaking the average young man is much more likely to assault someone than a grandmother.

Not me, though. I feel uncomfortable with raised voices, let alone violence. Nonetheless I happen to belong to the category of people most likely to attack others, and I sometimes feel that vulnerable people look at me with caution because of it.

Walking fast one night in Dublin, minding my own business, I overtook a middle-aged woman. She gasped and recoiled in terror as I passed*. If I had brown skin I might have called her reaction racism. Sharing her ethnicity, I don't have that explanation: it was sexism and ageism. I was a little annoyed, but couldn't blame her. People ‘like me’ tend to be more dangerous; her discriminatory little gasp was well-founded, if silly and unnecessary that time.

This observation is inspired by a BBC programme I watched recently about modern radical feminists. One young woman, who often dressed in fashionable, attractive outfits, was arguing that she should be allowed to dress however she wants without having to expect hassle from men. Her appearance should not provoke a different reaction from men.

I wondered about this, because I realised that my own appearance – right down to things I’m not able to change, like my age and sex – does affect how others behave towards me. Unlike her, I take this discrimination for granted. Clothes serve a communicative function beyond their protective function: appearance communicates something to others and people respond to that.

If I accept that my age and sex change how people behave towards me, and that the clothes I wear further change that behaviour, so should she. Within reason, of course. I don't expect to be pepper-sprayed in the face for overtaking some old lady on a dark street, and neither should she expect panting weirdos phoning her because she exposes the occasional ankle. But some altered behaviour, even if it is just a cautious or lustful glance, is inevitable.

*I did the same thing in Japan once, overtaking an old woman on foot. When I passed her she showed surprise, but no fear. The relative absence of crime made people more relaxed, which was really nice.