I recently adopted the benign superstition that the new moon is an auspicious time to begin new projects. For me this means that I aim (and generally fail) to start work on a new piece of writing, whether it’s a poem or a paper. But yesterday, being the day of the first new moon of the new year, it seemed particularly significant, so I thought I might set out my stall for the next twelve months and, in particular, try to formulate my first research question.

For a while now, partly inspired by reading Dan Ariely, I’ve been telling people that I want to focus my attention on the topic of human irrationality and, what is much the same thing, what passes for “rationality”. Partly this unhealthy interest in, to use Bukowski’s phrase, “ordinary madness” is an attempt to answer Mrs. Kaamos’ galling challenge to my analytic mindset: “Sometimes it’s rational to be irrational.”

The study of irrationality goes back to Ancient Greece, and in particular to the curious case of akrasia, or weakness of the will, which Aristotle considered at some length and which is familiar to anyone who has already broken their new year’s resolution.

The same issue runs through the Stoics and on through to the Early Modern philosophers, Descartes, and also Spinoza, who described the person subject to the emotions – pretty much all of us, really – as in bondage: “though he sees the better for himself, he is still forced to follow the worse.” Then, around 300 years later, along comes Freud and it all starts getting really interesting, when he fiendishly places the source of irrationality in the covert drives of the unconscious.

There are, of course, other aspects of irrationality: error, bias and prejudice, hallucination, illusion and delusion. Why does it matter? Well, if you believe, as I do, that we might all be going to hell in a handcart, you might start to wonder about the collective irrationality of a species bent on its own destruction. To borrow from the medical model: before we can begin to discover the cure, it might help us to try to understand the sickness.

But the irrationality module that piques my curiosity at the time of writing is the remarkable human capacity to remain convinced of a belief for which there is no evidence and which has no practical benefit. Why do we stubbornly persist in holding, even defending, such beliefs, often incompatible with other firmly held convictions, even if everyone around us tells us they are false?

Almost every time I come to write an essay, I reach a point where I fully believe that it is impossible for me to write it. I procrastinate, I go on Facebook, I stare at the screen, cursing the cursor.

This is not, I believe with complete certainty, because it’s rather arduous work and I’m not always completely thrilled by the topic, but because, without a shadow of a doubt, I know that I am incapable of doing it. At that moment, all the essays I have written in the past are just statistical anomalies, freaks of my otherwise fixed nature.

The reality that I perceive is that I have either never really been able to write, or, even if I once could, I’ve lost my touch.

And then, when I give up and in my frustrated exasperation just write something, guess what? I write something. And that something gets mucked around with until it suddenly emerges as an essay.

So, here it comes, the Instant Kaamos 2013 question of the year:

How and why do we protect, preserve – treasure even – our false beliefs against the corrosive effect of reason and evidence?

Does it hold water?

Watching Tony Blair’s testimony at the Chilcot Inquiry last Friday reminded me of a joke beloved of Sigmund Freud. A man borrows a bucket from his neighbour. Later the neighbour complains that the bucket has a hole in it. The man, indignant, gives the following defence:

1. First of all, I never borrowed your bucket.

2. Secondly, when you gave it to me it had a hole in it.

3. Thirdly, when I gave it back to you, it was in perfect condition.

The humour in the situation (OK, not side-splittingly funny, I admit) lies in the absurdity of the man thinking that by listing three defences he strengthens his case. Actually, because they are mutually inconsistent, all he does is undermine his own credibility.

Now to Blair. To justify his decision to invade Iraq, Blair makes the following claims:

A. First of all, the evidence showed that Saddam had WMD and therefore posed a threat. So it was right to invade Iraq.

B. Secondly, although Saddam did not have WMD he could have developed them and therefore posed a threat. So it was right to invade Iraq.

C. Thirdly, Saddam was a ruthless, murderous dictator who had used gas against his own people, and the world is a better place without him. So it really doesn’t matter whether Saddam had, or could have developed, WMD; it was right to invade Iraq.

Admittedly, there is a difference. First of all, A, B and C are not, strictly speaking, logically inconsistent. But without contradicting each other, the addition of each reason makes the others seem less convincing as the genuine reason for the invasion. Since C suggests that A and B are irrelevant, maybe we should conclude that getting rid of Saddam was the real reason. But if that were the case, why is Blair so reluctant to state unequivocally that the purpose of the invasion was regime change – unless simply because this would make the invasion illegal? Can we make anything of Blair’s talk of “the danger of making a binary distinction between regime change and WMD”? Is this a sophisticated attack on a genuinely false dilemma, or just an attempt to fudge the issue?

I don’t know where to go with this exactly, or whether any amount of work in this direction would prove Blair’s argument invalid. I partly posted this because I wanted to see if someone can find the flaw in what I’m saying…or tell me how to make good.

And I am not saying that Blair is lying. That would require me to show that he does not believe what he is saying. He might, though, be delusional, guilty of a form of self-deception.

Being inconsistent in our beliefs is a very human trait that happens to the best of us. One way we hold inconsistent beliefs is by compartmentalisation – putting them in different boxes and not looking at them at the same time. (In this respect, pace Descartes, the mind is not transparent to itself.) Being rational and honest requires that we recognise when our beliefs conflict and own up to it, even at the cost of admitting we were wrong. And when we admit we were wrong, we generally apologise.