Proving Aristotle Wrong, Psychology Style


First, the good intention part. Given the information overload we usually experience, our brain finds it difficult to thoroughly process each bit of information as it comes in. Hence it resorts to a veritable laundry list of mental shortcuts, or rules of thumb, to make judgments and attributions. This saves time and mental effort. Quite adaptive.

The hitch is, these heuristics often produce systematic errors in judgment, called cognitive biases.

While you may want to believe otherwise, you can't deny that you find it difficult to reason in a perfectly logical way all the time.

Remember the last time you grossly underestimated how long it would take you to complete an assignment? That's the planning fallacy, a cognitive bias pertaining to decision making. There are several others pertaining to social attribution and memory, besides statistical and probability- or belief-based fallacies.

The cognitive biases we fall prey to most often (read: the names I found fancier than the rest) are:

1. The Semmelweis reflex: Rejecting new information out of hand, without much inspection, just because it contradicts a previously held idea or norm. So the established paradigm wins over new evidence. Something like "if the facts don't fit the theory, throw out the facts."

2. Self-fulfilling prophecy, or the Pygmalion effect: Remember the sculptor who created a beautiful statue and later fell in love with it? That was Pygmalion. Alright, this is a prediction that causes itself to come true by the very fact that it was made. If your teachers are made to believe you are an "intellectual bloomer", you will turn out to be one, as Rosenthal and Jacobson's famous classroom study showed.

Keep it in mind the next time you are tempted to call your boyfriend an ass.

3. Déformation professionnelle: The tendency to examine issues only from the vantage point of one's own profession, at the cost of other possibly relevant perspectives. The classic example is the joke "when all you have is a hammer, every problem looks like a nail".

4. Confirmation bias: The tendency to seek out information and people that reinforce our beliefs and preconceived notions. Examples include choosing the newspaper that shares your political ideology, or favouring a yea-saying subordinate. If one is not careful, one may thereby screen out important information and make catastrophic decisions.

5. Reactance: The tendency to do the opposite of what one is instructed to do, out of the perception that others are trying to restrict one's freedom. This is where reverse psychology comes from: advocating a belief or behaviour opposite to the one desired, in the expectation that the person being persuaded will rebel and do what is actually wanted.

6. Fundamental attribution error: The Big Brother of all cognitive biases, this is the tendency to overestimate the impact of dispositional factors (as opposed to environmental factors) on others' behaviour. Instead of the many external circumstances that could have led to an act, we attribute it to the convenient "because he is that kind of a person." A closely related error is the 'actor-observer effect', by which we tend to attribute others' behaviour to dispositional causes but our own to external ones. As in: "you fell; I was pushed."

There are hundreds of others out there, and one way to minimise their effect is to remain open to diverse opinions and to look up statistical figures when the situation is uncertain. No, merely being aware of these biases doesn't help much. Kahneman, the leading authority in this field, has said, "My intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy as it was before I made a study of these issues."

So there's only so much one can do about it. Especially if one is intelligent.

Yes, studies by West, Meserve, and Stanovich showed that more cognitively sophisticated participants had larger bias blind spots, i.e. a stronger tendency to assume that others are more susceptible to thinking errors than they themselves are!

Even knowledge doesn't count for much: more than 50% of students at Harvard, MIT, and Princeton gave the incorrect answer to a simple question (the famous bat-and-ball problem) designed to test whether they could keep their cognitive biases at bay.
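For the curious, here's the question, assuming the standard phrasing from Kahneman's account: a bat and a ball together cost $1.10, and the bat costs $1.00 more than the ball. How much does the ball cost? The intuitive answer is 10 cents; a quick bit of algebra shows why that's wrong. Letting $x$ be the price of the ball:

\[
x + (x + 1.00) = 1.10 \;\Rightarrow\; 2x = 0.10 \;\Rightarrow\; x = 0.05
\]

The ball costs 5 cents. If it cost 10 cents, the bat would cost $1.10 and the pair would come to $1.20.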