It was one of those nights when I was doing web research on how humans make choices that I stumbled upon a TED Talk by a psychologist explaining the riddle of experience versus memory. The Talk led me to Daniel Kahneman’s Thinking, Fast and Slow, one of those books that shake the foundations of a discipline, prepare the ground for a paradigm shift, and set a new direction for the research of generations of academics to come.
What I am about to summarize might not make sense to many of you; Kahneman’s work has not been short of controversy within psychology. What I admire, though, is Kahneman’s dissection of the odd behavior of our brain under certain conditions, which leads us to make choices that, seen from another angle, simply do not make sense. Our brain is not the rational agent many imagine it to be.
Now, why a book review? Because a book’s principles quickly fade from memory if no written synthesis is made. Explaining the concepts in simple terms will also help me make sure I understand them myself. And who knows, a few of you might be intrigued as well. Kahneman’s book is not a “how to become happy in 10 lessons” book. By enriching our awareness of the shortcuts our brain takes to make decisions, Kahneman hopes, perhaps against the odds, to help us make more informed choices. This is my first book review, most probably not the last, and I leave you to enjoy a summary of a selection of the concepts laid out.
Kahneman argues that our brain operates as two “systems”, System 1 and System 2. System 1 makes most of our daily choices for us, while System 2 is lazy, has to be stimulated, and is rarely used outside classrooms. If I had to capture my best understanding of System 1 in one word, I would name it Intuition; System 2, I would name Rationality. Intuition, when nurtured, can be trusted as expert intuition: Malcolm Gladwell makes the famous case in his book Blink of the fireman who pulled his brigade out of a building for no reason he could explain, only to watch the building collapse a few minutes later. Most of us non-experts, however, intuitively take shortcuts to make choices that lead us astray.
System 1 is prone to cognitive illusions. Much like a visual illusion, a cognitive illusion is a fancy term for a piece of reasoning that seems to make sense, but, looked at from another angle, no longer does.
Try to answer the following question:
A bat and a ball cost $1.10. The bat costs $1 more than the ball. How much does the ball cost?
A number quickly came to your mind (generated by your System 1). If you think the ball costs 10 cents, think twice. The correct answer, computed by your System 2, is 5 cents.
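The arithmetic your System 2 has to do can be checked in a few lines (working in cents to avoid floating-point rounding):

```python
# Two equations: bat + ball = 110 cents, bat - ball = 100 cents.
# Adding them: 2 * bat = 210; subtracting: 2 * ball = 10.
total_cents = 110        # bat + ball
difference_cents = 100   # bat - ball
ball_cents = (total_cents - difference_cents) // 2  # 5 cents, not 10
bat_cents = ball_cents + difference_cents           # 105 cents
assert ball_cents + bat_cents == total_cents
assert bat_cents - ball_cents == difference_cents
```

The intuitive answer of 10 cents fails the second check: it would make the bat only 90 cents more expensive than the ball.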
Heuristics and Biases
Kahneman argues that our brain is not trained to think in terms of probabilities and statistics. We tend to infer conclusions from samples that are too small to be significant (Nassim Taleb describes this problem of induction very well in Fooled by Randomness). Once a basketball player scores on 5 throws in a row, his teammates suddenly start passing him the ball, thinking he has a “hot hand”, while statistics show that such a lucky streak is perfectly consistent with a simple random distribution.
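To see how easily a “hot hand” emerges from pure chance, here is a quick sketch with made-up numbers: a shooter who makes exactly 50% of shots at random, over a hypothetical 500-shot season.

```python
import random

def longest_streak(shots):
    """Length of the longest run of consecutive makes (True values)."""
    best = run = 0
    for made in shots:
        run = run + 1 if made else 0
        best = max(best, run)
    return best

random.seed(0)
# 500 independent shots, each made with probability 0.5 -- no skill, no streakiness.
season = [random.random() < 0.5 for _ in range(500)]
print(longest_streak(season))  # long streaks are routine even for a coin-flip shooter
```

Runs of 5 or more makes show up in almost every such simulated season, which is exactly the pattern teammates read as a hot hand.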
And when our brain cannot find causes for the phenomena it experiences, it often looks for causality where there is none. A famous experiment on pigeons illustrates the concept: pigeons are starved in a cage and then fed at very slow rates, much slower than their hunger demands. The pigeons cannot see one another, and each starts to make gestures it apparently thinks will make the food come: one starts jumping, another flaps its wings. Is that a clue to explaining superstition in humans, or data snooping among research analysts and financial traders who use technical analysis?
An experiment was done where people spun a wheel of fortune marked from 0 to 100. The wheel was rigged to give only two outcomes, either 10 or 65. Shortly thereafter, everyone was asked a simple question:
What is your best guess of the percentage of African nations in the UN?
The people who got 10 on the wheel said 25% on average, whereas the people who got 65 said 45% on average. The number they got on the wheel, although clearly irrelevant to the question, affected their judgement. This is a simple example of the anchoring heuristic: under uncertainty, our brain tends to seize on whatever piece of information is available and extrapolate from it, without questioning its relevance. This is why salespeople throw a price at you before you ask for it, and why, when opposing parties meet to negotiate an outcome, the party that speaks first has a certain advantage: the other party will take the benchmark thrown at them and extrapolate.
Another concept our brain has trouble dealing with is regression to the mean. Ask yourself the following question:
Why do highly intelligent women tend to marry men who are less intelligent than they are?
You have certainly started coming up with all sorts of explanations, such as women being more intelligent than men, and this sort of nonsense. But no causal story is needed: whenever two quantities are imperfectly correlated, extreme values of one tend to be paired with less extreme values of the other. The top, say, 5% of women by intelligence cannot all pair off with the top 5% of men; on average, their husbands will sit closer to the population mean.
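A quick simulation makes the statistical point without any causal story. The numbers below are made up for illustration: IQ-like scores (mean 100, standard deviation 15) and an assumed spousal correlation of about 0.4.

```python
import random

random.seed(1)
n = 100_000
pairs = []
for _ in range(n):
    # Each partner's score shares a common component (producing a correlation
    # of roughly 0.63**2 ~ 0.4) plus independent noise.
    common = random.gauss(0, 1)
    wife = 100 + 15 * (0.63 * common + 0.77 * random.gauss(0, 1))
    husband = 100 + 15 * (0.63 * common + 0.77 * random.gauss(0, 1))
    pairs.append((wife, husband))

# Look at the top 5% of wives by score.
pairs.sort(key=lambda p: p[0], reverse=True)
top = pairs[: n // 20]
avg_wife = sum(w for w, h in top) / len(top)
avg_husband = sum(h for w, h in top) / len(top)
print(round(avg_wife), round(avg_husband))
```

The husbands of the most intelligent wives score well above the population mean of 100 but well below their wives, purely because the correlation is imperfect. That is regression to the mean at work.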
To be Continued