A journey into human thinking

Even though Thinking, Fast and Slow is a science book about the workings of the human brain and deals with intimidating words such as heuristics and biases, it is an easily accessible, well-written and entertaining summary not only of how we think but also of how we are prone to make systematic errors of judgment, take short cuts, leap to conclusions and are generally biased in our thinking.

Psychologist Daniel Kahneman, who received the Nobel Prize in Economics in 2002 for work carried out largely with his long-time collaborator Amos Tversky, is best known for his research on the psychology of judgment, decision making and behavioural economics.

Kahneman and Tversky’s paper from the 1970s, “Judgment under Uncertainty: Heuristics and Biases”, challenged two fundamental assumptions made by social scientists at the time: that people are generally rational and their thinking normally sound, and that emotions such as fear, affection and hatred explain most irrational human behaviour.

Kahneman and Tversky instead documented systematic errors and simplifying shortcuts of intuitive thinking in normal people and demonstrated that these are the result of design faults “in the machinery of cognition rather than the corruption of thought by emotion”. Psychologists call these forms of reasoning heuristics: experience-based rules and techniques the mind uses to solve problems or to learn.

Simplifying heuristics (rules of thumb), such as the availability heuristic, which substitutes the ease with which information is retrieved from memory for an actual judgment of frequency, are examples of these shortcuts. How does this affect us? The availability heuristic, for instance, leads us to believe that topics that receive a lot of media coverage are also more important than those that don’t. If we read and see a lot about adultery by politicians, we are inclined to intuitively estimate that politicians are more adulterous than, say, university professors.

 

Thinking slow and fast  

Throughout the book Kahneman uses a metaphor of two fictitious characters, called System 1 and System 2, which respectively produce fast and slow thinking, and outlines the mutual influences between the two. These systems are not governed by different parts of the brain; they are more like two distinct ways of thinking.

System 1 operates quickly, with very little effort and no sense of voluntary control. It automatically takes in all the objects surrounding us as a three-dimensional image, judges faces and body language, and makes immediate assessments of like and dislike or of trustworthiness in people.

System 2 in turn allocates attention to mental activities that require effort, such as multiplying 17 by 24, and it is associated with the notions of choice and concentration. But the intuitive “System 1 is more influential than our experience tells us, and it is the secret author of many of the choices and judgments we make”, says Kahneman.  

The problem is System 1 has biases, systematic errors it makes in specific circumstances, and it lacks understanding of logic and statistics. If System 1 is involved, the conclusion comes first and the arguments follow later. Another important limitation is that it cannot be turned off. “If you are shown a word on the screen in a language you know, you will read it—unless your attention is totally focused elsewhere,” writes Kahneman. 

System 2 is charged with overcoming the impulses of System 1. While this ensures a certain degree of self-control, permanently questioning our own thinking would be incredibly tedious, and System 2 is much too slow and inefficient to serve as a substitute for System 1 in making routine decisions.

Most of the time System 1 works fine; the problem is when it doesn’t. “The best we can do is a compromise: Learn to recognise situations in which mistakes are likely and try harder to avoid significant mistakes when the stakes are high,” writes Kahneman.

 

Blindness  

The capacity of System 2 reaches its limits very quickly: for most people, mentally multiplying two two-digit numbers, or holding a long sequence of numbers in memory while adding three to each, is enough.

In order to achieve what it needs to do, System 2 protects its most important activity and only assigns spare capacity to other tasks. If there is no spare capacity available, we can become blind to important facts.  

The famous gorilla experiment, which involves watching a video of people in black or white shirts passing basketballs, is a good illustration of this blindness. Nobody has any difficulty completing the task of counting the number of passes between the people wearing white, but roughly half of the viewers are completely unaware of a person in a gorilla costume entering the scene, walking right through the players and waving at the camera.

People are later amazed how they could have missed something so obvious, but Kahneman says: “We can be blind to the obvious, and we are also blind to our blindness.” One of the premises of the book is that it is easier to recognise other people’s mistakes than our own. 

Ever wondered why your colleagues are more likely to swear when under stress? System 2 is in charge of self-control, but when it is fully occupied with a demanding task, self-control diminishes. “People who are cognitively busy are also more likely to make selfish choices, use sexist language and make superficial judgments in social situations,” says Kahneman.

Why does the driver of a car stop talking when overtaking another car in a tight corner? System 1 takes over in emergencies and assigns total priority to self-protective actions, overriding the ability to use System 2.  

In addition, the functions of the brain can also be physiologically depleted when the level of glucose available to the nervous system drops. The consequences of this are disturbing. Kahneman cites an Israeli study showing that the approval rate of parole cases tracked the time since the judges’ last food break: during their working day the judges approved significantly more parole requests just after eating and almost none in the period immediately before lunch.

 

Emotions and priming  

Drawing on dozens of experiments, Kahneman describes in easy-to-understand language the framework of what science knows today about the workings and shortcomings of the mind.

The book offers many astonishing insights, and it is surprising how easily our thinking can be led by general emotion and by what psychologists call priming. “If you have recently seen or heard the word EAT, you are temporarily more likely to complete the word fragment SO_P as SOUP than as SOAP. The opposite would happen, of course, if you had just seen WASH. We call this a priming effect and say that the idea of EAT primes the idea of SOUP, and that WASH primes SOAP,” he explains.

Other experiments have shown that money-primed people, for example after having been shown a dollar note, become more independent in their actions. Unfortunately they also become more selfish, more reluctant to be involved with others, to depend on others or to accept demands from others. This does not, of course, mean that we are completely at the mercy of primes. Their effects are very real but not necessarily large.

Kahneman’s book is most interesting when it shows how the design faults of our cognitive system lead to erroneous conclusions, whether it is the difficulty of distinguishing between familiarity and truth or the automatic impulse to search for causality even where there is none.

Business analysts often explain why the stock market went up or down in a day with one or two single events, but in most cases the causality is just not there. It is also an example of what Kahneman calls substitution, a concept borrowed from the mathematician George Pólya’s advice on problem solving: “If you can’t solve a problem, then there is an easier problem you can solve: find it.”

The mind does the same. When faced with a difficult and complex question, we are likely to substitute it, often without realising, with another question that can be answered more easily. The question “How happy are you these days?” is in practice answered with the response to “What is my mood right now?”

An experiment among German students showed no correlation between the responses to “How happy are you these days?” and “How many dates did you have last month?” when the questions were asked in that order. When another group was asked the questions in reverse order, the correlation was very strong, showing that the present state of mind weighs heavily when people evaluate their happiness. Any emotionally significant question that alters a person’s mood, whether about money or parents, will have the same effect.

“The students do not temporarily lose their ability to distinguish romantic life from life as a whole. If asked about the two concepts, they would say they are different,” writes Kahneman. “But they were not asked whether the concepts are different. They were asked how happy they were, and System 1 has a ready answer.” 

The effect is profound. The heuristic alternative to careful reasoning can sometimes work fairly well, because it can be impractical to assess the various factors that would influence a more detailed and rationalised answer. But sometimes it leads to serious errors. 

The next time someone purports to answer a complex question, such as how the sovereign debt crisis in Europe is going to be resolved, ask yourself which question the person is really answering.

 

One Oreo now or two cookies later?   

The shortcomings of human reasoning can also have an effect on economic decisions. Participants in an experiment who were asked to imagine that they had been given $50 behaved differently when told they could “keep” $20 than when told they must “lose” $30, even though the two framings describe the same outcome: in both cases they end up with $20.

Kahneman recommends being aware of System 1 characteristics like “loss aversion”, which leads people to reject gambles that are in their favour, for example a 50-50 chance to win $200 or lose $100, and to make decisions that favour short-term effects over long-term ones, even if the long-term reward is greater.
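A rough back-of-the-envelope sketch in Python makes the point. The gamble’s figures are those in the example above; the loss-aversion weight of 2 is an illustrative assumption within the range Kahneman reports as typical (roughly 1.5 to 2.5), and the function names are purely for illustration:

    # Illustrative sketch, not from the book: a favourable gamble that still
    # feels unattractive once losses are weighted more heavily than gains.

    def expected_value(p_win, gain, loss):
        # Objective expected value of the gamble.
        return p_win * gain - (1 - p_win) * loss

    def felt_value(p_win, gain, loss, loss_aversion=2.0):
        # Same gamble, with losses counted twice as heavily as gains (assumed weight).
        return p_win * gain - (1 - p_win) * loss_aversion * loss

    print(expected_value(0.5, 200, 100))  # +50.0: the odds favour taking the bet
    print(felt_value(0.5, 200, 100))      # 0.0: psychologically it feels like a wash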

Kahneman describes a famous experiment by Walter Mischel. It involves four-year-old children who are given a cruel choice between a small reward (one Oreo), which they can have at any time, and a larger reward (two cookies), for which they have to wait 15 minutes under difficult conditions, alone in an empty room without toys.

This conflict between instant gratification and a bigger payout earned through self-control can sometimes be witnessed in politics or the economy. It is not too much of a stretch to compare the idea of offering a temporary break in pension contributions for a small improvement in cash flow, while completely ignoring the effect compound interest has over many years, to the four-year-olds with self-control issues.
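A quick sketch with purely hypothetical numbers (the $5,000 contribution, the 5% return and the 30-year horizon are illustrative assumptions, not figures from the book) shows how lopsided the trade can be:

    # Hypothetical numbers: what one skipped year of pension contributions
    # costs at retirement if the money would otherwise have compounded.

    def future_value(amount, annual_return, years):
        # Value of a single contribution left to compound for the given number of years.
        return amount * (1 + annual_return) ** years

    skipped = 5000              # assumed contributions skipped this year
    rate = 0.05                 # assumed 5% average annual return
    years_to_retirement = 30

    print(round(future_value(skipped, rate, years_to_retirement)))
    # -> 21610: a $5,000 cash-flow gain today forgoes roughly $21,600 at retirement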

But for the individual the problem runs even deeper. Kahneman writes that 10 to 15 years after the experiment a large gap had opened between those 4-year-olds who had resisted temptation and those who had not. “The resisters had higher measures of executive control in cognitive tasks, and especially the ability to reallocate their attention effectively. As young adults, they were less likely to take drugs. A significant difference in intellectual aptitude emerged: the children who had shown more self-control as 4-year-olds had substantially higher scores on tests of intelligence.”

If there is only one book you are going to read this year, make sure it is this one. The findings in terms of the mechanics and limitations of human reasoning and the choices we make will stay with you for a long time. 

 

Thinking, Fast and Slow. By Daniel Kahneman. Farrar, Straus & Giroux; 352 pages. 
