Thinking, Fast and Slow by Daniel Kahneman

Intuition or deliberation? Where you can (and can't) trust your brain

Two distinct systems determine our behaviour, one automatic and the other deliberate.






A captivating drama plays out in our heads, a film-like plot involving two significant characters, complete with twists, dramas, and tensions. These two personalities are the impulsive, automatic, intuitive System 1 and the thoughtful, careful, and calculating System 2. As they compete, their interactions shape how we think, make judgments and decisions, and behave.


System 1 is the part of our brain that functions spontaneously and automatically, frequently beyond our conscious control. If you hear a deafening and unexpected noise, what do you do? You probably turn your attention to the sound immediately and involuntarily. That's System 1.


This mechanism is a product of our evolutionary history; the ability to make such quick decisions and actions has inherent survival benefits.


System 2 is the area of the brain responsible for our decision-making, reasoning, and beliefs. It concerns conscious mental activity such as self-control, decision-making, and careful attention.


For example, suppose you're looking for a woman in a crowd. Your mind consciously focuses on the task, recalling her features and anything else that might help you find her. This concentration filters out potential distractions, and you scarcely notice the other people around you. If you maintain this focused attention, you may spot her within minutes; if you become distracted and lose focus, you will struggle to find her.


As we will see in the following blinks, the interaction of these two systems dictates how we behave.



1. The lazy mind: idleness can cause errors and impair intelligence.


To demonstrate how the two systems function, consider solving the following famous bat-and-ball problem:


A bat and ball together cost $1.10. The bat costs $1 more than the ball. How much does the ball cost?


The price you most likely thought of, $0.10, is the outcome of System 1's intuitive and automatic behaviour, and it is wrong! Take a second to do the math now.


Do you see your error? The correct answer is $0.05.
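The arithmetic is quick to verify. As a sketch (in Python, chosen here purely for illustration), working in cents avoids rounding issues:

```python
# Bat + ball = $1.10 and bat = ball + $1.00; work in cents.
total, difference = 110, 100
ball = (total - difference) // 2   # 2 * ball + 100 = 110  ->  ball = 5
bat = ball + difference

assert bat + ball == total
print(f"ball = ${ball / 100:.2f}, bat = ${bat / 100:.2f}")
# ball = $0.05, bat = $1.05
```

Note how the intuitive answer fails the check: a $0.10 ball plus a bat costing $1 more ($1.10) would total $1.20.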


Your impulsive System 1 took over and responded automatically based on intuition, but it reacted too quickly.


The bat-and-ball problem dupes System 1. Although System 1 normally consults System 2 when it confronts an unfamiliar situation, here it sees the problem as more straightforward than it is and wrongly believes it can solve it on its own.


The bat-and-ball conundrum demonstrates our fundamental mental laziness. When we engage our brains, we expend as little energy as possible on each task, a tendency known as the law of least effort. Confirming the answer with System 2 requires more work, and our minds will not do it if they believe they can get away with System 1 alone.


This laziness is unfortunate, because employing System 2 is critical to our intelligence. Research indicates that practising System-2 tasks such as focus and self-control correlates with higher intelligence scores. The bat-and-ball problem exemplifies this: our minds could have verified the answer with System 2 and avoided this common error.


By being sluggish and avoiding employing System 2, our minds limit the power of our intelligence.



2. Autopilot: why we do not always have conscious control over our thoughts and actions.


What do you think when you see the word fragment "SO_P"? Probably nothing. But what if you first see the word "EAT"? Now, when you look at "SO_P," you will most likely complete it as "SOUP." This is known as priming.


Exposure to a word, concept, or event prompts us to recall similar words and ideas, priming us. If you had seen the word "shower" instead of "eat" above, you would most likely have finished the letters as "soap."


Such priming influences not just our thoughts but also our actions. Hearing particular words and notions can affect both the mind and the body. One excellent illustration of this phenomenon is a study in which participants were primed with words associated with ageing, such as "Florida" and "wrinkle," and subsequently walked at a slower pace than usual.


Surprisingly, we are completely unaware that our thoughts and actions are being primed.


Priming demonstrates that, contrary to popular belief, we do not always have conscious control over our actions, judgments, and decisions. Instead, certain social and cultural conditions constantly prime us.


Kathleen Vohs' research, for example, demonstrates that the mere thought of money primes individualistic behaviour. Exposure to representations of money leads individuals to act more independently and to be less willing to participate in, rely on, or accept demands from others. One implication of Vohs' findings is that living in a culture saturated with triggers that prime money may nudge our behaviour away from altruism.


Priming, like other societal aspects, can influence an individual's thinking and, as a result, choices, judgment, and conduct. These reflect back into the culture and have a significant impact on the type of society in which we all live.



3. Snap judgments are how the mind makes hasty decisions even when there is insufficient information to make a reasoned decision.


Imagine you meet Ben at a party and find him easy to chat with. Later, someone asks if you know anyone who might wish to donate to their organization. You remember Ben, even if all you know about him is that he is pleasant to talk to.


In other words, you liked one aspect of Ben's character and assumed you'd like everything else about him. We frequently approve or disapprove of a person even when we know little about them.


Our tendency to oversimplify things without sufficient information frequently results in judgment errors. Exaggerated emotional coherence, also known as the halo effect, occurs when you have positive thoughts about Ben's approachability and build a halo around him despite knowing little about him.


However, there are other ways our thoughts use shortcuts while making decisions.


People tend to agree with information supporting their preexisting opinions and accept whatever information is presented to them, a process known as confirmation bias.


This can be demonstrated by asking the question, "Is James friendly?" Studies have shown that when presented with this question and no other information, we are likely to consider James friendly, since the mind instinctively reinforces the suggested idea.


The halo effect and confirmation bias occur because our minds are eager to make snap judgments. However, this frequently leads to blunders, because we do not always have enough evidence to make an informed decision. Our minds rely on false suggestions and oversimplifications to fill the gaps in the data, sometimes leading to incorrect judgments.


Like priming, these cognitive events occur without conscious knowledge and influence our decisions, judgments, and behaviours.



4. Heuristics are mental shortcuts that allow us to make speedy decisions.


Frequently, we find ourselves in situations requiring us to make hasty decisions. To help us accomplish this, our minds have created small shortcuts that allow us to quickly understand our environment. These are known as heuristics.


These shortcuts are usually very useful, but our minds tend to overuse them. Applying them in situations where they are not appropriate can lead to blunders. To better understand heuristics and how they can cause errors, we'll look at two of the numerous types: the substitution heuristic and the availability heuristic.


Using the substitution heuristic, we answer a more straightforward question than the one originally presented.


Take this question, for example: "That woman is running for sheriff. How successful will she be in office?" We automatically replace the question we're expected to answer with a simpler one, such as "Does this woman look like someone who would make a good sheriff?"


This heuristic means that rather than examining the candidate's record and policies, we simply ask ourselves the far simpler question of whether she fits our mental image of a good sheriff. Unfortunately, if she does not fit that image, we may reject her, even if she has years of crime-fighting experience and is the ideal candidate.


Next, there's the availability heuristic, which occurs when you overestimate the likelihood of something you hear frequently or remember easily.


Strokes, for example, kill far more people than accidents, but in one survey 80 per cent of respondents believed that an accidental death was more likely. This is because we hear about accidental deaths more in the media and they make a stronger impression on us; we recall gruesome accidental deaths more easily than stroke deaths, and so we respond disproportionately to these threats.



5. Numbers trouble: why we fail to grasp statistics and make avoidable blunders.


How can you forecast whether certain events will occur?


One promising approach is to keep the base rate in mind. The base rate is the statistical foundation upon which other statistics rest. For example, suppose a large taxi company runs 20% yellow cabs and 80% red cabs. That means the base rate for yellow cabs is 20% and the base rate for red cabs is 80%. If you order a cab and want to guess its colour, recall the base rates and you'll make a reasonably accurate guess.


Therefore, we should always recall the base rate when forecasting an event, but regrettably, this does not happen. Base-rate neglect is extremely common.


One of the reasons we ignore the base rate is that we prioritize what we expect over what is most likely. Consider those cabs again: if you watch five red cabs pass by, you'd probably conclude that the next one will be yellow for a change. But no matter how many cabs of either colour pass by, the probability that the next cab will be red remains around 80%, and if we remember the base rate, we should know this. Instead, we tend to focus on what we expect to see, such as a yellow cab, and so we are likely to guess wrong.
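This independence is easy to check with a small simulation. The sketch below (Python, assuming the 80% red base rate from the example and independent cab arrivals) estimates the colour of the sixth cab after five red ones in a row:

```python
import random

random.seed(0)
RED = 0.80                      # base rate of red cabs
hits, trials = 0, 0

for _ in range(200_000):
    first_five = [random.random() < RED for _ in range(5)]
    if all(first_five):         # condition on seeing five red cabs in a row
        trials += 1
        hits += random.random() < RED   # is the sixth cab red?

print(f"P(next cab is red | five red in a row) ~ {hits / trials:.2f}")
# ~0.80: the streak tells us nothing about the next cab
```

The streak has no effect because each draw is independent; the base rate is the only information that matters.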


Base-rate neglect is a typical mistake associated with the broader difficulty of working with statistics. We also struggle to remember that everything regresses to the mean: all processes have an average state, and deviations from that average tend to tilt back toward it over time.


For example, if a football striker who averages five goals per month scores ten goals in September, her coach will be overjoyed; however, if she then scores around five goals per month for the rest of the year, her coach will most likely criticize her for not continuing her "hot streak." However, the striker does not deserve this criticism because she is simply regressing to the mean.
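Regression to the mean can be simulated directly. In this sketch (Python; modelling monthly goals as Poisson-distributed is my own illustrative assumption), strikers who have an unusually hot month average about five goals the next month, just like everyone else:

```python
import math
import random

random.seed(1)

def poisson(lam):
    # Knuth's algorithm; adequate for small lambda.
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

september = [poisson(5) for _ in range(10_000)]  # 10,000 strikers, mean 5 goals
october = [poisson(5) for _ in range(10_000)]

hot_streak_next = [o for s, o in zip(september, october) if s >= 10]
print(sum(hot_streak_next) / len(hot_streak_next))  # close to 5, not 10
```

The "hot" strikers were never better than average; they were average players who got lucky once, so their next month looks ordinary.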



6. Past imperfection: why we remember retrospectively rather than from experience.


Our minds do not straightforwardly recall memories; instead, we have two distinct mechanisms known as memory selves, each of which remembers situations differently.


First is the experiencing self, which records how we feel in the present moment and asks, "How does it feel right now?"


Then there's the remembering self, which notes how the incident occurred after the fact and asks, "How was it on the whole?"


The experiencing self provides a more accurate account of what happened, since our feelings during an event are the most accurate record of it. Yet the remembering self, which is less precise because it registers memories after the situation has ended, dominates our memory.


The remembering self dominates the experiencing self for two reasons: duration neglect, which ignores the entire duration of an event in favour of a specific memory from it, and the peak-end rule, which emphasizes what happens at the end of an event.


For an example of this dominance of the remembering self, consider this experiment, which measured people's memories of a painful colonoscopy. Researchers divided the patients into two groups before the colonoscopy: one group received long, drawn-out colonoscopies, while the other group received much shorter procedures with a higher level of pain near the end.


When asked about their pain during the procedure, each patient gave an accurate answer: those who had the longer procedures felt worse overall. However, when reflecting on the experience afterwards, those who underwent the shorter procedure with the more painful ending remembered it as worse.
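A common way to model the remembering self is to score an experience by the average of its peak moment and its final moment, ignoring duration entirely. This toy model (my own illustrative numbers, not the study's data) reproduces the pattern:

```python
def remembered_pain(per_minute_pain):
    # Peak-end rule with duration neglect: only the worst moment
    # and the final moment count toward the memory.
    return (max(per_minute_pain) + per_minute_pain[-1]) / 2

long_procedure = [6, 7, 8, 8, 7, 5, 3, 1]   # longer, but tapers off gently
short_procedure = [6, 7, 8, 7]              # shorter, but ends at high pain

assert sum(long_procedure) > sum(short_procedure)   # more total pain experienced...
assert remembered_pain(long_procedure) < remembered_pain(short_procedure)
# ...yet it is remembered as the less painful procedure
```

Total pain (what the experiencing self registers) and remembered pain (peak plus end) pull in opposite directions here, which is exactly the colonoscopy result.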



7. Mind over matter: How shifting our mental focus can significantly impact our ideas and behaviors.


Our minds utilize varying amounts of energy depending on the task. When there is no need to deploy attention, and minimal energy is needed, we are in a state of cognitive ease, but when our minds mobilize attention, they use more energy and enter a state of mental stress.


These fluctuations in the brain's energy levels significantly impact how we behave.


When we are in a state of cognitive ease, the intuitive System 1 takes charge of our thinking, while the logical and more energy-demanding System 2 becomes weakened. This means that we are more intuitive, creative, and happy but more prone to making mistakes.


When we experience cognitive strain, our heightened awareness puts System 2 in charge. System 2 is more prepared to double-check our judgments than System 1, so we make fewer mistakes even if we are far less creative.


You can actively adjust the energy your mind requires in order to get into the right frame of mind for a given task. If you want a message to be persuasive, for example, promote cognitive ease.


We enter a state of cognitive ease when we see something familiar. When clear messages are repeated to us or made more memorable, our minds respond positively to them.


Cognitive strain, on the other hand, helps us succeed at tasks such as statistical analysis.


Exposing ourselves to complex information, such as hard-to-read text, can put us into this state. Our minds perk up and mobilize more energy to comprehend the problem, making us less likely to simply give up.



8. The presentation of probabilities influences our risk assessment when taking chances.


How ideas and problems are conveyed dramatically influences our perception and approach; even slight changes to the specifics or focus of a statement or inquiry can have a drastic impact.


The way we assess risk is an excellent example of this.


Even with precisely established probabilities, changing how we express a figure can change how we approach it. You might believe that once the probability of a danger has been determined, everyone will approach it the same way, but this is not the case.


For example, expressing a rare event in terms of relative frequency rather than statistical likelihood increases people's inclination to believe it will happen.


As part of the Mr Jones experiment, researchers asked two groups of psychiatric professionals whether it was safe to discharge Mr Jones from the psychiatric hospital. The researchers informed the first group that patients like Mr Jones had a "10 per cent probability of committing an act of violence," while the second group was told that "of every 100 patients similar to Mr Jones, 10 are estimated to commit an act of violence." Although the two statements describe the same risk, nearly twice as many respondents in the second group, who heard the frequency framing, refused to discharge Mr Jones.


Denominator neglect happens when we disregard straightforward data in favour of vivid mental images that impact our decisions, diverting our attention from what is statistically significant.


Compare these two statements: "This drug protects children from disease X but has a 0.001 per cent chance of permanent disfigurement" versus "One of 100,000 children who take this drug will be permanently disfigured." Although the two statements are equivalent, the latter conjures up the image of a disfigured child and is far more persuasive, so we would be less likely to administer the drug.
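The two statements really do describe the same probability, which a one-line check confirms (exact rational arithmetic avoids floating-point noise):

```python
from fractions import Fraction

percent_framing = Fraction("0.001") / 100   # "a 0.001 per cent chance"
frequency_framing = Fraction(1, 100_000)    # "one of 100,000 children"

# Identical risk; only the mental imagery differs.
assert percent_framing == frequency_framing
```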



9. Not robots: Why we don't make decisions based solely on logic.


How do we, as people, make decisions?


For a long time, a powerful and influential group of economists argued that we all make choices according to utility theory. This theory states that when individuals make decisions, they look only at rational facts and choose the option with the best overall outcome for them, i.e., the most utility.


For example, utility theory proposes that if you prefer oranges over kiwis, you will also choose a 10% chance of winning an orange over a 10% chance of winning a kiwi.


It seems obvious.


Milton Friedman led the most influential group of economists in this field, the Chicago School of Economics. Using utility theory, the Chicago School argued that individuals in the marketplace are ultra-rational decision-makers, whom economist Richard Thaler and legal scholar Cass Sunstein later named Econs. As Econs, individuals all act alike, valuing goods and services based on their rational needs and judging their wealth purely by its final amount, regardless of how they arrived at it.


So, assume two people, John and Jenny, both have fortunes of $5 million. According to utility theory, they have the same wealth, which means they should be equally satisfied with their financial situation.


But what if we complicate things a little? Assume their $5 million fortunes are the end result of a day at the casino, and the two had vastly different starting points: John came in with a mere $1 million and quintupled his money, whereas Jenny came in with $9 million and dwindled down to $5 million. Do you think John and Jenny are equally happy with their $5 million?


Unlikely. Clearly, there is more to how we value things than just utility.


As we'll see in the next blink, we can make strange and seemingly unreasonable decisions because we do not weigh utility as rationally as utility theory suggests.



10. Gut feeling: Why are we frequently influenced by emotions rather than making decisions based exclusively on rational considerations?


If utility theory does not work, what does?


One alternative is prospect theory, which the author created.


Kahneman's prospect theory undermines utility theory by demonstrating that we do not always act rationally when making decisions.


Consider these two scenarios: In the first scenario, you receive $1,000 and must choose between receiving a definite $500 or taking a 50% chance of winning another $1,000. In the second scenario, you receive $2,000 and must choose between losing a definite $500 or taking a 50% chance of losing $1,000.


In the first case, most individuals choose a safe bet, whereas in the second case, most people take a chance. If we made entirely logical decisions, we would make the same decision in both situations.
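That the two scenarios are logically identical is easy to verify: both offer a sure $1,500 or a 50/50 gamble between $1,000 and $2,000. A minimal check:

```python
# Scenario 1: start at $1,000; take +$500 for sure, or 50/50 for +$1,000.
scenario1_sure = 1_000 + 500
scenario1_gamble = sorted([1_000 + 0, 1_000 + 1_000])

# Scenario 2: start at $2,000; take -$500 for sure, or 50/50 for -$1,000.
scenario2_sure = 2_000 - 500
scenario2_gamble = sorted([2_000 - 1_000, 2_000 - 0])

assert scenario1_sure == scenario2_sure          # $1,500 either way
assert scenario1_gamble == scenario2_gamble      # [$1,000, $2,000] either way
# A purely rational agent would therefore choose the same option in both.
```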


Prospect theory helps explain why. It highlights at least two reasons why we don't always act rationally, both connected to loss aversion: the fact that we dread losses more than we value equivalent gains.


The first reason is that we value things relative to reference points. Starting with $1,000 or $2,000 in the two scenarios changes whether we're willing to gamble, because the starting point affects how we value our position. Ending up at $1,500 feels like a win in the first scenario but a painful loss in the second. We understand value relative to where we started, even though this is clearly irrational by utility theory's standards.


The second reason is the diminishing sensitivity principle: the value we perceive can differ from the actual worth. For example, going from $1,000 to $900 hurts less than going from $200 to $100, even though the monetary value of both losses is equal. Similarly, going from $1,500 to $1,000 feels like a bigger loss than going from $2,000 to $1,500.
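Diminishing sensitivity follows from any concave value function. The square root below is just an illustrative choice, not the function Kahneman and Tversky actually fitted, but it reproduces both comparisons from the text:

```python
import math

v = math.sqrt   # an illustrative concave value function

# The same $100 loss hurts more from a smaller starting point:
assert v(200) - v(100) > v(1000) - v(900)

# And a $500 drop from $1,500 feels larger than one from $2,000:
assert v(1500) - v(1000) > v(2000) - v(1500)
```

Any function whose slope flattens as amounts grow would show the same pattern; the specific shape only changes the magnitudes.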



11. False images: why the mind creates complete pictures to explain reality, and how they lead to overconfidence and errors.


Our minds naturally use cognitive coherence to understand situations; we construct complete mental pictures to explain ideas and concepts. For example, we have many images in our brains of the weather, such as a picture of a bright, hot sun bathing us in heat during the summer.


We use these visuals not only to help us understand things but also to make decisions.


When we make decisions, we refer to these images and base our assumptions and conclusions on them. For example, if we want to know what clothes to wear in the summer, we base our decisions on our perception of the season's weather.


Even if available statistics and data contradict our mental images, we let them guide us. For example, even if the weather forecaster predicts cool weather, you may still dress in shorts and a T-shirt because that's what your mental image of summer tells you to wear, even if you shiver outside!


In summary, we place immense confidence in our often faulty mental images, yet there are strategies to overcome this overconfidence and begin making better predictions.


To avoid mistakes, use reference class forecasting. Instead of making judgments based on general mental images, use specific historical examples for a more accurate forecast. For example, consider the last time you went out on a cold summer day and what you wore.


In addition, you can develop a long-term risk policy that plans specific measures for both success and failure in forecasting. By preparing and protecting, you can rely on evidence rather than general mental images and make more accurate forecasts. In the case of our weather example, this could imply bringing along a sweater just to be safe.



Final Summary

The book "Thinking, Fast and Slow" shows that our minds are divided into two systems: the first acts instinctively and requires little effort, while the second is deliberate and demands much attention. Our thoughts and actions change depending on which system is in control at the time.



Actionable guidance.

Repeat the message!

Messages repeated frequently without negative consequences come to seem fundamentally good, which is why repeated exposure makes them more persuasive.


Do not let uncommon statistical incidents that are overreported in newspapers sway you.

Disasters and other occurrences are a significant part of our history, yet we frequently overestimate their statistical likelihood due to the vivid imagery we associate with them in the media.


Being in a good mood enhances your creativity and intuition.

When you're in a good mood, the alert and analytical portion of your mind relaxes, giving control to the more intuitive and speedier thinking system, which also makes you more creative.
