Don't accept a Nobel Prize -- and other tips for improving your rationality

Book Review:

Stuart Sutherland, "Irrationality", Pinter & Martin, 2007.


Back in 1992, the British writer and professor of psychology Stuart Sutherland (now deceased) published a book simply titled Irrationality, which foreshadowed Daniel Kahneman's Thinking, Fast and Slow by nearly two decades. Recently, I finished reading the second (2007) edition, although a newer edition (2013) is also available.


In the preface, Sutherland states his mission: to demonstrate, using research from psychology, that irrational behavior is the norm (not the exception) in our everyday lives. And this book covers a wide range of irrational phenomena, many of which will be familiar to those who've read Kahneman or who follow rationalist blogs like Less Wrong. And if you're not familiar with the literature, this book will convince you that indeed, humans are probably not rational creatures (in case you needed to be convinced of that).

Since I believe this topic is so important, it is worth summarizing the contents of the book at length here.

***

Introduction
  • People as far back as Aristotle (and notoriously, modern economists) have assumed that humans are rational beings. However, people find it hard to suspend judgment when there is insufficient information; emotions cloud our judgment; and most importantly, there are numerous inherent defects in how we think -- which lead to deficits in rational thinking (i.e. reaching the conclusion that is most likely to be correct given one's knowledge) and rational action (i.e. taking the action that is most likely to achieve one's objective given one's knowledge).
  • Rational thought assumes that there are constant "laws" governing the world.
  • Being rational does not guarantee that you will always attain the best outcome and always avoid error, because (a) accidents could still happen, and (b) there is always an element of luck/chance in every decision. However, in the long run, the rational decision will tend to be the best you can do.
  • Perhaps we can consider it "irrational" to pursue an end (goal) that is impossible to achieve, to have contradictory ends, or to fail to think about or prioritize one's ends. However, the rationality of an end is separate from its morality, so the primary focus of rationality is really the means.
  • Common causes of everyday irrationality include:
    • People do not take enough time to think things through when making complex and important decisions.
    • People fail to write down on paper the various pros and cons of different actions.
    • People neglect elementary concepts from mathematics and statistics.
    • Organizations are structured in ways that incentivize selfish behavior on the part of their members, to the detriment of the organization as a whole.
    • People often distort their thoughts about reality (via self-deception or wishful thinking) to feel happier or more comfortable -- which can lead to unjustifiable conclusions about the world.
  • Brain injuries and severe mental illnesses (e.g. schizophrenia) can contribute to irrationality, but experimental research by psychologists has found that most people are prone to certain errors.
The wrong impression
This chapter deals with the most common source of flawed thinking.
  • People often skip the facts in favor of whatever makes the deepest impression or comes to mind first. This is called the availability error, and it permeates our thinking.
    • Recency, strong emotions, and dramatic concrete imagery all make something more "available".
  • Experiments with the Prisoner's Dilemma found that subjects who listened to a story of kindness (e.g. donating a kidney to a stranger) before playing the game were more likely to cooperate. The availability error also leads people to overestimate their chances of dying a violent death (probably because such events are reported more in the media). Doctors are more likely to mistakenly diagnose a patient with an illness if they have recently seen many cases of that disease.
  • The primacy error is when our reasoning is influenced more by items that come early in a sequence than by those that come later -- first impressions being the classic case. This is a form of the availability error, because the early items "are immediately available in our minds when we encounter the rest", as Sutherland writes, so we interpret later evidence in light of earlier beliefs.
  • Another related phenomenon is the halo effect -- one salient/available good trait, like physical attractiveness, can make us judge a person's other characteristics (like their intelligence, athleticism, sense of humor etc.) as better than they actually are. Ditto for salient bad traits (the "devil effect"). The halo effect implies that interviews are a poor way to select a candidate, because the salient aspects are given too much weight.
Obedience
This chapter and the next six deal with social and emotional causes of irrationality.
  • Stanley Milgram's experiments in the 1960s showed that the habit of subconsciously obeying authority figures is ingrained in people. To some extent we need authority to organize groups, so our parents, teachers, bosses and the law all teach us obedience. However, unthinking obedience and conformity can go too far, as the example of Nazi Germany shows.
  • People are sometimes reluctant to express their own views when somebody has authority over them; for example, a number of plane crashes happened because the co-pilot did not dare to call out the pilot for making an error.
  • To decide whether obedience is rational in a particular instance, you have to actually think about it and ask yourself whether the command is justified.
Conformity
  • We tend to conform to the behavior of our peers, usually without being aware of it. Solomon Asch famously conducted an experiment wherein the majority of his subjects gave an incorrect answer after the other "subjects" (who were actually stooges/confederates of the researcher) all agreed on the same incorrect answer. Interestingly, if just one other person gives the (obviously) correct answer, most subjects will also give the correct answer.
  • Conformity may be motivated by a fear of social rejection, but it can have negative consequences because people tend to associate with others who share their beliefs. As a result, people are rarely exposed to counter-arguments to their core beliefs, and they become reluctant to change their minds.
  • Public commitments are harder to escape from than private ones. On the one hand, this leads to the boomerang effect, whereby people become more convinced of their beliefs when they are challenged. On the other hand, public commitments to lose weight or quit drinking are more effective.
  • Stuart Sutherland argues that the desire to conform to the cycle of women's fashion is mostly irrational, because you cannot achieve any of the other attributes of film stars or high society women simply by copying their dress. It is also irrational to think that someone who is a good baseball player must therefore have good judgment about the hair cream that they've been paid to promote.
  • Conformity in crowds can lead to panic, violence, and religious conversion. Crowds give their members a feeling of anonymity, and if deviant behavior isn't punished, people will copy it (e.g. walking across the street when the light is red). The bystander effect is when people feel less obligation to intervene in an emergency when there are several other witnesses.
In-groups and out-groups
  • Belonging to a group leads not only to conformity but can also make the attitudes of the group's members more extreme. Groups are more willing to take risks than their members are individually (the "risky shift"). Groupthink suppresses criticism and dissent from within, which can lead to bad decision-making in committees.
  • Sutherland argues that wearing uniforms creates social distance and encourages irrational behavior. He backs this up by highlighting an experiment wherein subjects who wore a nurse's outfit became less aggressive and those who wore KKK robes became more aggressive.
  • Muzafer Sherif's experiments in the 1940s and 50s conducted on summer camp boys showed that mere division into two separate camps produced inter-group enmity and conflict. This might imply that competitive sports actually foster animosity rather than amity between different cities or countries.
  • It is hard to take pride in one's own group without regarding other groups as inferior; nevertheless, challenges that require groups to cooperate on a common task may decrease the hostility between the groups (as long as the combined effort is successful).
  • Prejudicial stereotypes are common for a number of reasons: stereotypes are convenient; we tend to pay more attention to things that confirm our prior opinions; the actions of minority group members (especially bad behavior) are more conspicuous/available because they are rarer; stereotypes can be self-fulfilling; stereotypes may sometimes have a basis in reality; attaching labels to things increases perceived differences; the halo effect also applies to groups; in-group conformity can increase prejudiced beliefs; and "do-gooders" may inadvertently reinforce stereotypes when trying to explain them away. Unfortunately, prejudice has caused much misery throughout human history.
Organizational folly
  • A rational organization is one that adopts the best available means of pursuing its ends... yet in practice, organizations tend to follow traditional practices or are structured so as to reward the selfish behavior of their members.
  • Leslie Chapman has written two books ("Your Disobedient Servant" and "Waste Away") chronicling his experience with wasteful practices in the British Civil Service.
    • Irrational waste occurs because decisions are often made in committees, which means that no individual is accountable. Promotions are determined largely by seniority, which means that people are not incentivized to take risks that might overcome inertia. In public organizations, money is often allocated based on previous years' expenditure, which encourages overspending. Greed and the desire for prestige lead to excessively lavish provisions for upper management. In some countries (like Britain), public bodies operate under secrecy, which makes it harder for the press to broadcast instances of waste.
  • Making public services more transparent and making civil servants use the public services themselves may help to improve the situation.
  • Even in private businesses, managers sometimes prioritize tradition over making profit, or they lose money simply because they failed to keep detailed financial records. Since three out of four new small firms in the U.S. go bankrupt within four years, it seems that businessmen are overly hopeful. Financial advisers tend to underperform the market (in part because they extrapolate from historical charts, even though share prices move essentially at random).
Misplaced consistency
  • People seek to be consistent in their beliefs, and the halo effect is one example of this. Another phenomenon is that people seek to justify their commitments, so they focus on the reasons why they did the right thing (as opposed to the reasons why they may be wrong).
  • The sour grapes effect is that people tend to lower their opinion of anything they don't or can't get.
    • On the other hand, something that is hard to get becomes more attractive, as long as you choose to undergo the rigors of getting it.
  • Salesmen use the "foot-in-the-door" technique to slide step by step from a small deal to a large one. This can be an effective way to get people to do something they otherwise would not have done.
  • One form of irrationality is when people continue to do something that is not in their interest just because they have previously invested money, time or effort into it. This is called the sunk cost error. We find it hard to acknowledge our own errors, even to ourselves.
    • Investments should be based on the present situation and look to the future, not the past (except insofar as you can learn from the past).
  • Consistency effects can also lead people to make up reasons for doing something distasteful (e.g. having to do a boring task and pretending to yourself that it was enjoyable). This may plausibly be explained by cognitive dissonance theory, although Sutherland does not mention it here.
Misuse of rewards and punishment
  • Giving someone a reward for doing a pleasant task actually devalues it! Experiments show that adults are more likely to donate blood if they are not given a monetary incentive than if they are. This contradicts the behaviorist theory that rewards increase the motivation to perform an activity.
    • Note that not all rewards have this effect. For example, praising someone for performing well does not devalue the task. The same is true for self-praise or intrinsic rewards.
    • Evidence suggests that workers perform better on the job if they are motivated by pride in their work, rather than carrots and sticks (like salary).
  • Research also indicates that people who are trying to win a prize will be less imaginative and flexible in their work than equally talented people who are not. Giving prizes seems to increase the sum total of human unhappiness (because for each person who wins a prize, there are many who do not) -- although prizes are different from supporting a useful piece of work that would not otherwise have happened (e.g. through scientific grants). Hence Sutherland's tongue-in-cheek advice to turn down a Nobel Prize if you happen to be offered one.
  • Threatening children with severe punishment actually makes them like a forbidden toy more than threatening them with mild punishment does. Indeed, children who are punished less tend to be more obedient.
  • People prefer things that they chose freely over things that are forced upon them -- even if it's the same thing! This is irrational; however, it is also irrational to force compliance on others, and in the context of medicine, giving patients more choice is associated with less anxiety and better outcomes.
Drive and emotion
  • Strong emotions (like jealousy, depression or sadness) and stress can impede the concentration needed for rational thought and rational decision making.
  • When facing a difficult task, rewards can make us try too hard, which limits flexibility of thinking and accentuates the availability error.
  • People act impulsively in cases like smoking, overeating, heavy drinking, taking addictive drugs and having unprotected casual sex -- which may not be in their best interest in the long-run.
    • However, Sutherland argues that the puritanical, self-punitive response that some people have (e.g. spending hours jogging and eating nothing but so-called "health foods") is also irrational.
  • Emotions are neither rational nor irrational, but they can lead to irrational acts. For example, the Chernobyl disaster might have been caused by boredom. Irrationality also arises from conflicting motives (e.g. wanting to be dominant and likable), or not thinking about one's goals and the many possible consequences of one's actions.
Ignoring the evidence
This chapter and the next nine deal with errors produced simply by our inability to think straight.
  • Admiral Kimmel was in charge of the American Pacific Fleet in 1941, during which the Japanese attack on Pearl Harbor occurred. Conceivably, the damage could have been mitigated had Kimmel thought of alternatives to the binary of "doing nothing" or "going on full alert". But despite the warning signs, he stuck to his existing belief that there was no need for further action.
  • People are reluctant to relinquish their views. Admitting error and apologizing is difficult. However, even when self-esteem is not involved, people go to extreme lengths to avoid evidence that might refute their own beliefs, and when they do find such evidence, they refuse to believe it.
  • The "2, 4, 6 task" has been used in experiments to show that people try to confirm their current hypothesis rather than disconfirm it. A rule cannot be proved with certainty, but a single discrepant observation refutes it -- hence why this tendency is irrational.
  • In everyday life, we seek confirmation of our own opinions of ourselves. For example, American college students who viewed themselves unfavorably preferred to share a room with others who also viewed them unfavorably.
Distorting the evidence
  • General Montgomery was responsible for the British operation at Arnhem during the Second World War, but his plan was poorly designed. Even though he was given intelligence indicating that the plan was bound to fail, he dismissed the report as "ridiculous". His insistence resulted in needless deaths.
  • An experiment wherein subjects were presented with "evidence" that the death penalty does or does not have a deterrent effect showed that (a) subjects noticed more flaws in the studies that went against their personal beliefs about the death penalty; (b) the strength of their attitudes was increased by supporting evidence but not weakened by contrary evidence; and (c) after reading both pieces of evidence (a study pro and a study con) the subjects' original beliefs were strengthened! 
  • People's persistence in belief cannot be explained merely by the need to defend their self-esteem; the desire to confirm a current hypothesis is caused (at least in part) by a failure to think correctly. Another contributing factor is that people are really good at inventing explanations for events or phenomena -- indeed, people can use almost any event in a person's past life to explain why he or she did something later on.
Making the wrong connections
  • Psychoanalysts believe (contrary to the evidence) that their treatments help their patients; this is an example of faulty reasoning about causes and connections, because the psychoanalyst fails to seek out information about similar patients who do not receive psychoanalysis. This error is known as illusory correlation.
  • To conclude that there is an association between two events A and B, you need a 2x2 table with data for all four cells: "A and B", "A but not B", "not A but B", and "neither A nor B". Quite often, doctors fail to notice the negative evidence, i.e. the number of patients who have neither the symptom nor the disease. (This may be caused by the availability error; see the sketch after this list.)
  • Psychologists and psychiatrists still sometimes use projective tests like the Rorschach or Draw-a-Person Test despite these tools being useless at diagnosis; this is presumably because the therapists fail to see associations that would be obvious if they looked at the numbers (all four quadrants). Instead, they go along with their preconceptions. Companies in continental Europe make the same mistake when they employ graphologists to select personnel.
  • People tend to pay more attention to group members who are different from the rest, and subsequently evaluate them in more extreme terms. People also tend to associate a rare quality with a rare type of person, such that bad behavior gets associated with minority groups (e.g. Jews or blacks). This is also an example of irrational thinking.
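To make the 2x2-table point concrete, here is a minimal sketch in Python (the numbers are made up purely for illustration, not taken from the book). It shows how a symptom can look linked to a disease if you only attend to the "symptom and disease" cell, even though the full table reveals no association at all.

```python
# Hypothetical 2x2 table for a symptom (A) and a disease (B).
# All counts are invented for illustration only.
a_and_b = 80      # symptom present, disease present
a_not_b = 320     # symptom present, disease absent
not_a_b = 20      # symptom absent, disease present
neither = 80      # symptom absent, disease absent

p_disease_given_symptom = a_and_b / (a_and_b + a_not_b)        # 0.20
p_disease_given_no_symptom = not_a_b / (not_a_b + neither)     # 0.20

print(p_disease_given_symptom, p_disease_given_no_symptom)
# Both are 0.20: despite 80 "symptom and disease" cases, the symptom tells
# us nothing about the disease. Ignoring the bottom row of the table (the
# "negative evidence") is what produces the illusory correlation.
```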
Mistaken connections in medicine
  • A mistake that doctors often make is to confuse the conditional probability of X given Y with its inverse (the probability of Y given X). For example, a woman who undergoes a mammography to test for breast cancer may have a 92% probability of testing positive if she has breast cancer, but what we want to know is the probability that she has breast cancer if the test result is positive, which may only be 1%. One survey of doctors in the U.S. found that 95% of them got this wrong! (A worked example follows this list.)
  • In addition to mammography, a surgical breast biopsy can help diagnose cancer. However, this procedure can have severe complications (such as death), so women have to choose how much risk of cancer they are willing to accept. To make a good decision, subjective feelings of uncertainty should be translated into mathematical probabilities, and doctors should use the statistics of similar cases.
  • Unfortunately, doctors tend to be overconfident that their diagnosis is correct. One study found that when doctors were 88% confident that they correctly diagnosed pneumonia, only 20% of their patients actually had pneumonia.
  • Another form of irrationality is not telling patients undergoing a medical procedure what to expect. Evidence shows that well-informed patients recover more rapidly from abdominal surgery; yet doctors frequently think it a waste of time to talk to patients.
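Here is a worked version of the mammography point, as a rough sketch. The 92% sensitivity comes from the summary above; the prevalence and false-positive rate are assumptions I have added just to make the arithmetic run, so the final figure is illustrative rather than the book's.

```python
# Bayes' theorem with illustrative numbers (prevalence and false-positive
# rate are assumed; only the 92% sensitivity comes from the summary above).
sensitivity = 0.92       # P(positive test | cancer)
prevalence = 0.01        # P(cancer) among women screened (assumed)
false_positive = 0.10    # P(positive test | no cancer) (assumed)

p_positive = sensitivity * prevalence + false_positive * (1 - prevalence)
p_cancer_given_positive = sensitivity * prevalence / p_positive

print(round(p_cancer_given_positive, 3))   # ~0.085
# P(cancer | positive) is under 9% even though P(positive | cancer) is 92%:
# the two conditional probabilities are nothing like each other.
```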
Mistaking the cause
  • Psychoanalysis is based on a "primitive" way of thinking, namely the assumption that the causes of a phenomenon must resemble the phenomenon itself -- just as homeopathy is based on the idea that a substance which produces an illness can also cure that illness when given in tiny amounts.
    • If we know that people with high blood cholesterol levels are more prone to heart disease, does this mean that consuming less dietary cholesterol will increase your longevity? No. That is the fallacy of like causes like.
  • In medical research, false theories (e.g. that "Type A" personalities are more susceptible to heart attacks) can flourish due to journals selectively publishing findings that are new and interesting.
  • Another error is to mistake the effect for the cause. For example, people who like their psychotherapists tend to recover faster from mental illness, but this doesn't mean that liking one's therapist is an important factor in therapy -- it could be that patients who make more progress end up liking their therapists more.
  • Interestingly, people show more confidence in reasoning from cause to effect than from effect to cause. People are more likely to attribute the cause to an agent (e.g. a person) when the nature of the effect is dramatic. Irrationally, we hold people more responsible for the same action if the consequences were more serious (it is as if we blamed someone who breaks a jar of marmalade by accident just as much as someone who angrily smashes it on the ground). We also hold people more responsible for actions that harm us personally or our friends than for actions that harm a stranger.
  • Events usually have many possible causes, but people tend to pick the most unusual or interesting one as the cause. An example of this is the fundamental attribution error; that is, attributing an action to someone's disposition/character rather than the situation.
    • We do this more for other people's behavior than our own, because we find the situation we are in more salient (visible) than our own actions.
    • Personality traits are less important than we think, because people's behavior is surprisingly inconsistent from one occasion to another. But due to illusory correlation, we only notice the aspects of a person's behavior that confirm our initial snap judgments.
  • We tend to think that others are more similar to ourselves than they actually are (perhaps due to the availability error).
  • People can be mistaken about the causes of their moods and emotions. Experiments have found that the physiological arousal caused by riding an exercise bicycle can make male subjects rate female nude pictures as more sexually stimulating. Another study found that women judged their quality of sleep as the most important factor in their mood the next day, while objective mathematical analysis revealed a much stronger association between the day of the week and mood.
Misinterpreting the evidence
  • People misinterpret the evidence in highly systematic ways even when they have no preconceptions! One example is the representativeness error: people think that THHTTH (where T stands for "tail" and H stands for "head") is more likely to occur when tossing a fair coin six times than is TTTTTT even though both sequences have the same chance.
  • In one experiment, subjects rated the statement "Linda is a feminist bank teller" as more likely to be true than the statement "Linda is a bank teller". The subjects were given information that Linda is "single, outspoken and participated in anti-nuclear demonstrations as a student", so they thought that Linda's description is typical/representative of a feminist. However, their judgment of the statements was incorrect -- there must be more women bank tellers in total than women bank tellers who are feminists (because some bank tellers are not feminists).
    • People tend to average two probabilities together when the probabilities should be multiplied. Hence people are more likely to believe something implausible when they are simultaneously told something highly plausible. This is irrational because the probability that all material is true is always reduced by adding extra material.
  • People think that smoking is more likely to cause lung cancer than heart disease, but since heart disease is much more common than lung cancer (and smoking also increases the risk of heart disease), smoking actually kills more people through heart disease than through lung cancer. This demonstrates a failure to use Bayes's Theorem, which says that new information about the probability of an event must be combined with the prior probability (or base rate) of the event.
    • Another example is the use of lie detectors: since there are many more innocent than guilty people (the base rate), a lie detector will wrongly flag more innocent people than it correctly catches culprits (see the sketch after this list).
  • Most people lack adequate knowledge of statistics or basic probability theory; however, in many cases rational thought (i.e. reaching the conclusion most likely to be correct) must be based on the manipulation of numbers.
  • People are ignorant of the law of large numbers, which says that larger samples are more likely to reflect the true frequency of an event. Ignoring sample size leads students to select courses based on face-to-face discussions with a few senior students rather than ratings collected from a large number of other students.
  • Even large sample sizes can be biased by not being representative of the population. Experiments show that even when people are told that an inhumane prison guard is not typical, they judge the prison service as a whole to be nasty based on that single striking case.
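As a rough sketch of the lie-detector point, here is the base-rate arithmetic. The accuracy figures and the number of suspects are my own assumptions, not numbers from the book.

```python
# Base-rate sketch for the lie-detector example (all numbers assumed).
suspects = 1000
guilty = 20                      # base rate: culprits are rare
innocent = suspects - guilty
hit_rate = 0.90                  # P(flagged | guilty)
false_alarm_rate = 0.10          # P(flagged | innocent)

flagged_guilty = guilty * hit_rate              # culprits correctly flagged
flagged_innocent = innocent * false_alarm_rate  # innocents wrongly flagged

print(round(flagged_guilty), round(flagged_innocent))   # 18 vs 98
# Even a fairly accurate detector flags far more innocent people than
# guilty ones, simply because innocents vastly outnumber the guilty.
```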
Inconsistent decisions and bad bets
  • The expected value of a bet is the amount to be won multiplied by the probability of winning, minus the amount to be lost multiplied by the probability of losing. Expected value is relevant because over a large number of bets, taking the gamble with the highest expected value will maximize the odds of achieving your goals. However, people irrationally accept bad bets (e.g. the lottery); a small worked example follows this list.
  • People seem to be irrationally influenced by certainty. For example, experimental subjects would prefer to be vaccinated with full protection against a virus that infects 10% of the population, instead of being vaccinated with a 50% chance of protection against a virus that infects 20% of the population -- even though both options reduce the chance of infection by the same amount (ten percentage points).
  • People take inconsistent decisions because they let themselves be influenced by how the decision is framed. In one experiment, subjects were told that a rare disease is expected to kill 600 people; when asked to choose between a program that saves 200 lives with certainty or a program that saves all 600 lives with a probability of 33% (and saves no one otherwise), most choose the first option. But when asked to choose between a program where 400 die for certain or a program where 600 die with a probability of 67% (and no one dies otherwise), most choose the second. Thus, when something is framed in terms of losses, people are more prone to taking risks.
  • We weigh losses more than equivalent gains due to the fact that we are reluctant to part with what we already have (this is known as the endowment effect, although Sutherland doesn't use the term in this book).
  • Inconsistent decisions arise when people prefer A to B, B to C, C to D, D to E, and E to A. This might happen when we ignore small differences in the evidence, which could add up to outweigh the more salient differences.
  • Is it worth traveling to a different store to save $5 on a purchase? The answer should depend on how much you value the absolute amount ($5) saved, not on the percentage you save on the item.
  • Even when no numbers are involved, people can be irrationally influenced by the power of suggestion. Research shows that subjects who were asked "how fast were the cars going when they smashed into each other?" recalled more broken glass than those who were asked "how fast were the cars going when they hit one another?" 
  • People are influenced by the two end points of a scale and tend to pick a number near the middle. They also tend to pick a number close to any number with which they were initially presented. These are known as anchoring effects. When different anchoring points are used, people give different judgments -- even though the anchor has no bearing on the correct answer!
    • The anchoring effect contributes to the phenomenon that people overestimate the probability of an event that depends on a whole sequence of other events occurring (because they stick too close to the probabilities of the individual events rather than multiplying them). It also contributes to the tendency of people to underestimate the time needed to complete a large-scale project (because they underestimate the chance that at least one of the many things that could go wrong will go wrong).
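Here is a minimal sketch of the expected-value calculation from the start of this chapter, applied to a made-up lottery-style bet (the ticket price, prize and odds are invented for illustration).

```python
# Expected value of a bet: win * P(win) - loss * P(loss).
# The lottery-style numbers are invented for illustration.
ticket_price = 2.0
prize = 1_000_000.0
p_win = 1 / 1_000_000

expected_value = prize * p_win - ticket_price * (1 - p_win)
print(round(expected_value, 2))   # about -1.0
# On average each ticket loses about a dollar, so buying them repeatedly is
# a bad bet -- however vivid and "available" the prize may be.
```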
Overconfidence
  • Overconfidence manifests itself when we think, in hindsight, that we could have predicted an event that already happened, or when we believe that we could have taken a better decision than someone else. Experiments support this. People indeed believe that they can predict the future from the past better than they actually can, and people even distort their recollection of past events (such as their previous opinions).
    • We are good at inventing causal explanations for what happened, but this means that we fail to take into account the role of chance in e.g. history. This hindsight bias prevents us from learning from the past and also makes us too confident in our predictions about the future.
    • In one experiment, subjects who were 100% confident that they spelled a word correctly, spelled the word correctly only 80% of the time.
  • The same bias can work in reverse: for very difficult questions, we can be overly confident that our answer is wrong, and hence underestimate our ability to answer correctly.
  • In real life, financial advisers on average do worse than the market in which they are investing. The construction industry (and defense industry) often underestimates the time and cost required to complete a project, even when there are penalties for being late. People are overconfident in their ability to control events -- this illusion of control is prevalent in gambling.
  • Why are humans overconfident? Sutherland offers five explanations: we fail to search for contrary evidence; it is often impossible to discover what the consequences of a different decision would have been; we distort both our memories and new evidence to fit with our beliefs; we construct causal stories for why we are right; and our self-esteem makes it unpleasant to be wrong.
Risks
  • Engineers often fail to take into account the limitations of the human operator. For example, the original cockpit layout of the Airbus A320 had poorly designed visual displays, which may have contributed to three crashes.
    • Engineers also neglect public reaction; the use of seat belts in cars may have encouraged reckless driving, thereby harming more pedestrians and cyclists.
    • By failing to consider the possibilities of failure (an error caused by overconfidence), engineers sometimes design unsafe devices. Modern technologies are highly complex, and since they contain so many crucial components, the probability that at least one of them fails may be quite high even if the probability of each individual part failing is very low (see the sketch after this list).
  • Most major accidents are arguably caused by bad management, since management sets the rules that the operators simply follow.
  • The general public also has irrational attitudes toward risk. Research has found that warnings about risks have little or no effect on behavior. People also overestimate the risk of dramatic accidents that kill many people at once (as opposed to the death of many people in different places at different times). People are afraid of new devices that they aren't used to.
    • An example of the latter is the perceived danger of nuclear reactors -- in reality, the estimated number of fatalities per unit of electricity produced is much higher for coal-fired stations. Perhaps people associate coal with cozy fireplaces, and nuclear power with the atomic bomb (a kind of halo effect).
    • In contrast to nuclear reactors, people don't protest the overuse of X-rays in hospitals despite X-rays causing about 250 unnecessary deaths per year in Britain. Presumably this irrationality arises because people are familiar with X-rays and associate them with better health.
  • Any form of technology, whether old or new, carries risks. It is plausible that horse-drawn coaches and oil lamps were more dangerous than motor vehicles and electric lighting are.
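As a quick illustration of the many-components point, here is the arithmetic under the simplifying assumption that parts fail independently (the failure probability and the number of parts are made up).

```python
# Chance that a system with many components suffers at least one failure,
# assuming (purely for illustration) independent component failures.
p_part_fails = 0.001   # each part fails with probability 0.1% (assumed)
n_parts = 2000         # a complex system with many crucial parts (assumed)

p_no_failure = (1 - p_part_fails) ** n_parts
p_at_least_one_failure = 1 - p_no_failure

print(round(p_at_least_one_failure, 2))   # ~0.86
# Individually reliable parts still add up to an unreliable whole -- the
# overconfidence trap engineers can fall into.
```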
False inferences
  • When we have several goals and many possible courses of action, we can suffer from information overload. Instead of choosing the optimal option, we tend to satisfice -- i.e. settle for an option that is "good enough". Sometimes this leads to irrational decisions, because the few factors that we consider may not be the most important ones (merely the most available).
  • We are prone to errors when predicting the likelihood of an event based on evidence. One error is to ignore the principle of regression to the mean, which says that for any outcome in which chance plays a role, an extreme result is likely to be followed by a less extreme one of the same kind. For example, on one occasion the food at a restaurant may be exceptionally good by chance (because chance plays a role in cooking), but the next time you visit, the food will probably be closer to the restaurant's average (a small simulation follows this list).
    • For this reason, people tend to make predictions that are too extreme (in either direction) and therefore likely to be wrong. Moreover, people show more confidence in predictions based on extreme predictor scores -- even though they should be adjusting the predictor score towards the average!
  • Another error is to use correlated qualities for predictions, for example using both "skill at addition" and "skill at subtraction" to assess whether someone would make a good clerk. This is irrational because the two are almost perfectly correlated, which means that you need only use one measure -- the second score adds virtually no new information. You could improve your prediction by including a factor that is poorly correlated with "skill at addition", such as "conscientiousness".
  • The gambler's fallacy is when people act as if the ball in a game of roulette has a memory, even though it will on average come down equally often on red and black. So when the ball has fallen on black six times in a row, people think it is more likely to fall on red on the next spin (perhaps due to the representativeness heuristic). However, the sequences BBBBBBB and BBBBBBR are equally probable.
  • People often see patterns in completely random events. When the Germans bombed London in WWII, people assumed that the East End was disproportionately targeted. Later statistical analysis revealed that the pattern of bombs was random.
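Regression to the mean is easy to see in a small simulation. The sketch below assumes each observed score is a stable underlying quality plus random luck; all the numbers are invented.

```python
import random

# Regression to the mean: each observed score = stable quality + luck.
random.seed(0)
n = 10_000
quality = [random.gauss(50, 10) for _ in range(n)]
first = [q + random.gauss(0, 10) for q in quality]    # first visit / test
second = [q + random.gauss(0, 10) for q in quality]   # second visit / test

# Select the cases whose first score was extreme (top 5%).
cutoff = sorted(first)[int(0.95 * n)]
extreme = [i for i in range(n) if first[i] >= cutoff]

avg_first = sum(first[i] for i in extreme) / len(extreme)
avg_second = sum(second[i] for i in extreme) / len(extreme)
print(round(avg_first, 1), round(avg_second, 1))
# The second scores of the top performers sit noticeably closer to the
# overall mean (50) than their first scores did -- pure chance, with no
# change in underlying quality.
```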
The failure of intuition
This chapter and the next one deal with methods of combining evidence that would (in theory) produce the best possible conclusions given that evidence.
  • Human intuition is remarkably bad, and we can often do better by using formal mathematical analysis. One such method is multiple regression analysis, which takes various bits of evidence, weighs each according to its relative value, and then adds the figures together. Numerous studies have shown that intuitive human judgment is almost never superior to actuarial prediction.
    • A nice feature of the actuarial method is that if you add a predictor variable that is completely unconnected to the outcome variable, it will automatically be assigned a weight of zero.
    • Interestingly, the actuarial method can predict happiness in marriage simply by subtracting the average number of fights a couple has per week from the average number of times per week they have sex.
  • The procedure known as bootstrapping can be used to calculate the average weight that an individual human judge implicitly assigns to each predictor. For example, the admissions officer of a graduate school may take into account a candidate's GPA, GRE score, and references. But since humans are not perfectly consistent in their evaluations (we have good days and bad days), it is possible to increase the validity of the predictions and reduce random error by using a mathematical model based on the judge's average weights (a minimal sketch follows this list).
  • Actuarial methods may not be appropriate for trivial decisions, and they only work if there is a large number (about 30+) of previous cases for which numbers have been recorded. Nevertheless, it is irrational that people resist using this method in cases where it does better than their unaided intuition.
    • Reasons for resisting the actuarial technique include (a) people want to believe that they have special skills and talents; (b) people believe that intuition has some magical quality that cannot be replaced by formal calculations (although ironically, people implicitly use an actuarial method, just not consistently); and (c) people think it "soulless" to take important decisions by models rather than people -- even though an interview with a candidate cannot tell you more about their abilities than four years' grades.
  • Interviewing is not a helpful technique for personnel selection, despite its widespread use. Studies have found that the contrast effect can make an applicant seem worse than they actually are simply because the previous applicant performed exceptionally well in the interview. Furthermore, evidence shows that cognitive ability tests are better predictors of future job progress.
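Here is a minimal sketch of the bootstrapping idea with invented data: fit a linear model to a hypothetical judge's own past ratings, then apply the recovered weights consistently to a new candidate. The predictors, weights and noise level are assumptions for illustration, not anything from the book.

```python
import numpy as np

# "Bootstrapping" a judge: recover the average weights the judge implicitly
# uses, then apply them without the judge's day-to-day inconsistency.
rng = np.random.default_rng(0)
n = 200
gpa = rng.normal(3.0, 0.5, n)
gre = rng.normal(160, 8, n)
references = rng.normal(5, 2, n)

# The judge implicitly weights the predictors but is noisy/inconsistent.
judge_rating = 2.0 * gpa + 0.1 * gre + 0.5 * references + rng.normal(0, 1.5, n)

# Estimate the judge's average implicit weights by least squares.
X = np.column_stack([gpa, gre, references, np.ones(n)])
weights, *_ = np.linalg.lstsq(X, judge_rating, rcond=None)
print(np.round(weights, 2))   # roughly [2.0, 0.1, 0.5, intercept]

# The model applies these weights perfectly consistently to a new candidate,
# stripping out the judge's random error.
new_candidate = np.array([3.6, 165.0, 7.0, 1.0])   # GPA, GRE, references, 1
print(round(float(new_candidate @ weights), 2))
```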
Utility
  • How should people act in order to best attain their goals? One model that can help is Utility Theory. This is related to expected value (discussed earlier), but with a few caveats: money has diminishing marginal utility, i.e. gaining $10 million is not ten times as desirable as gaining $1 million. Utility Theory takes this into account by letting you assign arbitrary numbers to represent how desirable different outcomes are to you.
  • To apply Utility Theory, you list the possible outcomes of each option, assign a number (utility) to each outcome depending on how desirable or undesirable it is, multiply the utility of each outcome by the probability of it occurring, add up these products to get the expected utility of each option, and finally select the option with the greatest expected utility (a minimal sketch follows this list).
    • Another caveat is that the utilities you assign must be consistent (so an outcome with a value of 60 is twice as desirable as one with a value of 30). You can represent a neutral outcome using zero, so that negative utilities represent costs and positive utilities represent benefits. You don't need to consider only your own happiness -- your ends may include benefiting someone else at your expense.
  • An extension of the above is Multi-Attribute Utility Theory, which breaks down each outcome into independent attributes. For example when deciding which car to buy, you can consider a list of attributes like "reliability", "acceleration", "comfort", "road-holding" and so on, and weigh each attribute according to its relative importance.
  • Utility Theory cannot tell you which outcomes to consider or what their probabilities are, but at least it combines the input in a rational way rather than the haphazard way of human thought.
  • Using Utility Theory can help people pay attention to factors that are contrary to their overall world-view, and thereby reduce disputes on committees. Utility Theory can be useful in medicine, for example in deciding when to perform aspiration of a potential tumor or cyst, or arteriography. Even when applying Utility Theory is too time-consuming for everyday decision-making, people can still make more rational decisions by writing down the outcomes and assessing their desirability and probability.
  • Limitations to Utility Theory include the fact that people don't always know what will make them happy; that people sometimes don't know what they want (although one can still try to avoid things one does not want); and that it may be rational to take an option that does not yield the maximum benefits but is safe and ensures nothing dreadful happens -- as long as that is what the person wants.
  • Economists use cost-benefit analysis to calculate expected monetary value; this includes attaching a cash value to human life, which many people find abhorrent, but in practice nobody acts as if a human life is worth an infinite amount of money. However, a weakness of cost-benefit analysis is that (unlike Utility Theory) it doesn't take into account non-monetary benefits and costs.
  • In medicine, decisions about which patient should get which treatment can be made more rationally using the concept of a Quality Adjusted Life Year (QALY). One QALY is equivalent to a year of life free from disability; a year lived with a given disability or disease counts for less, according to how many such years members of the general public would be willing to trade for one disability-free year. Calculating the expected QALYs of a given procedure can help avoid treatments that do patients more harm than good, and can also be used to prioritize treatment when not everybody can be helped. According to Sutherland, keeping people alive is not all that matters!
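Here is a minimal sketch of the expected-utility recipe described above, using invented options, utilities and probabilities (a hypothetical medical decision, not an example taken from the book).

```python
# Expected utility: for each option, sum utility * probability over its
# possible outcomes, then pick the option with the highest total.
# Options, utilities and probabilities are all invented for illustration.
options = {
    "operate": [(+60, 0.70),    # successful surgery
                (-100, 0.10),   # serious complication
                (0, 0.20)],     # no change
    "wait_and_see": [(+20, 0.50),
                     (-40, 0.50)],
}

def expected_utility(outcomes):
    return sum(utility * prob for utility, prob in outcomes)

for name, outcomes in options.items():
    print(name, expected_utility(outcomes))
# operate:       60*0.7 - 100*0.1 + 0*0.2 = 32
# wait_and_see:  20*0.5 - 40*0.5          = -10
# Under these assumed numbers, "operate" has the higher expected utility.
```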
The paranormal
The widespread but irrational belief in the paranormal is a good recapitulation of the various errors discussed in this book.
  • Paranormal phenomena (e.g. psychics, telepathy, gods, astrology, etc.) typically defy the known laws of physics, and lack evidence for their existence.
  • Paranormal beliefs originate from people's need to seek explanations instead of suspending judgment. Sometimes we think animistically (i.e. we treat anything that moves as an intentional agent). Additionally, most cultures have a history of believing in supernatural higher beings, perhaps due to a fear of death and the search for meaning.
  • Many other errors are at play too. Paranormal phenomena grab more public attention and are thus available. In families, groups or crowds, conformity and in-group pressures contribute to the spread of such beliefs. Vague predictions can easily be distorted to fit one's situation. By investing time and money in fortune tellers, people become vulnerable to misplaced consistency (in the form of the sunk cost error). And people are typically bad at calculating the odds of coincidence -- for example, if there are 23 people in a room, the chance that at least two of them share a birthday is over one half (see the calculation after this list).
  • During the Cold War, the Russians spent a lot of money on researching the paranormal because they thought telepathic communication could have military significance. In response, the U.S. armed forces did the same. This illustrates that irrationality is not limited by class, creed or education.
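The birthday coincidence mentioned above is easy to check. The calculation below assumes 365 equally likely birthdays and ignores leap years.

```python
# Probability that at least two of n people share a birthday.
def p_shared_birthday(n: int) -> float:
    p_all_distinct = 1.0
    for i in range(n):
        p_all_distinct *= (365 - i) / 365
    return 1 - p_all_distinct

print(round(p_shared_birthday(23), 3))   # ~0.507 -- already better than even
```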
Causes, cures and costs
What are the deeper causes of irrationality? What can be done to promote rationality, and is it even necessary or desirable?
  • Sutherland identifies five primary causes of irrationality. These are: (i) the evolutionary pressures our ancestors faced to act on impulse when it came to matters of shelter, sustenance and procreation (which still do not require a great deal of rationality); (ii) the network-like structure of our brains, which use the same nerve cells for learning different things, and which are better suited for fast unconscious processes (e.g. face recognition) than for careful thinking; (iii) the numerous heuristic tricks we use due to mental laziness; (iv) the failure to use elementary probability theory and statistics; and (v) the self-serving need to be right or support one's self-esteem.
  • To reduce irrationality, we could teach people some general principles: survey all the evidence before coming to a conclusion, be willing to change your mind, look for flaws in the arguments in favor of your own views, don't take decisions under stress or when in a hurry, suspend judgment when the evidence is lacking either way, write down the pros and cons before taking any important decision, and learn about the errors outlined in this book.
    • Learning statistical concepts (e.g. the law of large numbers, regression to the mean) has been shown to help people take good decisions in everyday life. Knowledge of economics and psychology can also help. Moreover, the ability to take rational decisions in everyday life is correlated with success at work (such as salary).
  • In the case of decision-making by experts (e.g. doctors, generals, engineers, judges, businessmen etc.) it is undoubtedly true that rationality is necessary and desirable. Human lives may be at stake. However, for trivial personal decisions, the effects of irrationality are less important (except perhaps for decisions like which house to buy, which career to follow, whom to live with, and whether to have children -- but even then, there are usually many unknowns).
  • We value spontaneity because it indicates sincerity and because it's less boring than slow pondering. But we don't like spontaneous bad actions like anger, frustration, depression or envy. In order to spontaneously act in good ways (but not bad ones), we need to carefully consider which actions to perform, and then practice doing them until they become natural.
***

So as you can see, Irrationality is a very comprehensive book, covering a lot of ground. Much of the content overlaps with Kahneman's Thinking, Fast and Slow and Baron's Thinking and Deciding (although of course both of those books contain material not in this one, and likewise, this book has material not contained in the other two).

Stuart Sutherland writes in a very clear and entertaining style, which made this book enjoyable to read. All the concepts are explained by reference to experimental research or practical examples. At the end of most of the chapters, Sutherland gives a few "morals", or lessons to take away. Some of these are tongue-in-cheek, for example "Don't trust Which? [a consumer publication]". "If you are a patient, set your doctor a simple test on elementary probability theory." "If you are a casino owner losing money, don't sack the croupier: it's not his fault." "Never allow an insurance salesman past your front door." "Never volunteer to become a subject in the Psychological Laboratory at Yale."

Some of the advice in this book may be somewhat misguided, for instance Sutherland's suggestion to eat whatever you want because the research on healthy diets is inconclusive. In places the tone is also slightly derogatory (e.g. accusing people of being asinine). But aside from these minor flaws, there is not much else wrong with this book. I gave it 5/5 stars on Goodreads.

For a good introduction to the research on human irrationality, this book is highly recommended.
