- Paperback: 256 pages
- Publisher: Pinter & Martin Ltd; 2nd edition (February 27, 2007)
- Language: English
- ISBN-10: 1905177070
- ISBN-13: 978-1905177073
- Product Dimensions: 5.6 x 0.6 x 8.7 inches
- Shipping Weight: 10.6 ounces
Irrationality 2nd Edition
Terrifying, sometimes comic, very readable and totally enthralling (Oliver Sacks)
Extremely gripping and unusually well written (Richard Dawkins)
About the Author
Stuart Sutherland was Professor of Psychology at the University of Sussex. A prolific columnist and contributor to the New York Times, he is best known for his books Irrationality and Breakdown.
Because the book covers its scope well and I can find no real fault with it, I see no reason to give it less than 5 stars. Here are some key points from the book:
(1) 'Irrationality' involves cognitive processes which are prone to result in inaccurate conclusions or suboptimal decisions relative to available evidence and time constraints. So irrationality will often lead to worse outcomes, but not always ('luck' is a factor also).
(2) The prevalence of irrationality may be due to our cognitive processes having been evolved for physical and social environments which are different from modern life. So what appears as a problematic 'bias' today may have been adaptive in the distant past. But we also need to remember that heuristic shortcuts, though prone to bias and inaccuracy, can also be very effective in many situations, and in the real world we don't have the luxury of thoroughly analyzing everything (rationality is bounded).
(3) Emotions (which can lead to self-deception and denial), stress, rewards, punishments, and even boredom can contribute to irrationality, but the issue is mostly cognitive.
(4) As situations become more complex, we tend to become more irrational. So taking a more systematic approach to making decisions can be helpful as complexity increases (eg, listing pros and cons of options).
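Point (4)'s systematic approach can be sketched as a simple weighted pros-and-cons tally. The options, factors, and the 1-to-5 weighting scale below are all hypothetical illustrations, not the book's own procedure:

```python
def score_option(pros, cons):
    """Score an option by summing weighted pros minus weighted cons.

    pros/cons are lists of (description, weight) pairs; weights are
    illustrative importance ratings, e.g. 1 (minor) to 5 (major).
    """
    return sum(w for _, w in pros) - sum(w for _, w in cons)

# Hypothetical example: comparing two job offers.
job_a = score_option(
    pros=[("higher salary", 4), ("short commute", 2)],
    cons=[("less interesting work", 3)],
)
job_b = score_option(
    pros=[("interesting work", 4)],
    cons=[("long commute", 2)],
)
print(job_a, job_b)  # 3 2
```

Writing the factors down forces you to enumerate them all, rather than letting the most cognitively available one dominate.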
(5) A good grasp of basic statistics can significantly reduce irrationality.
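As one illustration of point (5), a little probability theory guards against base-rate neglect, a classic irrationality. The numbers below (a 1% base rate, a 90%-sensitive test with a 5% false-positive rate) are hypothetical:

```python
def posterior(prior, sensitivity, false_positive_rate):
    """P(condition | positive test) via Bayes' theorem."""
    p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_positive

# Despite a "90% accurate" test, a positive result on a rare (1%)
# condition implies only about a 15% chance of actually having it.
print(round(posterior(0.01, 0.9, 0.05), 3))  # 0.154
```

Most people's intuition puts the answer far higher, because they ignore the base rate.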
(6) Many cognitive biases are tied to availability bias, which involves latching on to what comes to mind readily. Availability is tied to strong emotions, recency, first impressions, etc. Anecdotes feed availability bias.
(7) The halo effect is when people overgeneralize and infer too much about a person from a few traits, and it can undermine the effectiveness of interviews. It's generally better to assume that people are a mix of strengths and weaknesses.
(8) Deference to 'experts' and other authority figures, and conformity with groups (eg, peer pressure), are natural instincts, and they also provide people with 'cover'. As a result they can greatly compromise group effectiveness by promoting groupthink (homogenization of group cognition), increasing risk taking by the group, and even promoting violent group behavior. There can be a vicious cycle here, where group dynamics make groups become increasingly extreme rather than more moderate, and increasingly hostile towards other groups. Contrary to common opinion, sports increase this hostility rather than decrease it. Groups may also be irrational because individuals in the group put their own interest ahead of that of the group in various ways (ingratiating themselves with group leaders, refusing to speak up because it may compromise their position in the group, etc.).
(9) People are often NOT open-minded. We tend to filter and interpret according to what we want to believe, and once someone has a firm belief, presenting them with disconfirming evidence can cause them to rationalize in a way that results in the belief becoming even firmer; we're good at 'making up' explanations for things. This confirmation bias can affect our basic perception and our memory, not just our cognition, and can result in 'illusory correlation', where we falsely perceive a relationship between things. So it's important to sometimes test our beliefs by deliberately trying to falsify them, simultaneously consider multiple competing hypotheses, and consider both positive and negative cases.
(10) Even when correlations are real, it's important not to mix up cause and effect, nor to ignore 'common causes' (causes which produce the correlation between two effects, with neither effect being a cause of the other). When multiple causes contribute to an effect, we will sometimes pick out the most cognitively available one as the dominant cause.
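Point (10)'s 'common cause' can be demonstrated with a small simulation (all parameters invented): a hidden factor drives two effects, producing a strong correlation between them even though neither causes the other:

```python
import random

random.seed(2)

# A hidden common cause drives both A and B; neither causes the other.
cause = [random.gauss(0, 1) for _ in range(5_000)]
a = [c + random.gauss(0, 0.5) for c in cause]
b = [c + random.gauss(0, 0.5) for c in cause]

def correlation(x, y):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / n
    vx = sum((xi - mx) ** 2 for xi in x) / n
    vy = sum((yi - my) ** 2 for yi in y) / n
    return cov / (vx * vy) ** 0.5

print(round(correlation(a, b), 2))  # strongly positive despite no direct link
```

Observing such a correlation and concluding "A causes B" (or vice versa) is exactly the error the point warns against.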
(11) Combining implausible and plausible information tends to make the implausible information and its source appear more plausible and credible.
(12) Anecdotes and small samples are often non-representative of overall trends and can result in incorrect inferences. But beware that even large samples can be biased for various reasons (eg, self-selection bias).
(13) We often fail to recognize that many small probabilities can add up to a large probability in cumulative situations (eg, event risk vs lifetime risk), whereas the probability of an event will be reduced to the extent that a concurrence of factors is needed for the event to occur (ie, joint probability).
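Point (13) is easy to check numerically. A minimal sketch with made-up figures: a small per-event risk compounds over many independent repetitions, while an event requiring several independent factors to coincide becomes rare:

```python
def cumulative_risk(p_per_event, n_events):
    """Probability of at least one occurrence over n independent events."""
    return 1 - (1 - p_per_event) ** n_events

def joint_probability(probs):
    """Probability that several independent factors all occur together."""
    result = 1.0
    for p in probs:
        result *= p
    return result

# A 0.1% per-trip risk, repeated 1000 times, is no longer negligible:
print(round(cumulative_risk(0.001, 1000), 3))  # about 0.632
# ...while needing three independent 10% factors together is rare:
print(round(joint_probability([0.1, 0.1, 0.1]), 4))  # about 0.001
```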
(14) A 'slippery slope' can be thought of as a process of normalizing deviance.
(15) We generally tend to be more concerned with avoiding losses than making gains. It may be that losses cause more pain than gains cause pleasure. Insurance is an example where we (rationally) take a definite but smaller loss in order to prevent an uncertain but larger loss.
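Point (15)'s insurance example can be made concrete with hypothetical figures. In pure expected-value terms the premium costs more than the expected loss, yet buying cover can still be rational because a ruinous loss hurts far more than the affordable premium (compare point (29) on the declining marginal value of money):

```python
def expected_value(outcomes):
    """Expected value of a list of (probability, payoff) pairs."""
    return sum(p * v for p, v in outcomes)

# Hypothetical: a certain $500 premium vs. a 1% chance of a $40,000 loss.
uninsured = expected_value([(0.01, -40_000), (0.99, 0)])  # about -400
insured = expected_value([(1.0, -500)])                   # -500
print(uninsured, insured)
```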
(16) Continuing in an unpromising direction or refusing to cut losses may be due to sunk-cost bias. The remedy is to focus on the future, rather than the past (other than learning from the past).
(17) Offering a substantial reward for performing a task which is inherently pleasant can backfire by causing that task to become devalued. This has implications for education and employee compensation.
(18) People will tend to dislike something more if it's forced on them rather than freely chosen.
(19) Punishing children, and neglecting them when they cry, are strategies which usually backfire.
(20) We have some bias towards short-term focus, often to the detriment of our long-term interests.
(21) The way questions, situations, and options are framed can dramatically (and irrationally) influence what decisions are made. The 'power of suggestion' exploits this, and the 'anchoring bias' is an example of this.
(22) Overconfidence bias can lead us to overestimate our ability to make predictions about the future, and to overestimate our knowledge and skill in general. A related error is hindsight bias: thinking that we would have made a better prediction or decision in the past than someone else did. Hindsight bias is driven by inaccurate memories of past events, by our knowledge of actual outcomes, and by our ability to generate causal explanations (which may not be accurate), all of which can lead us away from considering alternative possible outcomes. Hindsight bias can inhibit our learning from the past, allowing mistakes to recur. 'Experts' are as prone to these biases as anyone else, and in some cases more prone.
(23) We tend to take personal credit for successes, while blaming situations for our failures. But we tend to take the opposite views with the successes and failures of others. And we tend to assume that other people are more similar to us than they actually are.
(24) Bad ergonomic design (eg, instrumentation layout) can contribute to human errors and failures. Designers need to account for the limitations and tendencies of human operators, including how operators may change their behavior in unexpected ways in response to changes in system design.
(25) Risk assessment can be extraordinarily difficult with complex systems. And risks which are insidiously 'spread out' in time and space tend to be taken less seriously than risks which can cause a concentrated loss, even though the former may represent a much higher overall risk.
(26) Our bounded rationality and need to satisfice can lead to irrationality in various ways, such as our neglecting major factors and/or overweighting minor factors when making decisions. On a broader level, we may also spend too little time on major decisions and too much time on minor decisions.
(27) 'Regression to the mean' is the tendency for extreme events to be most likely followed by events which are less extreme, rather than comparably or more extreme. It's therefore irrational to expect that extreme events will continue in the same extreme direction, or to be surprised when they don't.
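Point (27) can be demonstrated with a toy simulation (all parameters invented): when each observation is stable skill plus independent noise, the top performers on one trial tend to score closer to average on the next:

```python
import random

random.seed(0)

# Each observation = stable skill + independent noise.  Select the
# extreme performers on trial 1 and see how they do on trial 2.
skill = [random.gauss(0, 1) for _ in range(10_000)]
trial1 = [s + random.gauss(0, 1) for s in skill]
trial2 = [s + random.gauss(0, 1) for s in skill]

top = sorted(range(10_000), key=lambda i: trial1[i], reverse=True)[:100]
avg1 = sum(trial1[i] for i in top) / 100
avg2 = sum(trial2[i] for i in top) / 100
print(avg1 > avg2)  # the top group's second-trial average falls back
```

Nothing "pulls" the scores back; the extremes on trial 1 were partly lucky, and the luck doesn't repeat.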
(28) Our intuition/judgment is often less accurate than formal analysis, even though we tend to be (over)confident about our intuition/judgment. But the judgment of groups is usually (not always!) better than that of individuals, since errors tend to cancel out in groups. Boredom, fatigue, illness, and situational distractions can all compromise judgment. This issue applies to interviewing as well, so formal testing is a useful interviewing method.
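Point (28)'s claim that independent errors tend to cancel in groups can be illustrated with a made-up estimation task:

```python
import random

random.seed(1)
true_value = 100.0

# Each individual's estimate is the true value plus independent error.
estimates = [true_value + random.gauss(0, 20) for _ in range(50)]

group_average = sum(estimates) / len(estimates)
individual_error = sum(abs(e - true_value) for e in estimates) / len(estimates)
group_error = abs(group_average - true_value)
print(group_error < individual_error)  # the average beats a typical member
```

Note this only works when the errors really are independent; the conformity and groupthink of point (8) destroy exactly that independence.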
(29) The perceived marginal benefit of money tends to decrease as we have more money.
(30) We tend to not accurately predict how we will feel emotionally in situations we haven't previously experienced (eg, marriage, career change, relocation, etc.).
(31) Irrationality can be reduced by working hard to make a habit of rationality.
On the other hand, his last few chapters go into rating decisions on the basis of a numerical system (say 1 to 5). But in most cases the arguments for or against a decision have no known outcome. For example, if you think you will live to be 90, you would give the decision to have back surgery a high rating (rather than live with the pain). On the other hand, if you expect to live to only 85 at most, you would give the choice of surgery a lower rating. All you need is a memo from God to tell you when you will die.
I would not discourage you from reading the book. I had fun taking the "pieces apart." But chances are that you could spend your time more productively.