550 of 572 people found the following review helpful
Format: Hardcover | Verified Purchase
When you come late to the party, writing the 160th review, you have a certain freedom to write something as much for your own use as for other readers, confident that the review will be at the bottom of the pile.
Kahneman's thesis is that the human animal is systematically illogical. Not only do we mis-assess situations, but we do so following fairly predictable patterns. Moreover, those patterns are grounded in our primate ancestry.
The first observation, giving the book its title, is that eons of natural selection gave us the ability to react fast to a novel situation. Survival depended on it. So, if we hear an unnatural noise in the bushes, our tendency is to run. Thinking slow, applying human logic, we might reflect that it is probably Johnny coming back from the Girl Scout camp across the river bringing cookies, and that running might not be the best idea. However, fast thinking is hardwired.
The first part of the book is dedicated to a description of the two systems, the fast and the slow. Kahneman introduces them in his first chapter as system one and system two.
Chapter 2 talks about the human energy budget. Thinking is metabolically expensive; 20 percent of our energy intake goes to the brain. Moreover, despite what your teenager tells you, dedicating energy to thinking about one thing means that energy is not available for other things. Since slow thinking is expensive, the body is programmed to avoid it.
Chapter 3 expands on this notion of the lazy controller. We don't invoke our slow thinking, system two machinery unless it is needed. It is expensive. As an example, try multiplying two two-digit numbers in your head while you are running. You will inevitably slow down. NB: Kahneman uses the example of multiplying two-digit numbers in your head quite frequently. Most readers don't know how to do this. Check out "The Secrets of Mental Math" for techniques. Kahneman and I, being slightly older guys, probably like to do it just to prove we still can. Whistling past the graveyard - we know full well that mental processes slow down after 65.
Chapter 4 - the associative machine - discusses the way the brain is wired to automatically associate words with one another, concepts with one another, and a new experience with a recent experience. Think of it as the bananas vomit chapter. What will you think of next time you see a banana?
Chapter 5 - cognitive ease. We are lazy. We don't solve the right problem, we solve the easy problem.
Chapter 6 - norms, surprises, and causes. A recurrent theme in the book is that although our brains do contain a statistical algorithm, it is not very accurate. It does not understand the normal distribution. We are inclined to expect more regularity than actually exists in the world, and we have poor intuition about the tail ends of the bell curve. We have little intuition at all about non-Gaussian distributions.
Chapter 7 - a machine for jumping to conclusions. He introduces a recurrent example. A ball and bat together cost $1.10. The bat costs one dollar more than the ball. How much does the ball cost? System one, fast thinking, leaps out with an answer which is wrong. It requires slow thinking to come up with the right answer - and the instinct to distrust your intuition.
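The fast answer most people blurt out is 10 cents, which fails the constraint. A minimal sketch of the slow-thinking arithmetic (the variable names are mine, not Kahneman's):

```python
# A ball and bat together cost $1.10; the bat costs $1.00
# more than the ball. Let ball = x, so bat = x + 1.00 and
# 2x + 1.00 = 1.10, giving x = 0.05 (not the intuitive 0.10).
total = 1.10
difference = 1.00

ball = (total - difference) / 2
bat = ball + difference

print(round(ball, 2))  # 0.05
print(round(bat, 2))   # 1.05
```

Note that the intuitive answer of 10 cents would make the pair cost $1.20, which is the check that system two has to run.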
Chapter 8 - how judgments happen. Drawing parallels across domains. If Tom was as smart as he is tall, how smart would he be?
Chapter 9 - answering an easier question. Some questions have no easy answer. "How do you feel about yourself these days?" is harder to answer than "Did you have a date last week?" If the date question is asked first, it primes an answer for the harder question.
Section 2 - heuristics and biases
Chapter 10 - the law of small numbers. In the realm of statistics there is a law of large numbers. The larger the sample size, the more accurate the statistical inference from measuring them. Conversely, a small sample size can be quite biased. I was in a study abroad program with 10 women, three of them over six feet. Could I generalize about the women in the University of Maryland student body? Conversely, I was the only male among 11 students and the only one over 60. Could they generalize anything from that? In both cases, not much.
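The statistical point is easy to see in a quick simulation. This sketch (my own, not from the book) draws repeated samples from a fair coin and compares how much the sample proportion bounces around at two sample sizes:

```python
import random

random.seed(1)

def sample_proportion(n):
    """Proportion of heads in n fair coin flips."""
    return sum(random.random() < 0.5 for _ in range(n)) / n

# Draw 1,000 sample proportions at each sample size and
# compare their spread: small samples vary far more, which
# is why generalizing from 10 people is hazardous.
for n in (10, 1000):
    props = [sample_proportion(n) for _ in range(1000)]
    print(n, round(max(props) - min(props), 2))
```

With 10 flips the observed proportion of heads ranges widely; with 1,000 flips it stays close to one half.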
Chapter 11 - anchors. An irrelevant notion is a hard thing to get rid of. For instance, the asking price of a house should have nothing to do with its value, but it does greatly influence bids.
Chapter 12 - the science of availability. If examples come easily to mind, we are more inclined to believe the statistic. If I know somebody who got mugged last year, and you don't, my assessment of the rate of street crime will probably be too high, and yours perhaps too low. Newspaper headlines distort all of our thinking about the probabilities of events like terrorist attacks. Because we read about it, it is available.
Chapter 13 - availability, emotion and risk. Continuation.
Chapter 14 - Tom W's specialty. This is about the tendency for stereotypes to override statistics. If half the students in the university are education majors, and only a tenth of a percent study mortuary science, the odds are overwhelming that any individual student is an education major. Nonetheless, if you ask about Tom W, a sallow, gloomy type of guy, people will ignore the statistics and guess he is in mortuary science.
Chapter 15 - less is more. Linda is described as a very intelligent and assertive woman. What are the odds she is a business major? The odds that she is a feminist business major? Despite the mathematical impossibility, most people will think that the odds of the latter are greater than the former.
Chapter 16 - causes trump statistics. The most important aspect of this chapter is Bayesian analysis, which is so much second nature to Kahneman that he doesn't even describe it. The example he gives is a useful illustration.
* 85% of the cabs in the city are green, and 15% are blue.
* A witness identified the cab involved in a hit and run as blue.
* The court tested the witness' reliability, and the witness was able to correctly identify the correct color 80% of the time, and failed 20% of the time.
First, the point. Given these numbers, most people will assume that the cab in the accident was blue because of the witness testimony. However, if we restate the problem so that there is a 20% chance the identification of the color as blue was wrong, but 85% of the cabs involved in accidents are green, people will overwhelmingly say that the cab in the accident was green. The problems are mathematically identical, but the opinions differ.
Now the surprise. The correct answer is that there is a 41% chance that the cab involved in the accident was blue. Here's how we figure it out from Bayes theorem.
If the cab was blue, a 15% chance, and correctly identified, an 80% chance, the combined probability is .15 * .8 = .12, a 12% chance
If the cab was green, an 85% chance, and incorrectly identified, a 20% chance, the combined probability is .85 * .2 = .17, a 17% chance
Since the cab had to be either blue or green, the total probability of it being identified as blue, whether right or wrong, is .12 + .17 = .29. In other words, this witness could be expected to identify the cab as blue 29% of the time whether she was right or wrong.
The chances she was right are .12 out of .29, or 41%. I recommend that you cut and paste this, because Bayes' theorem is cited fairly often and is kind of hard to understand. It may be simple for Kahneman, but it is not for his average reader, I am sure.
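The same arithmetic can be written out directly. A small sketch, using the numbers from the review's statement of the problem:

```python
# Bayes' theorem applied to the cab problem
p_blue = 0.15     # prior: share of cabs that are blue
p_green = 0.85    # prior: share of cabs that are green
p_correct = 0.80  # witness accuracy under court testing

# Total probability the witness says "blue":
# right about a blue cab, or wrong about a green one
p_says_blue = p_blue * p_correct + p_green * (1 - p_correct)

# Posterior: P(cab was blue | witness says "blue")
p_blue_given_said_blue = (p_blue * p_correct) / p_says_blue

print(round(p_says_blue, 2))             # 0.29
print(round(p_blue_given_said_blue, 2))  # 0.41
```

The witness's 80% accuracy gets diluted by the base rate: green cabs are so common that false "blue" identifications outnumber true ones.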
Chapter 17 - regression to the mean. If I told you I got an SAT score of 750 you could assume that I was smart, or that I was lucky, or some combination. The average is only around 500. The chances are it was a little bit of both, and if I take the test a second time I will probably get a lower score, not because I am any stupider but because your first observation of me wasn't exactly accurate. This is called regression to the mean. It is not about the things you are measuring, it is about the nature of measurement instruments. Don't mistake luck for talent.
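Regression to the mean falls out of any model in which an observed score is true ability plus noise. A simulation sketch (the 500-point mean and 100-point spreads are my assumptions, loosely echoing the SAT example):

```python
import random

random.seed(0)

def observed_score(ability):
    # observed score = true ability + measurement noise
    return ability + random.gauss(0, 100)

# 10,000 test takers whose true ability averages 500
abilities = [random.gauss(500, 100) for _ in range(10_000)]
first = [(a, observed_score(a)) for a in abilities]

# Retest only those who scored 750 or better the first time
high = [(a, s) for a, s in first if s >= 750]
mean_first = sum(s for _, s in high) / len(high)
mean_retest = sum(observed_score(a) for a, _ in high) / len(high)

# The retest mean drops back toward 500, with no change
# in anyone's underlying ability
print(round(mean_first), round(mean_retest))
```

The high scorers were, on average, both able and lucky; on the retest the luck washes out but the ability remains, so the group's mean falls without anyone getting stupider.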
Chapter 18 - taming intuitive predictions. The probability of the occurrence of an event which depends on a number of prior events is the cumulative probability of all those prior events. The probability of a smart grade school kid becoming a Rhodes scholar is a cumulative probability of passing a whole series of hurdles: studying hard, excelling in high school, avoiding drink and drugs, parental support and so on. The message in this chapter is that we tend to overestimate our ability to project the future.
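The multiplication of hurdle probabilities is simple but sobering. A sketch with invented per-hurdle numbers (mine, purely for illustration):

```python
# Hypothetical per-hurdle probabilities, for illustration only
hurdles = {
    "studies hard":           0.5,
    "excels in high school":  0.3,
    "avoids drink and drugs": 0.7,
    "has parental support":   0.6,
    "wins the scholarship":   0.01,
}

p = 1.0
for p_step in hurdles.values():
    p *= p_step  # every stage has to succeed

# The joint probability is far smaller than any single step
print(round(p, 5))  # 0.00063
```

Even with generous odds at each stage, the chain multiplies down to a fraction of a percent, which is why confident predictions of a kid's distant future are almost always overconfident.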
Part three - overconfidence
Chapter 19 - the illusion of understanding. Kahneman introduces another potent concept, "what you see is all there is," hereinafter WYSIATI. We make judgments on the basis of the knowledge we have, and we are overconfident about the predictive value of that observation. To repeat his example, we see the tremendous success of Google. We discount the many perils which could have totally derailed the company along the way, including the venture capitalist who could have bought it all for one million dollars but thought the price was too steep.
Chapter 20 - The illusion of validity. Kahneman once again anticipates a bit more statistical knowledge than his readers are likely to have. The validity of a measure is the degree to which an instrument measures what it purports to measure. You could ask a question such as whether the SAT is a valid measure of intelligence. The answer is, not really, because performance on the SAT depends quite a bit on prior education and previous exposure to standardized tests. You could ask whether the SAT is a valid predictor of performance in college. The answer there is that it is not very good, but nonetheless it is the best available predictor. It is valid enough because there is nothing better. To get back to the point, we are inclined to assume measurements are more valid than they are, in other words, to overestimate our ability to predict based on measurements.
Chapter 21 - intuitions versus formulas. The key anecdote here is about a formula for predicting the quality of a French wine vintage. The rule-of-thumb formula beat the best French wine experts. Likewise, mathematical algorithms for predicting college success are at least as successful as, and much cheaper than, long interviews with placement specialists.
Chapter 22 - expert intuition, when can we trust it? The short answer to this is, in situations in which prior experience is quite germane to new situations and there is some degree of predictability, and also an environment which provides feedback so that the experts can validate their predictions. He would trust the expert intuition of a firefighter; there is some similarity among fires, and the firefighter learns quickly from his mistakes. He would not trust the intuition of a psychiatrist, whose mistakes may not show up for years.
Chapter 23 - the outside view. The key notion here is that people within an institution, project, or any endeavor tend to let their inside knowledge blind them to things an outsider might see. We can be sure that most insiders in Enron foresaw nothing but success. An outsider, having seen more cases of off-balance-sheet accounting and the woes it can cause, would have had a different prediction.
Chapter 24 - the engine of capitalism. This is a tour of decision-making within the capitalist citadel. It should destroy the notion that there are CEOs who are vastly above average, and also the efficient markets theory. Nope. The guys in charge often don't understand, and more important, they are blind to their own lack of knowledge.
Part four - choices
This is a series of chapters about how people make decisions involving money and risk. In most of the examples presented there is a financially optimal alternative. Many people will not find that alternative because of the way the problem is cast and because of the exogenous factors. Those factors include:
Marginal utility. Another thousand dollars is much less important to a millionaire than to a wage slave.
Chapter 26 - Prospect theory: The bias against loss. Losing $1000 causes pain out of proportion to the pleasure of winning $1000.
Chapter 27 - The endowment effect. I will not pay as much to acquire something as I would demand if I already owned it and were selling.
Chapter 28 - Bad Events. We will take unreasonable risk when all the alternatives are bad. Pouring good money after bad, the sunk cost effect, is an example.
Chapter 29 - The fourfold pattern. High risk, low risk, win, lose. Human nature is to make choices which are not mathematically optimal: buying lottery tickets and buying unnecessary insurance.
Chapter 30 - rare events. Our minds are not structured to assess the likelihood of rare events. We overestimate the visible ones, such as tsunamis and terrorist attacks, and ignore the ones of which we are unaware.
Chapter 31 - Risk policies. This is about systematizing our acceptance of risk and making policies. As a policy, should we buy insurance or not, recognizing that there are instances in which we may override the policy. As a policy, should we accept the supposedly lower risk of buying mutual funds, even given the management fees?
Chapter 32 - keeping score. This is about letting the past influence present decisions. The classic example is people who refuse to sell for a loss, whether shares of stock or a house.
Chapter 33 - reversals. We can let a small negative outweigh a large positive. One cockroach in a crate of strawberries.
Chapter 34 - Frames and reality. How we state it. 90% survival is more attractive than 10% mortality.
Part V. Two selves: Experience and memory
Our memory may be at odds with our experience at the time. Mountain climbing or marathon running are sheer torture at the time, but the memories are exquisite. We remember episodes such as childbirth by the extreme of pain, not the duration.
Life decision: do we live life for the present experience, or the anticipated memories? Are we hedonists, or Japanese/German tourists photographing everything to better enjoy the memories?
939 of 989 people found the following review helpful
on November 17, 2011
Format: Hardcover | Verified Purchase
Back in 1994, Massimo Piattelli-Palmarini, Director of the Institute of San Raffaele in Milan, Italy, wrote a charming little book about common cognitive distortions called Inevitable Illusions. It is probably the very first comprehensive summary of behavioral economics intended for a general audience. In it, he predicted that the two psychologists behind behavioral economics - Amos Tversky and Daniel Kahneman - would win the Nobel prize. I didn't disagree with the sentiment, but wondered how in the world they were going to get it, since the two were psychologists and there is no Nobel prize in psychology. I didn't think there was much chance of them winning the Nobel Prize in economics. I was wrong and Piattelli-Palmarini was right. Kahneman won the Nobel prize in Economic Sciences. (Tversky, unfortunately, had passed away prematurely by then.) Just as Steve Jobs, who was not in the music industry, revolutionized it, the non-economists Kahneman and Tversky have revolutionized economic thinking. I have known Kahneman's work for quite some time and was quite excited to see that he was coming out with a non-technical version of his research. My expectations for the book were high and I wasn't disappointed.
Since other reviewers have given an excellent summary of the book, I will be brief in my summary but review the book more broadly.
The basic thesis of the book is simple. In judging the world around us, we use two mental systems: Fast and Slow. The Fast system (System 1) is mostly unconscious and makes snap judgments based on our past experiences and emotions. When we use this system we are as likely to be wrong as right. The Slow system (System 2) is rational, conscious and slow. They work together to provide us a view of the world around us.
So what's the problem? They are incompatible, that's what.
System 1 is fast, but easily swayed by emotions, and can as easily be wrong as right. You buy more cans of soup when the display says "Limit 12 per customer". We are on autopilot with this system. System 1 controls an amazing array of behavior. System 2 is conscious, rational and careful but painfully slow. It's distracted and hard to engage. These two systems together provide a backdrop for our cognitive biases and achievements.
This very well written book will enlighten and entertain the reader, especially if the reader is not exposed to the full range of research relating to behavioral economics.
This book serves as an antidote to Malcolm Gladwell's Blink. Although Gladwell never says that snap judgments are infallible and cannot badly mislead us, many readers got a different message. As the Royal Statistical Society's Significance magazine put it, "Although Gladwell's chronicle of cognition shows how quick thinking can lead us both astray and aright, for many readers Blink has become a hymn to the hunch." While Kahneman does show how "fast thinking" can lead to sound judgments, he also notes how it can lead us astray. This point is made much more clearly and deliberately in Kahneman's book.
All my admiration for the brilliance and creativity of Kahneman (and Tversky) does not mean that I accept 100% of their thesis. Consider this oft-quoted study. Linda is 31 years old, single, outspoken, and very bright. As a student, she was deeply concerned with the issues of discrimination and social justice, and she also participated in anti-nuclear demonstrations. Which is more probable?
1. Linda is a bank teller.
2. Linda is a bank teller and is active in the feminist movement.
Eighty-five percent of test subjects chose the second option, that Linda was a bank teller and active in the feminist movement. Kahneman's interpretation is that this opinion is wrong because the probability of a (random) woman being a bank teller is greater than that person's being a bank teller AND a feminist. What Kahneman overlooks here is that what most people answered may not be the question that was asked. The respondents may not have been concerned with mathematical probabilities, but rather could be responding to the question in reverse: Is it more likely for a current activist to have been an activist in the past compared to others in the profession? A more formal and theoretically better argued rebuttal of some of Kahneman's hypotheses can be found in the works of Gerd Gigerenzer.
Kahneman notes that even top performers in business and sports tend to revert to the mean in the long run. As a result, he attributes success largely to luck. I'm not so convinced of this. There can be alternative explanations. People who achieve a high degree of success are also exposed to a high degree of failure, and the reversion to the mean may be attributable to this possible mirror effect. Spectacular success may go with spectacular failure, and run-of-the-mill success may go with run-of-the-mill failure. Eventually everyone may revert to the mean, but the ride can be very different. Chance may not account for that.
Another concern is that much of the work was done in artificial settings (read: college students). While much of what we have learnt can perhaps be extended to the real world, it is doubtful that every generalization will work in practice. Some may find Kahneman's endorsement of "libertarian paternalism" unacceptable. More importantly, when applied to the real world it has not always been found to work.
In spite of these comments, this book is written carefully and in a rather humble tone. I also appreciated Kahneman's generous and unreserved acknowledgement of Tversky's contributions and his conviction that, had he been alive, Tversky would have been the co-recipient of the Nobel Prize. My cautionary comments have more to do with the distortions that might arise when readers uncritically generalize the findings to contexts to which they may not be applicable. As mentioned earlier, the wide misinterpretation of Gladwell's Blink comes to mind.
Nevertheless, Thinking Fast and Slow is a very valuable book by one of the most creative minds in psychology. Highly recommended. For a more complete and critical understanding, I also recommend the writings of the critics of behavioral economic models such as Gerd Gigerenzer.
PS. After I published this review, I noticed an odd coincidence between Thinking Fast and Slow and Inevitable Illusions that I mentioned in my opening paragraph. Both books have white covers, with an image of a sharpened yellow pencil with an eraser top. How odd is that?
2,603 of 2,893 people found the following review helpful
on November 20, 2011
Format: Kindle Edition
The Kindle version of this excellent book is disappointing. Several features of the book are confusing in the ebook because the formatting is so poor. Tables with two columns run together because they are not boxed and the columns are separated by only one space. There are questions at the end of each chapter whose purpose is unclear until you see them in the real book, where they are set off in a box with a different typeface. Most disappointing is the handling of the footnotes - they are relegated to the back of the book with no page number reference. There is a short phrase in each note that corresponds to the place in the text to which it refers, but it is up to the reader to scan the chapter to find the reference. The book reads like a mechanical translation of the physical book into a new format, with no effort made to edit and format appropriately. So the reader loses. With the price of the ebook almost as much as the real book, you will be happier if you buy the real thing.
687 of 769 people found the following review helpful
on October 25, 2011
Format: Hardcover | Verified Purchase
Daniel Kahneman may have won his Nobel Prize in Economic Sciences, but his work was psychological in nature as it challenged the rational model of judgment and decision-making. He's considered one of the most important psychologists alive today, and this book doesn't disappoint with its breakthrough approach to understanding the "machinery of the mind."
Kahneman introduces two mental systems, one that is fast and the other slow. Together they shape our impressions of the world around us and help us make choices. System 1 is largely unconscious and it makes snap judgments based upon our memory of similar events and our emotions. System 2 is painfully slow, and is the process by which we consciously check the facts and think carefully and rationally. Problem is, System 2 is easily distracted and hard to engage, and System 1 is wrong as often as it is right. System 1 is easily swayed by our emotions. Examples he cites include the fact that pro golfers are more accurate when putting for par than they are for birdie (regardless of distance), and people buy more cans of soup when there's a sign on the display that says "Limit 12 per customer."
There are lots of interesting anecdotes as well as layman's summaries of psychological research that will leave you feeling fascinated by the brain. The book has 38 chapters broken into five sections. I've listed some of the chapter titles for each section to give you a feel for what it's about:
PART ONE - TWO SYSTEMS
1. The Characters of the Story
2. Attention and Effort
3. The Lazy Controller
4. A Machine for Jumping to Conclusions
5. How Judgments Happen
PART TWO - HEURISTICS AND BIASES
6. The Law of Small Numbers
7. Availability, Emotion, and Risk
8. Tom W's Specialty
9. Linda: Less is More
10. Causes Trump Statistics
11. Taming Intuitive Predictions
PART THREE - OVERCONFIDENCE
12. The Illusion of Understanding
13. The Illusion of Validity
14. Intuitions Vs. Formulas
15. Expert Intuition: When Can We Trust It?
PART FOUR - CHOICES
16. Prospect Theory
17. Bad Events
18. Risk Policies
19. Keeping Score
PART FIVE - TWO SELVES
20. Life as a Story
21. Experienced Well-Being
65 of 69 people found the following review helpful
on November 13, 2014
Format: Kindle Edition
I have certainly enjoyed my experience with both of these books and I'm sure anyone else will, too.
145 of 167 people found the following review helpful
Format: Hardcover | Verified Purchase
As one who was brought up with Herbert Simon and "satisficing," I have mixed feelings about this book. As an intelligence professional I know for a fact that corrupt politicians have zero interest in the facts, only in what will profit them personally in the short term. As much as I would like to see integrity restored as the core value of government, economy, and society, in the larger context in which we live this book is a curiosity.
There are gems and it is certainly worth reading, but as one other reviewer points out, it is not the easiest reading nor the most delightful. Here is what I got out of it (my summary notes, I donate all books right after I read them, to a nearby university).
For those instances when BOTH intelligence (decision-support) officers and their clients (politicians, policy makers, acquisition managers, operational commanders) have integrity--a condition that does not exist today--this book is very useful as a training aid.
01 It strives to provide a deeper understanding of judgments and choices by humans.
02 It fully documents the biases of intuition (judgment informed by past cases)
03 It documents the fact that under uncertainty humans are too prone to believe findings based on inadequate evidence, and too prone to skip collecting a sufficient body of observations or research findings by others.
The essence of the book is the author's distinction between System 1 and System 2.
System 1 is automatic, fast, and falls prey to illusions.
System 2 is controlled, slow, requires attention, and is easily distracted.
Conclusions about judgment heuristics (rules of thumb):
HARD to think statistically
EASY to think associatively
We have EXCESS CONFIDENCE in what we think we know, and a deep, deep, deep inability to acknowledge our ignorance.
Humans DEVIATE from the rational model with two major CORRUPTIONS:
01 Treat problems in isolation instead of as part of a systemic whole
02 Treat problems in relation to framing effects that distort perceptions with inconsequential trivia
QUOTE (34): "We found that people, when engaged in a mental sprint, may become effectively blind."
That one sentence made the book worthwhile to me. I have long been a fan of Red Cells and walk-abouts and other forms of being forced to engage outside the box, and this one sentence reminded me of my now firm view that all analytic teams need an independent Yoda to challenge them.
The author surprises me with a substantive discourse on how money has caused people to collaborate less--it makes them more independent of one another and displaces the social value of collaboration.
I am fascinated by the author's focus on surprise as a litmus test for the extent to which we are open.
He emphasizes that hypotheses should be tested by trying to REFUTE them rather than by searching for additional supporting evidence. Having the hypothesis is enough. If it cannot be refuted, THAT is worth much more than a documented but not seriously challenged hypothesis.
QUOTE (117): The tendency to see patterns in randomness is overwhelming.
This is in the context of the anchoring effect, and the stark, strong impact of preconceived notions that shape perception of the inconclusive into the conclusive, the contradictory into the confirming.
The discussion of risk, for an intelligence professional, is very, very interesting. The author focuses on how critical it is to actually have a measure of risk--to know with some precision what you are defining as risk, and why.
I am blown away by a discussion that makes it clear that praise or punishment are generally irrelevant for professionals. They tend to do the best they can, and the odds are such that praise or punishment have no effect but appear to have effect because they zig zag along a mean. "It is what it is." I connect this with the National Football League, and how calm most team players are when they miss a catch or a block. "It is what it is."
I'm not sure this is the book I would want for advanced intelligence courses, but I really like chapter 19 on the illusion of understanding and chapter 20 on the illusion of validity. Even if our political officials are corrupt, we intelligence professionals should at least strive to get it right.
The author is a believer in algorithms over experts. I have mixed feelings about that. Certainly I agree that most experts are wrong and much narrower in their understanding than common competence requires, but I am also very skeptical of algorithms, witness Google's math hacks against digital garbage. As a believer in collective human intelligence (citizen wisdom councils, etcetera), I accept the importance of taking algorithms as far as they can go, but algorithms are no better than the humans who constructed them and the data known to the humans at that time.
I give the author great credit for providing a superb overview across the book of stars in this field. This is not a selfish or self-centered book--it appears to do full justice to all others.
Great thoughts in this book:
01 Capitalists--both inventors and entrepreneurs--overestimate their success rate by two times.
02 Illusion of control is increased by a failure to seek out data from others [this is one reason I champion M4IS2--Multinational, Multiagency, Multidisciplinary, Multidomain Information-Sharing and Sense-Making--and public intelligence in the public interest].
QUOTE (262): Organizations that take the word of overconfident experts can expect costly consequences.
NEW TO ME: Psycho-physics--relation of mind and matter. This may be a different way of talking about quantum physics, but it is one more indication that mind-matter interfaces are going to be a huge area of study in the future.
The author confirms Machiavelli 101--defenders of the status quo are always stronger than reformers seeking change [he does not say this but Kuhn and others do: UNTIL the status quo self-destructs from its own corruption, and the reformers are free to build on its ashes].
On page 411 he provides a very serious critique of libertarianism, pointing out that libertarians assume all individuals are rational and see no value in aggregate services.
QUOTE (417): Observers are less cognitively busy and more open to information than actors.
The more I think about this, the more I think that we need a new class of intelligence professionals who are neither collectors nor analysts, but observers "in situ" with decision-makers or "in situ" with crisis situations, and they provide the "third eye". I am writing the chapter on "The Craft of Intelligence" for the next Routledge Handbook of Intelligence Studies, and this is one new idea that I credit to this book and author, that I plan to integrate into my thinking about the future of intelligence as a discipline.
There are two appendices and an excellent index.
As is my custom, I always use Amazon's link feature to point to other related books. Here are ten in the decision-making arena that I consider especially valuable.
Radical Man: The Process of Psycho-Social Development
The Knowledge Executive
Tools for Thought: The History and Future of Mind-Expanding Technology
Thinking in Time: The Uses of History for Decision-Makers
Planning with Complexity: An Introduction to Collaborative Rationality for Public Policy
Business War Games: How Large, Small, and New Companies Can Vastly Improve Their Strategies and Outmaneuver the Competition
Reflexive Practice: Professional Thinking for a Turbulent World
Open Space Technology: A User's Guide
Atlas of Science: Visualizing What We Know
Holistic Darwinism: Synergy, Cybernetics, and the Bioeconomics of Evolution
126 of 149 people found the following review helpful
on October 25, 2011
Behavioral Economics is perhaps the most popular genre of non-fiction in the last decade. With bestsellers by the likes of Malcolm Gladwell, Steven D. Levitt, Dan Ariely, Richard Thaler, Tim Harford, and a number of other qualified journalists and academics, it seems as though the field contains an infinite wealth of fascinating material. And it could be said that all of this is due in large part to the work of Daniel Kahneman.
As a part of the pioneering team with Amos Tversky, Kahneman has practically shaped Behavioral Economics since the 1960s, when they began conducting experiments. This book brings together all of Kahneman's findings in one coherent study.
Since Kahneman's work has been so influential, a lot of the ideas presented here might not be new. Cognitive biases such as loss aversion, priming, and framing have all been presented and analyzed in nearly every Behavioral Economics book out there. But, while the ideas are not novel, it is rewarding to hear analysis from the original source of the studies. Kahneman provides insights into the rationale of the studies that other writers could not offer, and so this book seems more penetrating than the others. Where his successors string together pieces of interesting yet seemingly incoherent tidbits about cognition and behavior, Kahneman proposes a much more developed thesis on human cognition.
That thesis is summarized by the title--that there are two ways humans think and make decisions, "fast" and "slow," and that we cannot disregard either when considering people's thoughts and actions. The two ways can be described by a number of dichotomies: The first method of thinking is automatic, the second is controlled; the first is effortless, the second effortful; and so on. An easier way to describe the two systems would be to identify them as subconscious and conscious, though Kahneman does not explicitly make this description, perhaps because these concepts are so loaded with meaning.
Kahneman examines this concept by delving into the latest studies in the field and thus provides the avid Behavioral Economics reader a source of great new instances of it. The survey of cognitive errors includes research on the strange tendencies of golfers under stressful situations, parole judges after lunch, and shoppers under the influence of marketing ploys. As in nearly all Behavioral Economics books, this material is absolutely fascinating and doesn't ever seem to lose its mystery.
Of course, despite being so fascinating, Behavioral Economics as a discipline has its flaws, and this book is no exception. In general, the flaws have to do with the fact that the studies assumed to prove various cognitive errors are rather abstract by nature and so rely on a number of qualifications to even be useful. In order to analyze behavior, for instance, there need to be objective standards for "right" and "wrong" actions, which I'm not sure have been investigated as thoroughly as they should be. It is for this reason that far-reaching claims about human behavior being irrational, and the subsequent calls for changes in social structure (by whatever means), are typically unfounded and lead down a dangerous road of regulation and control (Thaler's Nudge and Ariely's Predictably Irrational come to mind). Kahneman does not jump to these conclusions, and certainly does not propose policy action a la Thaler or Ariely, but he does lay the groundwork for such ventures--after all, he is the pioneer.
Altogether, though, this book largely avoids this inherent flaw, and rather simply encourages more study and debate. And that might make it a classic in the field.
143 of 173 people found the following review helpful
on November 27, 2011
Format: HardcoverVerified Purchase
I was privileged to have Daniel Kahneman for many years as a member of my research group "MacArthur Network on the Origin and Nature of Norms and Preferences," where I came to appreciate his dedication, intelligence, and friendship. At last Kahneman has written a book for the public, and it is already a widely discussed best seller. I am convinced that the contributions of Kahneman and his coauthor Amos Tversky are fundamental and lasting, but I am concerned that his work not be seen as conveying the general message that "people are irrational."
I was motivated to write these hasty comments by the review of the book by Jim Holt in the New York Times Book Review (11/27/2011), p. 16ff. Holt writes "Although Kahneman draws only modest policy implications... others... go much further. [David Brooks, NY Times editorial writer], for example, has argued that Kahneman and Tversky's work illustrates `the limits of social policy'; in particular, the folly of government action to fight joblessness and turn the economy around." Of course, nothing of the sort follows from Kahneman and Tversky's work.
I know Kahneman and Tversky's work (hereafter KT) very well, but I am going through this book slowly, so I will add to my comments as they accumulate.
Psychologists have known for many years that humans make systematic visual mistakes, called "optical illusions." This does not elicit the general pronouncement that humans systematically err in their visual judgments. The same should apply to KT's results: the results are correct, but they should not be sloppily interpreted as saying that people are generally illogical and error-prone decision-makers.
First main point: I believe KT's work does not at all suggest that people are poor at making logical inferences. The experiments which might suggest this are generally misinterpreted. A particularly pointed example is the famous Linda the Bank Teller problem, first analyzed in Amos Tversky and Daniel Kahneman, "Extensional versus Intuitive Reasoning: The Conjunction Fallacy in Probability Judgment", Psychological Review 90 (1983):293-315. Subjects are given the following description of a hypothetical person named Linda: "Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice and also participated in antinuclear demonstrations." The subjects were then asked to rank-order eight statements about Linda according to their probabilities. The statements included the following two: "Linda is a bank teller" and "Linda is a bank teller and is active in the feminist movement."
More than 80% of the subjects---graduate and medical school students with statistical training and doctoral students in the decision science program at Stanford University's business school---ranked the second statement as more probable than the first. This seems like a simple logical error, because every feminist bank teller is also a bank teller. However, there is another interpretation under which the subjects are correct in their judgments. Let p and q be properties that every member of a population either has or does not have. The standard definition of "the probability that member x is p" is the fraction of the population for which p is true. But an equally reasonable definition is the probability that x is a member of a random sample of the subset of the population for which p is true. According to the standard definition, the probability of p and q cannot be greater than the probability of p. But, according to the second, the opposite inequality can hold: x might be more likely to appear in a random sample of individuals who are both p and q than in a random sample of the same size of individuals who are p.
In other words, the probability that a randomly chosen bank teller is Linda is probably much lower than the probability that a randomly chosen feminist bank teller is Linda. Another way of expressing this point is that the probability that a randomly chosen member of the set of feminist bank tellers is Linda is greater than the probability that a randomly chosen member of the set of bank tellers is Linda.
I believe my interpretation is by far the more natural. Moreover, why would the experimenters have included information about Linda's college behavior unless it were relevant? This behavior is completely irrelevant given KT's interpretation of probability, but wholly pertinent given a "conditional probability" interpretation. The latter can be colloquially restated as "the conditional probability that an individual is Linda given that she is a feminist bank teller is higher than the conditional probability that an individual is Linda given that she is a bank teller."
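The reviewer's two readings of "probability" can be made concrete with a toy calculation. All of the population counts below are invented purely for illustration; the point is only the direction of the two inequalities:

```python
# Toy illustration of the two readings of the Linda problem.
# All population counts here are made up for the sake of the example.

def p_is_linda(set_size):
    """Probability that a uniformly random draw from a set containing
    Linda turns out to be Linda: 1 / |set|."""
    return 1.0 / set_size

population = 100000           # hypothetical total population
bank_tellers = 1000           # hypothetical number of bank tellers
feminist_bank_tellers = 50    # hypothetical subset who are also feminists

# Standard reading: P(teller and feminist) can never exceed P(teller).
p_teller = bank_tellers / population
p_feminist_teller = feminist_bank_tellers / population
assert p_feminist_teller <= p_teller

# Reviewer's reading: the smaller, more specific set is MORE likely to
# yield Linda on a random draw, so the ordering reverses.
assert p_is_linda(feminist_bank_tellers) > p_is_linda(bank_tellers)
```

Under the second reading, the subjects' ranking is not a conjunction fallacy at all but a sensible judgment about which description better picks out someone like Linda.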
Second Main Point: Many of the examples of irrationality given by KT are not in any way irrational. Consider for example the chief investment officer described by Kahneman (p. 12). This man invested tens of millions of dollars in Ford Motor Company stock after having visited an automobile show and having been impressed with the quality of the current offering of Ford vehicles. Kahneman says, "I found it remarkable that he had apparently not considered the one question that an economist would call relevant: Is Ford stock currently underpriced?" In fact, there is no objective measure of a stock being "underpriced," and no known correlation between a measure of "being underpriced" and subsequent performance on the stock market. Moreover, the executive may not have revealed all of the reasoning involved in his decision, but rather only a "deciding factor" after other considerations had been factored in.
Third Main Point: We have long known that people do not generally act in their own best interest. We have weakness of will, we procrastinate, we punish ourselves for things that are not our fault, we act thoughtlessly and regret our actions, yet repeat them, we become addicted to cigarettes and drugs, we become obese even though we would like to be thin, we pay billions of dollars for self-help books that almost never work. KT have not added much, if anything, to our understanding of this array of bizarre behaviors. Of course, they do not claim otherwise. However, commentators regularly claim that this behavior somehow contradicts the "rational actor model" of economic theory, which it does not in any way. Economic theory explores the implications of human choice behavior without claiming that the choices people make are in some sense prudent or even desirable to the decision-maker (we cannot choose our preferences).
Economic theory is in general supportive of the notion that people should get what they want, but has included the notion of "merit goods" that society values or disvalues for moral or practical reasons that counterindicate consumer sovereignty. For instance, we regulate pharmaceuticals, we prohibit racial discrimination in public places, and we outlaw markets in body parts.
Fourth Main Point: KT are right on target in asserting that people make massive errors in interpreting statistical arguments (e.g., the base rate fallacy, or the interpretation of conditional probabilities). This has nothing to do with "illogicality" or "irrationality," but rather with the complexity of the mathematics itself. For instance, KT have shown that physicians routinely fail to understand what the statistical accuracy of lab tests means---the fact that a test is 95% accurate is compatible with its positive results being wrong 95% of the time (or any other figure, depending on the incidence of the condition being tested for).
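The lab-test point is a one-line application of Bayes' rule, and it is worth seeing the arithmetic. The incidence and accuracy figures below are illustrative, not taken from any particular study:

```python
# Sketch of the base-rate effect described above: a "95% accurate" test
# can be wrong most of the time when it comes back positive, if the
# condition is rare. All figures here are illustrative.

def p_disease_given_positive(incidence, sensitivity, specificity):
    """Bayes' rule: P(disease | positive test result)."""
    p_pos_and_disease = incidence * sensitivity
    p_pos_and_healthy = (1 - incidence) * (1 - specificity)
    return p_pos_and_disease / (p_pos_and_disease + p_pos_and_healthy)

# A test that is 95% sensitive and 95% specific, for a condition that
# 1 in 1000 people actually have:
ppv = p_disease_given_positive(incidence=0.001, sensitivity=0.95, specificity=0.95)
print(round(ppv, 3))  # -> 0.019
```

So under these assumptions a positive result indicates disease only about 2% of the time; the other 98% of positives come from the vastly larger healthy population. No failure of logic is needed to get this wrong---only unfamiliarity with conditional probability.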
The psychologist Gerd Gigerenzer has shown that if conditional probabilities are reinterpreted as frequencies, people have no problem interpreting their meaning (see the discussion "Risk School" in Nature 461, 29 October 2009). Gigerenzer has been promoting the idea that trigonometry be dropped from the high school math sequence (no one uses it except surveyors, physicists, and engineers) and probability theory be added. This sounds like a great idea to me.
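Gigerenzer's "natural frequency" reformulation can be sketched with the same kind of rare-condition numbers as above (again purely illustrative): instead of conditional probabilities, state everything as counts of people.

```python
# Natural-frequency restatement of a rare-condition test, in the spirit of
# Gigerenzer. The incidence and accuracy figures are illustrative.

population = 1000
sick = 1                      # 1 in 1000 has the condition
healthy = population - sick   # 999 do not

true_positives = round(sick * 0.95)        # about 1 sick person tests positive
false_positives = round(healthy * 0.05)    # about 50 healthy people test positive

# Of everyone who tests positive, how many are actually sick?
print(true_positives, "out of", true_positives + false_positives)  # -> 1 out of 51
```

Stated this way---"of 51 people who test positive, only 1 is sick"---the answer is nearly self-evident, whereas the equivalent conditional-probability phrasing routinely defeats trained professionals.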
Of course, if people do not do well at formal statistical analysis, how are we to defend the rational actor model, which is thoroughly Bayesian and implicitly assumes people are infinitely capable statistical decision-makers? The answer is not to abandon the rational actor model, which in general has had exceptional explanatory power---see my book, The Bounds of Reason (Princeton 2009) and my review of Ken Binmore's Rational Decisions (Economic Journal, February 2010). Rather, I believe the answer lies in replacing the subjective prior assumption of the rational actor model with a broader assumption that individuals make decisions within networks of minds characterized by distributed cognition, much as social insects do---except of course on a much higher level, using language instead of pheromones, with the cultural construction of iconic rather than pheromonal signals. But that is a subject to be explored in the future. One of the payoffs of KT's research is to make clear how insufficient the standard economic model of decision-making under (radical) uncertainty really is.
67 of 80 people found the following review helpful
on April 22, 2012
Let me start by declaring a bias: I love pop-science books. I'm fairly happy with the hard core scientific tomes too, but pop-science brings together two of my passions - science and literature, and hence I tend to view most such books through a favourable lens.
Fast & Slow started off fairly well, and Kahneman's writing felt lucid and thoughtful. The introductory chapters held the promise of a book that would develop well, give me lots to think about, and provide some useful guidance in my everyday life. About halfway through the book, however, against my strong desire to go on, I gave up, and here's why: the book was going nowhere. It wasn't that it stopped being insightful - it was just that it seemed more like a collection of interesting examples than a connected, coherent development. Individually, the chapters were not unpleasant, and in fact contained a lot of knowledge and insight that I was unfamiliar with and found engrossing, but my objection is to putting all these chapters together as a book. Most books, outside of the text/reference variety, effectively become part of the greater body of literature, and should be viewed through that lens. I don't believe that books can be measured by different yardsticks on account of being clever, or being written by somebody famous or important, and consequently, I believe this book falls short.
So, despite my bias towards this genre, my recommendation would be to read it as a background book, or better still, as a series of articles over a long-ish period of time. You might save yourself some of the disappointment that way and be able to enjoy and complete this book.
49 of 58 people found the following review helpful
on February 9, 2012
An interesting book, but repetitious unless you are a real fan of psychology. Read the first 50 pages and ask yourself what you learned. Read the next 50 pages and ask yourself how much more you learned. The increment diminishes.