Amazon.com: Customer Reviews: How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life




on March 23, 2005
Mr. Gilovich says ". . . there are inherent biases in the data upon which we base our beliefs, biases that must be recognized and overcome if we are to arrive at sound judgments and valid beliefs." The cost of these biases is real and severe. This book explains why people are prone to wrong thinking, and ways they can counteract this.

Here are points that Mr. Gilovich made:

1. Seeing Order in Randomness - We all have a natural tendency to see order in data, even when the data is totally random and irregular. We do this even when we have no personal reason to see order. This happens especially when we remember facts from the past. Our memory plays tricks on us by emphasizing any possible patterns, and forgetting irregularities that might refute the patterns. For instance, basketball players often think that if they make one successful basket, then they are more likely to make the next basket - because they remember times when this has happened to them. "When you're hot, you're hot." However, objective statistical studies done on when successful baskets are made show that, if anything, the opposite is true.

This natural tendency to misconstrue random events is called the "clustering illusion." Chance events often seem to us to have some order to them, but when the law of averages is applied objectively, this order disappears. This error is compounded when our active imagination tries to create theories for why there should be order. Because of this, we need to be careful when we draw conclusions based on a sequence we think we see in some data.
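A quick illustration of this point (my own sketch, not from the book): the short Python simulation below creates a purely random "50% shooter" and compares the make rate after a made shot with the make rate after a miss. The shooter, the 50% figure, and the sample size are all assumptions of mine; the two rates come out essentially identical, which is the same pattern the basketball studies found in real shooting data.

    import random

    # A purely random "50% shooter": every shot is an independent coin flip,
    # so by construction there is no real hot hand.
    random.seed(42)
    shots = [random.random() < 0.5 for _ in range(100_000)]

    after_make = [curr for prev, curr in zip(shots, shots[1:]) if prev]
    after_miss = [curr for prev, curr in zip(shots, shots[1:]) if not prev]

    print("P(make | previous make):", sum(after_make) / len(after_make))
    print("P(make | previous miss):", sum(after_miss) / len(after_miss))
    # Both rates hover near 0.5, yet the same sequence still contains
    # streaks long enough to look like "hot" spells to an observer.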

2. Looking for Confirmation - We all have a natural tendency to look for "yes" instead of "no." If we have an idea, we tend to look for evidence that will confirm our idea, not evidence that will disprove it. This is true even when we have no personal attachment to the idea.

Some researchers believe this tendency results from our need to take an extra neurological step when we try to understand negative or disconfirming evidence, in contrast to positive or confirming evidence. To understand a negative proposition, we may need to translate it into a positive one. Therefore, we subconsciously look for easy positives instead of more difficult negatives. This does not promote objectivity and good science. If we want to do good science, then we need to force ourselves to look for negative evidence that contradicts our ideas.

3. Hidden Data - When we search for evidence, often there is data that we unintentionally overlook. For instance, if we receive a bad impression about a person from the beginning, we may avoid them, and by avoiding them, they may never have a chance to show us the better side of their personality. But if we receive a good impression, we may get to know that person better, and thereby gather more positive data, and falsely confirm in our mind that first impressions are reliable. The way we collect data may filter out important categories of data, and this may cause us to confirm our wrong ideas. We need to avoid search strategies that show us only a distorted side of an issue.

4. Mental Corner-Cutting - We all cut corners with our mind. We often use mental strategies - inductive generalizations, etc. - to understand the world around us more quickly and easily. These strategies are very useful. But they come at a cost. These corner-cutting strategies can cause systematic errors or blind spots in our thinking. We need to be aware when we have not been thorough; therefore, we need to look out for signals that we are drawing a wrong conclusion.

5. Objectivity is Not Always Useful - We shouldn't expect everyone to reevaluate their beliefs every time a new piece of evidence comes along. "Well-supported beliefs and theories have earned a bit of inertia. . ." However, we should draw a distinction between a belief that is well supported by evidence over time, and a belief that only has traditional or popular support. Some scientists believe the complex mental processes that give us biases and preconceived notions are some of the same processes that make us intelligent beings - superior to computers or animals. Our biases are useful, but also dangerous. We need to be consciously aware of our biases.

6. Reinterpreting Evidence - When people are presented with ambiguous information, they often interpret it to support their established beliefs. When people are presented with unambiguous information that contradicts their beliefs, they tend to pay close attention to it, scrutinize it, and either invent a way of discounting it as unreliable, or redefine it to be less damaging than it really is.

For instance, gamblers tend to remember their losses very well - remember them better than their winnings - but they remember their losses as "near" wins that provide clues about how to win next time. But gamblers aren't the only ones who do this. We all do this from time to time in our own way.

7. Remembering Selective Evidence - Charles Darwin once said that he ". . . followed a golden rule, namely that whenever a new observation or thought came across me, which was opposed to my general results, to make a memorandum of it without fail and at once; for I had found by experience that such facts and thoughts were far more apt to escape from the memory than favorable ones."

Darwin's golden rule is not a normal tendency among people. People do not necessarily only remember evidence that supports their beliefs. Rather, they tend to remember events that cause them pain or difficulty, events that they predicted would happen, or events that otherwise drew their attention. They tend to forget events that follow the normal course of things.

For example, some people think that they always end up needing things that they threw away. But this is only because they remember all the things that they threw away, but later needed; while they forget about the many more times when they threw something away and never needed it again.

Another example is how people often say they wake up and their digital clock reads something like 1:23 or 12:12. This seems to be more than a coincidence - how come they wake up at these special times? However, they are simply forgetting the many more times when they woke up and the clock read 3:54 or 10:17. Certain types of events stick in our memory. We need to be careful that our selective memories do not bias our thinking.
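As a back-of-the-envelope check on the clock example (my own sketch, and my own choice of what counts as a "special" reading - palindromes, repeated digits, doubled patterns like 12:12, and ascending runs like 1:23): the snippet below counts how many of the 720 possible 12-hour readings qualify. The point is only that such readings make up a noticeable slice of the clock face, so waking to one now and then is unremarkable.

    # Count "special" readings on a 12-hour clock; the definition of
    # special is my own, not the reviewer's or the book's.
    def special(hour, minute):
        d = f"{hour}{minute:02d}"                            # 1:23 -> "123"
        palindrome = d == d[::-1]                            # 1:01, 12:21
        repeated = len(set(d)) == 1                          # 1:11, 11:11
        doubled = len(d) % 2 == 0 and d[:len(d) // 2] == d[len(d) // 2:]  # 12:12
        ascending = all(int(b) == int(a) + 1 for a, b in zip(d, d[1:]))   # 1:23
        return palindrome or repeated or doubled or ascending

    readings = [(h, m) for h in range(1, 13) for m in range(60)]
    hits = sum(special(h, m) for h, m in readings)
    print(f"{hits} of {len(readings)} clock readings qualify")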

8. The Wish to Believe and the Lake Wobegon Effect - The vast majority of people think of themselves as above average in qualities that they think are important. This is called the "Lake Wobegon Effect" after the fictitious community where "all the women are strong, the men are good-looking, and all the children are above average."

For instance, a survey of high school seniors found that 70% of them thought they were above average in leadership ability, and 60% placed themselves in the top 10% in ability to get along with others. Similarly, 94% of college professors thought they were better than their colleagues.

One way that people try to confirm their beliefs is to search for evidence until they find something that supports them. They may do a very detailed, in-depth study of something, but they do not stop and evaluate what they have when they uncover evidence against their beliefs. Instead, they continue on and stop only when they've found enough evidence to support their side to relieve their conscience.

Often when we look at evidence that supports what we believe, we ask only that it leave the door open for our beliefs. But when we find evidence that contradicts what we believe, we hold it to a higher standard. We ask that it prove its findings beyond a reasonable doubt. We hold others to a higher standard than we hold ourselves. This may be the most important point in this book.

For example, people who believe in a particularly stringent health diet may look around for evidence that their diet is working, while people who eat more permissively find solace in studies that say it doesn't matter what we eat. Conservatives tend to read conservative periodicals and not liberal ones, and therefore they are only exposed to evidence that encourages their convictions. Liberals do the same. What we need here is to search in an even-handed way for supporting evidence and contradicting evidence, and weigh each side objectively.

9. Telling Stories - Much of what we know about our world we heard from others. But second-hand information is often simplified and "cleaned up" as it is told. As we relate stories, we often exaggerate them, or make them happen to a friend instead of to an unknown person, or try to make the story more understandable. We do this subconsciously because we want our audience to be entertained or impressed.

Instead, we need to temper what we hear by: (1) considering the source of the message, (2) putting more credence in actual statements of fact than in predictions, (3) scaling estimates down by accepting the less drastic of two numbers offered to us, and (4) not allowing our personal feelings toward someone to deceive us into thinking that they are an example of a widespread phenomenon.

10. Correction from Others - Our friends and acquaintances can bring an objective perspective to our habits and beliefs. For instance, young children are good at correcting silly behaviors in each other, such as a funny way of walking, or eating with your mouth open. But, as we get older, we tend to associate with people who agree with us or share our habits, and therefore we have fewer opportunities to meet corrections. If we have adopted a defective belief, then we may never encounter the correction we need.

11. Strategies - If we all have innate tendencies to reason wrongly, what can we do to combat this? We can train our minds to compensate for our shortcomings: (1) We should be aware of how our minds try to see order even when there is no order. (2) We should be aware of how our minds remember things in a very biased way. (3) We should actively search for data that we may have missed, and especially search for data that contradicts our theories or beliefs. (4) We should ask ourselves how someone who disagrees with us would look at the data. (5) We should remember that stories we hear may come from an unreliable source, or may be exaggerated by the storyteller to make a point.

The more we understand and compensate for these errors, the more confidence we can place in the beliefs we have carefully validated.

Conclusion

I believe these observations apply to the conservative Christian community as much as the rest of the world. Christians have a duty to look at their own beliefs with the same critical eye that they turn on the "liberal media." I wish I could find books like this one by Mr. Gilovich written in the Christian community. We need Christian leaders who will take a stand for self-criticism.

Let's not use bad reasoning or bad science to promote good ideas. An example would be if creationists like me were more open about the evidence that seems to contradict creationism. We like to think that all evidence is in our favor, but I believe that if we were more public about the problems with creationist theories, more people would be impressed with our objectivity and reliability.

The challenge I have for myself is to become more aware of how I am reasoning, and be honest enough to acknowledge the errors I may discover there.
20 comments | 275 people found this helpful.
on June 16, 2002
A well-written book that focuses on the common errors human beings make when trying to comprehend the world around them, and form opinions. Central points: that we try to make order out of chaos, even when there is no order; that we filter what we hear according to our own biases; and that wishful thinking can distort reality. He sets up the cases in a very readable way, and then gives examples of a few erroneous beliefs and their consequences. This is where you may find some readers disagreeing with him. The case studies include ESP and Homeopathy. If you subscribe to those fallacies, you will probably be challenged during that section of the book. Since there are NO reputable studies that support them, that is to the good. Finally, he gives us a clue into how we can better evaluate what information we are presented with. While not a scholarly work on "Critical Thinking" (such as "Asking the Right Questions: A Guide to Critical Thinking") it would be a wonderful companion book to "Why People Believe Weird Things: Pseudoscience, Superstition, and Other Confusions of Our Time" by Shermer and Gould. You owe it to yourself to read this and consider it fairly.
7 comments | 141 people found this helpful.
on April 19, 2000
The Sports Illustrated curse is NOT real. Our gut feelings about winning streaks and losing streaks are way off. And there's sort of an illusion that makes punishment look more effective than it probably is and reward look less effective than it probably is (reward has a tougher row to hoe, in fact). These are among Gilovich's more memorable points. Each is backed up with plain reasoning AND hard data.
It's just the kind of book that'll make you THINK about what you're thinking. An excellent start down that path, one we all need to take. I enjoyed it and got a lot out of it. I have re-read parts of it a few times in the years since I first bought it.
Written by a social psychologist for a lay audience. It's well organized and easily digestible as long as you are willing to stop and think every so often as you read.
I'd like to see this book handed out to every new college student, or, maybe better, required reading for every high school student.
1 comment | 147 people found this helpful.
on August 23, 2006
This book provides a well-organized survey of issues that limit our reasoning abilities:

- Our misperception of random events, as in the "clustering illusions" that lead us to believe in the hot hand, for example.

- Our misunderstanding of statistical regression, which, for instance, affects our perception of the roles of reward and punishment in education (a small simulation sketch follows this list).

- Our tendency to seek confirmatory information, as in the justification of our choices.

- Our inability to see what could have happened under different circumstances, as in self-fulfilling prophecies (e.g. a negative first impression or the presumed insolvency of a financial institution).

- Our own biases that make us expose inconsistent information to more critical scrutiny than consistent information.

- Asymmetries that distort what we recall and, thus, what we take into account to evaluate the validity of beliefs (as in multiple endpoints situations or one-sided events).

- Our tendency to believe what we want to believe (specially about ourselves), as if beliefs were possessions.

- The distortions present in secondhand information (a.k.a. sharpening and leveling).

- The influence of what we think others believe (and also of the inadequate feedback we often receive about that).
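The statistical-regression point is easy to see in a toy simulation (my own construction, not the book's): observed performance below is just a fixed ability plus random noise, praise follows an unusually good outing and criticism an unusually bad one, and neither has any effect on the next outing. Regression toward the mean alone makes criticism look helpful and praise look harmful.

    import random

    # Toy model: observed performance = stable ability + day-to-day noise.
    # Feedback is given after extreme outings but changes nothing.
    random.seed(0)
    ABILITY = 0.0

    def performance():
        return ABILITY + random.gauss(0, 1.0)

    after_praise, after_criticism = [], []
    prev = performance()
    for _ in range(100_000):
        curr = performance()
        if prev > 1.0:            # unusually good outing -> praise given
            after_praise.append(curr - prev)
        elif prev < -1.0:         # unusually bad outing -> criticism given
            after_criticism.append(curr - prev)
        prev = curr

    # Performance regresses toward the mean either way: the average change
    # after criticism is positive, the average change after praise negative.
    print("mean change after praise:   ", sum(after_praise) / len(after_praise))
    print("mean change after criticism:", sum(after_criticism) / len(after_criticism))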

These limitations make us draw incorrect conclusions and bolster erroneous beliefs. Being aware of them helps us in distinguishing what we know well from what we only think is true. Just this is of utmost importance for thinking clearly. Could there be a better reason for reading this book?
0 comments | 39 people found this helpful.
on June 1, 2002
I don't remember how I came across this book. I think it may have had something to do with a search I was conducting for books that explain why there's so much misinformation out there about the stock market, and how I can better avoid being [pulled] in by it. I'm not sure that I'd recommend reading it for that purpose, but that's not a criticism. In fact, it's a much more generally applicable book. There were many, many times while reading when I thought to myself, "Yeah, I make that mistake." For instance, I'm great at picking apart evidence in support of ideas and arguments that I don't agree with, yet I frequently accept evidence for ideas that I do support with little questioning. But since reading chapter 3 of this book, I have at least become conscious of when I am doing this. So this book wasn't only informative, but it actually has had an impact on the way that I think. And there are plenty of fun statistics in here too. Guess what the chances are of two people in a room of 23 randomly selected people sharing a birthday. Would you believe 50%? I'm still trying to come to terms with that one. The only caveat I would offer on this book is that occasionally, it is a bit technical. I found that I did need to be in a quiet place where I could focus when I was reading it. But you don't need to be a scientist or have more than a basic understanding of statistics to be able to follow along. I'm very glad I read this. It was both fun and useful.
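For anyone else trying to come to terms with that birthday figure, here is a quick way to check it (the 50-person line is my own addition for contrast):

    # Probability that at least two of n people share a birthday, assuming
    # 365 equally likely birthdays and ignoring leap years.
    def shared_birthday_prob(n):
        p_all_distinct = 1.0
        for i in range(n):
            p_all_distinct *= (365 - i) / 365
        return 1 - p_all_distinct

    print(f"23 people: {shared_birthday_prob(23):.1%}")   # a shade over 50%
    print(f"50 people: {shared_birthday_prob(50):.1%}")   # about 97%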
0 comments | 25 people found this helpful.
on May 5, 2000
This book teaches one of the most important lessons anyone can learn: that we all make mistakes.
Most of us overestimate the frequency of events which receive wide media coverage, like plane crashes. We strain to find significance in random data, and believe things we want to believe even if there's no evidence to support them. "How We Know What Isn't So" explains why, and shows us how to overcome the factors which produce such systematic error.
Until very recently, many of the most egregiously false claims never reached a broad audience. Now that the Web allows anyone with rudimentary skills to create an impressive-looking, authoritative-sounding Web site, the lessons of this book are more important than ever.
0 comments | 25 people found this helpful.
on July 28, 2006
Gilovich starts this book off with a real bang! After reading the Introduction, I thought this book would be really interesting and change the way I look at the world. Unfortunately, he writes in a college-teacher tone that will not be approachable to those he most needs to reach. In other words, I really don't need a book to tell me to avoid holistic cretins, ESP, psychics, mysticism, and all that other garbage.

The first half of his book is concerned with a review of the psychology literature from the '70s and '80s on how people arrive at their beliefs. This is boring and somewhat like a set of lecture notes. The information in it is useful, but it could be summarized and edited. The second half of the book is devoted to why people shouldn't believe in the topics I mentioned above. Most people who read his book wouldn't even dream of subscribing to these beliefs, so the whole second half is pretty boring.

Gilovich finishes the book by telling us that psychologists are best at understanding the world and are the most perceptive professionals out there, bar none. While admitting that the core thought processes that lead to logical decision making come from hard science, Gilovich wants us to believe that the softies have perfected clear thinking about the world. He should have reread that portion of the book when he wasn't overheated and realized how silly it would sound to attorneys, physicians, scientists, and other thoughtful people.
9 comments | 90 people found this helpful.
on June 18, 1999
This book's strength lies in Gilovich's ability to make science, statistics, and the tools of critical thinking accessible to anyone. Armed with the examples and reasoning of this excellent work, anyone would be able to stand up to the most fervent proponents of bogus "phenomena" like certain alternative therapies and the easy lure of seeing "extra-sensory" connections where none exist. Most importantly, Gilovich is able to explain in simple language why the average reader should be wary of anecdotal evidence, and should not fail to look at such "evidence" in its overall context. In other words, the book brings home the importance of the scientific method and tenaciously holds to that standard. Interestingly, in the case of the smallest bit of empirical evidence for ESP ("Ganzfeld" experiments), the book recites the data without bias against such phenomena. Instead, as is his way, Gilovich simply follows the data where it leads. The author should rank in the same league as Steven Jay Gould and Carl Sagan in terms of bringing science to the lay reader.
0 comments | 27 people found this helpful.
on October 28, 2004
In this book, How We Know What Isn't So, Thomas Gilovich takes us through the facts behind spurious reasoning, anecdotal evidence, and incomplete analyses. In the current climate of fad diets, herbal remedies, and other pseudoscientific claims, it is an important book, laying bare the faulty reasoning that can lead to errors in judgement or to falling for some con artist's story.

The primary focus of the book is an analysis of how the human mind tends to bring order from randomness. The early chapters deal with random events, and Gilovich points out how ordered random events can seem. When flipping a coin 100 times, for instance, you should expect to see 4 or 5 heads (or tails) in a row at some point. This seems obvious, but he then moves to "real life" examples to show how such randomness can lead to a belief in hot streaks when gambling, for example.
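A small sketch of that coin-flip point (the fair coin, the 100-flip sequences, and the 10,000 trials are my own parameters, not the book's): it estimates how often a sequence of 100 flips contains a run of at least five identical outcomes. The printed fraction makes the reviewer's point that such runs are the rule rather than the exception.

    import random

    # Longest run of identical outcomes in a sequence of coin flips.
    def longest_run(flips):
        best = run = 1
        for prev, curr in zip(flips, flips[1:]):
            run = run + 1 if curr == prev else 1
            best = max(best, run)
        return best

    random.seed(1)
    trials = 10_000
    hits = sum(
        longest_run([random.random() < 0.5 for _ in range(100)]) >= 5
        for _ in range(trials)
    )
    print(f"sequences of 100 flips with a run of 5+: {hits / trials:.0%}")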

Similarly, he tackles and explains studies that clearly show the pull of anecdotal evidence - we'll take information at face value if we hear it from someone we trust, even if we don't know where that person got the info. Likewise, people tend to more closely scrutinise evidence that contradicts their beliefs in an attempt to find a fault with the evidence. However, they are also likely to "mindlessly" accept anything that seems to support a belief. This is human nature, of course, but it means that it's human nature to not evaluate evidence objectively, and that is what leads to spurious reasoning and belief in concepts that have no scientific evidence (the efficacy of many herbal medicines, belief in ESP, etc.).

In a year with a particularly acrimonious election, this book seems doubly important. With the mudslinging and misrepresentations taking place in both the Kerry and the Bush camps, it's tough for the average person to sort it all out. Gilovich's book points the way to how to ask the right questions on the way to finding the truth, which is something everyone needs to be able to do in a democracy.
0 comments | 13 people found this helpful.
on December 1, 1999
More babies are not born during a full moon. After a job interview, the wrong person is likely to be hired. Being pushy does not help you get along in life. Infertile couples who adopt are not more likely to conceive. So why do so many people believe it is so? An excellent book full of the answers that, alas, will not convince the people who need to hear it most.
0 comments | 20 people found this helpful.