Beliefs and Bias
on March 23, 2005
Mr. Gilovich says ". . . there are inherent biases in the data upon which we base our beliefs, biases that must be recognized and overcome if we are to arrive at sound judgments and valid beliefs." The cost of these biases is real and severe. This book explains why people are prone to wrong thinking, and how they can counteract it.
Here are points that Mr. Gilovich made:
1. Seeing Order in Randomness - We all have a natural tendency to see order in data, even when the data is totally random and irregular. We do this even when we have no personal reason to see order. This happens especially when we remember facts from the past. Our memory plays tricks on us by emphasizing any possible patterns, and forgetting irregularities that might refute the patterns. For instance, basketball players often think that if they make one successful basket, then they are more likely to make the next basket - because they remember times when this has happened to them. "When you're hot, you're hot." However, objective statistical studies done on when successful baskets are made show that, if anything, the opposite is true.
This natural tendency to misconstrue random events is called the "clustering illusion." Chance events often seem to us to have some order to them, but when the law of averages is applied objectively, this order disappears. This error is compounded when our active imagination tries to create theories for why there should be order. Because of this, we need to be careful when we draw conclusions based on a sequence we think we see in some data.
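The hot-hand claim is easy to check for oneself. Here is a short simulation (my own sketch, not from the book) of a purely random shooter who makes each shot independently with 50% probability; even with no "hot hand" at all, conspicuous streaks appear:

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

# Simulate a shooter whose shots are fully independent, 50% each.
shots = [random.random() < 0.5 for _ in range(100)]

# Find the longest run of consecutive makes.
longest = streak = 0
for made in shots:
    streak = streak + 1 if made else 0
    longest = max(longest, streak)

print(f"Made {sum(shots)} of 100 independent shots; "
      f"longest 'hot' streak: {longest}")
```

Streaks of four or five makes in a row show up regularly in data like this, and those runs are exactly the kind of pattern our memory latches onto.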
2. Looking for Confirmation - We all have a natural tendency to look for "yes" instead of "no." If we have an idea, we tend to look for evidence that will confirm our idea, not evidence that will disprove it. This is true even when we have no personal attachment to the idea.
Some researchers believe this tendency results from our need to take an extra neurological step when we try to understand negative or disconfirming evidence, in contrast to positive or confirming evidence. To understand a negative proposition, we may need to translate it into a positive one. Therefore, we subconsciously look for easy positives instead of more difficult negatives. This does not promote objectivity and good science. If we want to do good science, then we need to force ourselves to look for negative evidence that contradicts our ideas.
3. Hidden Data - When we search for evidence, often there is data that we unintentionally overlook. For instance, if we receive a bad impression about a person from the beginning, we may avoid them, and by avoiding them, they may never have a chance to show us the better side of their personality. But if we receive a good impression, we may get to know that person better, and thereby gather more positive data, and falsely confirm in our mind that first impressions are reliable. The way we collect data may filter out important categories of data, and this may cause us to confirm our wrong ideas. We need to avoid search strategies that show us only a distorted side of an issue.
4. Mental Corner-Cutting - We all cut corners with our mind. We often use mental strategies - inductive generalizations, etc. - to understand the world around us more quickly and easily. These strategies are very useful. But they come at a cost. These corner-cutting strategies can cause systematic errors or blind spots in our thinking. We need to be aware when we have not been thorough; therefore, we need to look out for signals that we are drawing a wrong conclusion.
5. Objectivity is Not Always Useful - We shouldn't expect everyone to reevaluate their beliefs every time a new piece of evidence comes along. "Well-supported beliefs and theories have earned a bit of inertia. . ." However, we should draw a distinction between a belief that is well supported by evidence over time, and a belief that only has traditional or popular support. Some scientists believe the complex mental processes that give us biases and preconceived notions are some of the same processes that make us intelligent beings - superior to computers or animals. Our biases are useful, but also dangerous. We need to be consciously aware of our biases.
6. Reinterpreting Evidence - When people are presented with ambiguous information, they often interpret it to support their established beliefs. When people are presented with unambiguous information that contradicts their beliefs, they tend to pay close attention to it, scrutinize it, and either invent a way of discounting it as unreliable, or redefine it to be less damaging than it really is.
For instance, gamblers tend to remember their losses very well - remember them better than their winnings - but they remember their losses as "near" wins that provide clues about how to win next time. But gamblers aren't the only ones who do this. We all do this from time to time in our own way.
7. Remembering Selective Evidence - Charles Darwin once said that he ". . . followed a golden rule, namely that whenever a new observation or thought came across me, which was opposed to my general results, to make a memorandum of it without fail and at once; for I had found by experience that such facts and thoughts were far more apt to escape from the memory than favorable ones."
Darwin's golden rule does not come naturally to most people. People do not necessarily only remember evidence that supports their beliefs. Rather, they tend to remember events that cause them pain or difficulty, events that they predicted would happen, or events that otherwise drew their attention. They tend to forget events that follow the normal course of things.
For example, some people think that they always end up needing things that they threw away. But this is only because they remember all the things that they threw away, but later needed; while they forget about the many more times when they threw something away and never needed it again.
Another example is how people often say they wake up and their digital clock reads something like 1:23 or 12:12. This seems to be more than a coincidence - how come they wake up at these special times? However, they are simply forgetting the many more times when they woke up and the clock read 3:54 or 10:17. Certain types of events stick in our memory. We need to be careful that our selective memories do not bias our thinking.
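A quick count (my own, not from the book) makes the base rate concrete. Treating a clock reading as "special" when its digits are all identical (11:11) or form an ascending run (1:23, 12:34), only a small fraction of the 720 distinct 12-hour readings qualify:

```python
def is_special(hour: int, minute: int) -> bool:
    """'Special' means all digits identical or an ascending digit run."""
    s = f"{hour}{minute:02d}"
    return len(set(s)) == 1 or s in "123456789"

# Check every distinct reading on a 12-hour clock: 12 hours x 60 minutes.
special = [(h, m) for h in range(1, 13) for m in range(60)
           if is_special(h, m)]

print(f"{len(special)} of 720 readings are 'special' "
      f"({len(special) / 720:.1%})")
# → 11 of 720 readings are 'special' (1.5%)
```

Waking at one of these times every few weeks is roughly what chance predicts; the other 98.5% of wake-ups simply go unremembered.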
8. The Wish to Believe and the Lake Wobegon Effect - The vast majority of people think of themselves as above average in qualities that they think are important. This is called the "Lake Wobegon Effect" after the fictitious community where "all the women are strong, the men are good-looking, and all the children are above average."
For instance, a survey of high-school seniors found that 70% of them thought they were above average in leadership ability, and 60% thought they were in the top 10% in amiability. 94% of college professors thought they were better at their jobs than their colleagues.
One way that people try to confirm their beliefs is to search for evidence until they find something that supports them. They may do a very detailed, in-depth study of something, but they do not stop and evaluate what they have when they uncover evidence against their beliefs. Instead, they continue on and stop only when they've found enough evidence to support their side to relieve their conscience.
Often when we look at evidence that supports what we believe, we only ask that it leave the door open for our beliefs. But when we find evidence that contradicts what we believe, we hold it to a higher standard. We ask that it prove its findings beyond a reasonable doubt. We hold others to a higher standard than we hold ourselves. This may be the most important point in this book.
For example, people who believe in a particular stringent health diet may look around for evidence that their diet is working, while people who eat more permissively find solace in studies that say that it doesn't matter what we eat. Conservatives tend to read conservative periodicals and not liberal ones, and therefore they are only exposed to evidence that encourages their convictions. Liberals do the same. What we need here is to search in an even-handed way for supporting evidence and contradicting evidence, and weigh each side objectively.
9. Telling Stories - Much of what we know about our world we heard from others. But second-hand information is often simplified and "cleaned up" as it is told. As we relate stories, we often exaggerate them, or make them happen to a friend instead of to an unknown person, or try to make the story more understandable. We do this subconsciously because we want our audience to be entertained or impressed.
Instead, we need to temper what we hear by: (1) considering the source of the message, (2) putting more credence in actual statements of fact than in predictions, (3) scaling estimates down by accepting the less drastic of two numbers offered to us, and (4) not allowing our personal feelings toward someone to deceive us into thinking that they are an example of a widespread phenomenon.
10. Correction from Others - Our friends and acquaintances can bring an objective perspective to our habits and beliefs. For instance, young children are good at correcting silly behaviors in each other, such as a funny way of walking, or eating with your mouth open. But, as we get older, we tend to associate with people who agree with us or share our habits, and therefore we have fewer opportunities to meet corrections. If we have adopted a defective belief, then we may never encounter the correction we need.
11. Strategies - If we all have innate tendencies to reason wrongly, what can we do to combat this? We can train our minds to compensate for our shortcomings: (1) We should be aware of how our minds try to see order even when there is no order. (2) We should be aware of how our minds remember things in a very biased way. (3) We should actively search for data that we may have missed, and especially search for data that contradicts our theories or beliefs. (4) We should ask ourselves how someone who disagrees with us would look at the same data. (5) We should remember that stories we hear may come from an unreliable source, or may be exaggerated by the storyteller to make a point.
The more we understand and compensate for these errors, the more confidence we can place in beliefs that we have carefully validated.
I believe these observations apply to the conservative Christian community as much as the rest of the world. Christians have a duty to look at their own beliefs with the same critical eye that they turn on the "liberal media." I wish I could find books like this one by Mr. Gilovich written in the Christian community. We need Christian leaders who will take a stand for self-criticism.
Let's not use bad reasoning or bad science to promote good ideas. An example would be if creationists like me were more open about the evidence that seems to contradict creationism. We like to think that all evidence is in our favor, but I believe that if we were more public about the problems with creationist theories, more people would be impressed with our objectivity and reliability.
The challenge I have for myself is to become more aware of how I am reasoning, and be honest enough to acknowledge the errors I may discover there.