
Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts Paperback – March, 2008

4.5 out of 5 stars 346 customer reviews


Editorial Reviews

Excerpt. © Reprinted by permission. All rights reserved.

CHAPTER 1
 
Cognitive Dissonance:
The Engine of Self-justification

 
Press release date: November 1, 1993
 
           we didn’t make a mistake when we wrote in our previous releases that New York would be destroyed on September 4 and October 14, 1993. We didn’t make a mistake, not even a teeny eeny one!
 
Press release date: April 4, 1994
 
           All the dates we have given in our past releases are correct dates given by God as contained in Holy Scriptures. Not one of these dates was wrong . . . Ezekiel gives a total of 430 days for the siege of the city . . . [which] brings us exactly to May 2, 1994. By now, all the people have been forewarned. We have done our job. . . .
 
           We are the only ones in the entire world guiding the people to their safety, security, and salvation!
 
           We have a 100 percent track record!1
 
 It’s fascinating, and sometimes funny, to read doomsday predictions, but it’s even more fascinating to watch what happens to the reasoning of true believers when the prediction flops and the world keeps muddling along. Notice that hardly anyone ever says, “I blew it! I can’t believe how stupid I was to believe that nonsense”? On the contrary, most of the time they become even more deeply convinced of their powers of prediction. The people who believe that the Bible’s book of Revelation or the writings of the sixteenth-century self-proclaimed prophet Nostradamus have predicted every disaster from the bubonic plague to 9/11 cling to their convictions, unfazed by the small problem that their vague and murky predictions were intelligible only after the event occurred.
 
           Half a century ago, a young social psychologist named Leon Festinger and two associates infiltrated a group of people who believed the world would end on December 21.2 They wanted to know what would happen to the group when (they hoped!) the prophecy failed. The group’s leader, whom the researchers called Marian Keech, promised that the faithful would be picked up by a flying saucer and elevated to safety at midnight on December 20. Many of her followers quit their jobs, gave away their homes, and dispersed their savings, waiting for the end. Who needs money in outer space? Others waited in fear or resignation in their homes. (Mrs. Keech’s own husband, a nonbeliever, went to bed early and slept soundly through the night as his wife and her followers prayed in the living room.) Festinger made his own prediction: The believers who had not made a strong commitment to the prophecy—who awaited the end of the world by themselves at home, hoping they weren’t going to die at midnight—would quietly lose their faith in Mrs. Keech. But those who had given away their possessions and were waiting with the others for the spaceship would increase their belief in her mystical abilities. In fact, they would now do everything they could to get others to join them.
 
           At midnight, with no sign of a spaceship in the yard, the group felt a little nervous. By 2 a.m., they were getting seriously worried. At 4:45 a.m., Mrs. Keech had a new vision: The world had been spared, she said, because of the impressive faith of her little band. “And mighty is the word of God,” she told her followers, “and by his word have ye been saved—for from the mouth of death have ye been delivered and at no time has there been such a force loosed upon the Earth. Not since the beginning of time upon this Earth has there been such a force of Good and light as now floods this room.”
 
           The group’s mood shifted from despair to exhilaration. Many of the group’s members, who had not felt the need to proselytize before December 21, began calling the press to report the miracle, and soon they were out on the streets, buttonholing passersby, trying to convert them. Mrs. Keech’s prediction had failed, but not Leon Festinger’s.
 
* * *
 
The engine that drives self-justification, the energy that produces the need to justify our actions and decisions—especially the wrong ones—is an unpleasant feeling that Festinger called “cognitive dissonance.” Cognitive dissonance is a state of tension that occurs whenever a person holds two cognitions (ideas, attitudes, beliefs, opinions) that are psychologically inconsistent, such as “Smoking is a dumb thing to do because it could kill me” and “I smoke two packs a day.” Dissonance produces mental discomfort, ranging from minor pangs to deep anguish; people don’t rest easy until they find a way to reduce it. In this example, the most direct way for a smoker to reduce dissonance is by quitting. But if she has tried to quit and failed, now she must reduce dissonance by convincing herself that smoking isn’t really so harmful, or that smoking is worth the risk because it helps her relax or prevents her from gaining weight (and after all, obesity is a health risk, too), and so on. Most smokers manage to reduce dissonance in many such ingenious, if self-deluding, ways.
 
           Dissonance is disquieting because to hold two ideas that contradict each other is to flirt with absurdity and, as Albert Camus observed, we humans are creatures who spend our lives trying to convince ourselves that our existence is not absurd. At the heart of it, Festinger’s theory is about how people strive to make sense out of contradictory ideas and lead lives that are, at least in their own minds, consistent and meaningful. The theory inspired more than 3,000 experiments that, taken together, have transformed psychologists’ understanding of how the human mind works. Cognitive dissonance has even escaped academia and entered popular culture. The term is everywhere. The two of us have heard it in TV newscasts, political columns, magazine articles, bumper stickers, even on a soap opera. Alex Trebek used it on Jeopardy, Jon Stewart on The Daily Show, and President Bartlet on The West Wing. Although the expression has been thrown around a lot, few people fully understand its meaning or appreciate its enormous motivational power.
 
           In 1956, one of us (Elliot) arrived at Stanford University as a graduate student in psychology. Festinger had arrived that same year as a young professor, and they immediately began working together, designing experiments to test and expand dissonance theory.3 Their thinking challenged many notions that were gospel in psychology and among the general public, such as the behaviorist’s view that people do things primarily for the rewards they bring, the economist’s view that human beings generally make rational decisions, and the psychoanalyst’s view that acting aggressively gets rid of aggressive impulses.
 
           Consider how dissonance theory challenged behaviorism. At the time, most scientific psychologists were convinced that people’s actions are governed by reward and punishment. It is certainly true that if you feed a rat at the end of a maze, he will learn the maze faster than if you don’t feed him; if you give your dog a biscuit when she gives you her paw, she will learn that trick faster than if you sit around hoping she will do it on her own. Conversely, if you punish your pup when you catch her peeing on the carpet, she will soon stop doing it. Behaviorists further argued that anything that was merely associated with reward would become more attractive—your puppy will like you because you give her biscuits—and anything associated with pain would become noxious and undesirable.
 
           Behavioral laws do apply to human beings, too, of course; no one would stay in a boring job without pay, and if you give your toddler a cookie to stop him from having a tantrum, you have taught him to have another tantrum when he wants a cookie. But, for better or worse, the human mind is more complex than the brain of a rat or a puppy. A dog may appear contrite for having been caught peeing on the carpet, but she will not try to think up justifications for her misbehavior. Humans think; and because we think, dissonance theory demonstrated that our behavior transcends the effects of rewards and punishments and often contradicts them.
 
           For example, Elliot predicted that if people go through a great deal of pain, discomfort, effort, or embarrassment to get something, they will be happier with that “something” than if it came to them easily. For behaviorists, this was a preposterous prediction. Why would people like anything associated with pain? But for Elliot, the answer was obvious: self-justification. The cognition that I am a sensible, competent person is dissonant with the cognition that I went through a painful procedure to achieve something—say, joining a group that turned out to be boring and worthless. Therefore, I would distort my perceptions of the group in a positive direction, trying to find good things about them and ignoring the downside.
Copyright © 2007 by Carol Tavris and Elliot Aronson
 
All rights reserved. No part of this publication may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopy, recording, or ...

About the Author

CAROL TAVRIS is a social psychologist and author of Anger and The Mismeasure of Woman. She has written for the Los Angeles Times, the New York Times, Scientific American, and many other publications. She lives in Los Angeles.

ELLIOT ARONSON is a social psychologist and author of The Social Animal. The recipient of many awards for teaching, scientific research, writing, and contributions to society, he is a professor emeritus at the University of California, Santa Cruz.


Product Details

  • Paperback: 304 pages
  • Publisher: Mariner Books; Reprint edition (March 2008)
  • Language: English
  • ISBN-10: 0156033909
  • ISBN-13: 978-0156033909
  • Product Dimensions: 5.3 x 0.9 x 8 inches
  • Shipping Weight: 10.4 ounces
  • Average Customer Review: 4.5 out of 5 stars (346 customer reviews)
  • Amazon Best Sellers Rank: #92,567 in Books

Customer Reviews

Top Customer Reviews

By Dr. Cathy Goodwin on June 13, 2007
Format: Hardcover
Why do people refuse to admit mistakes - so deeply that they transform their own brains? They're not kidding themselves: they really believe what they have to believe to justify their original thought.

There are some pretty scary examples in this book. Psychologists who refuse to admit they'd bought into the false memory theories, causing enormous pain. Politicians. Authors. Doctors. Therapists. Alien abduction victims.

Most terrifying: The justice system operates this way. Once someone is accused of a crime - even under the most bizarre circumstances - the police believe he's guilty of something. Even when the DNA shows someone is innocent, or new evidence reveals the true perpetrator, they hesitate to let the accused person go free.

This book provides an enjoyable, accurate guide through contemporary social psychology. So many "obvious" myths are debunked as we learn the way memory really works and why revenge doesn't end long-term conflict.

Readers should pay special attention to the authors' discussion of the role of science in psychology, as compared to psychiatry, which is a branch of medicine. I must admit I was shocked to realize how few psychiatrists understand the concept of control groups and disconfirmation. Psychoanalysis in particular is not scientific. The authors stop short of comparing it to astrology or new age.

This book should be required reading for everyone, especially anyone who's in a position to make policy or influence the lives of others. But after reading Mistakes Were Made, I suspect it won't do any good. Once we hold a position, say the authors, it's almost impossible to make a change.
305 people found this helpful.
Format: Hardcover Verified Purchase
Or so say Tavris and Aronson on how we lose our ethical grip: we make a small slip, tell ourselves it is not that bad, and our minds rationalize the next slip. From lunch with a lobbyist to a golf outing in Europe is not - when the mind puts its mind to it - that big a leap. Their discussion of confirmation bias, one of the worst breeders of bad decisions, is outstanding and understandable. And the chapter on how the police get the innocent to confess is chilling. There are all sorts of useful tips. Want to co-opt an enemy? Get her to do a favor for you; her mind will say, "I do not do favors for jerks, and since I did one for him, he must not be that big a jerk." The mind cannot hold two thoughts at once, so it bridges the dissonance. At 236 pages, the book is long enough to be worthwhile, but short enough to read on a vacation. Anyone interested in persuasion and how our minds work will find the read a useful one.
242 people found this helpful.
Format: Hardcover Verified Purchase
Ready for a whirlwind tour through time and space, from the Crusades and the Holocaust to the war in Iraq, from recovered memories and the fallacies of clinical judgment to false confessions, wrongful convictions, and failed marriages? Then this is the book for you.

What ties these disparate topics together, according to tour guides Carol Tavris and Elliot Aronson, is the notion of "cognitive dissonance," which has been creeping into popular awareness in recent years. Cognitive dissonance is the uncomfortable feeling created when you experience a conflict between your behavior and your beliefs, most specifically about who you are as a person. ("I'm a good person, I couldn't do this bad thing.") To reduce dissonance, people engage in a variety of cognitive maneuvers, including self-serving justifications and confirmation bias (paying attention to information that confirms our beliefs while discounting contrary data).

Tavris and Aronson, both top social psychologists and excellent writers to boot, make their point through the repeated use of a pyramid image. Two people can be standing at the top of an imaginary pyramid and can undergo the same dissonance-inducing experience. Person A processes the experience accurately, which leads him down one side of the pyramid. Person B engages in a series of defensive maneuvers to reduce cognitive dissonance that eventually lands him at the opposite side of the pyramid. Once at these opposite poles, the two can no longer recognize their initial similarities, and see each other as unfathomable and even dangerous. A particularly compelling, real-life example is two men who experienced a terrifying episode of sleep paralysis in which they saw demons attacking them.
60 people found this helpful.
Format: Hardcover
This page-turning read takes you through the myriad ways in which a human urge toward self-justification warps personal lives and contaminates public discourse. The authors ask: "Why do people dodge responsibility when things fall apart?" They explain, with abundant examples. Even more important, they draw readers painlessly through the evidence about self-justification, much of it based on research into the contours of memory distortion.

No one escapes the authors' withering gaze: political leaders who lie to cover up, bosses who kick downward and kiss upward, marriage partners who whine.

A book about the defenses that people erect for bad decisions and hurtful acts might easily turn into an exercise in "bubba psychology", or giving folk wisdom the patina of scholarship. But Tavris and Aronson are much better than that. They are serious, renowned psychologists with a knack for telling arresting stories. They have an eye for counter-intuitive and revealing details. Each chapter tells you things you didn't know, or illuminates experiences you thought you understood, but come to see in a fresh light.

In short, you'll see a bit of yourself as well as others in Mistakes Were Made. You'll be thankful for its insights.
57 people found this helpful.
