



Why do people refuse to admit mistakes - so deeply that they transform their own brains? They're not kidding themselves: they really believe what they have to believe to justify their original thought.

There are some pretty scary examples in this book. Psychologists who refuse to admit they'd bought into the false memory theories, causing enormous pain. Politicians. Authors. Doctors. Therapists. Alien abduction victims.

Most terrifying: The justice system operates this way. Once someone is accused of a crime - even under the most bizarre circumstances - the police believe he's guilty of something. Even when the DNA shows someone is innocent, or new evidence reveals the true perpetrator, they hesitate to let the accused person go free.

This book provides an enjoyable, accurate guide through contemporary social psychology. So many "obvious" myths are debunked as we learn the way memory really works and why revenge doesn't end long-term conflict.

Readers should pay special attention to the authors' discussion of the role of science in psychology, as compared to psychiatry, which is a branch of medicine. I must admit I was shocked to realize how few psychiatrists understand the concept of control groups and disconfirmation. Psychoanalysis in particular is not scientific. The authors stop short of comparing it to astrology or new age.

This book should be required reading for everyone, especially anyone who's in a position to make policy or influence the lives of others. But after reading Mistakes Were Made, I suspect it won't do any good. Once we hold a position, say the authors, it's almost impossible to change it.
300 people found this helpful.
on June 25, 2007
Or so say Tavris and Aronson on how we lose our ethical grip---we make a small slip, say to ourselves it is not that bad, and our minds rationalize the next slip. From lunch with a lobbyist to a golf outing in Europe is not---when the mind puts its mind to it---that big a leap. Their discussion of confirmation bias, one of the worst breeders of bad decisions, is outstanding and understandable. And the chapter on how the police get the innocent to confess is chilling. There are all sorts of useful tips. Want to co-opt an enemy? Get her to do a favor for you; her mind will say, "I do not do favors for jerks, and because I did one for him, he must not be that big a jerk." The mind cannot hold two conflicting thoughts at once, so it bridges the dissonance. At 236 pages, the book is long enough to be worthwhile, but short enough to read on a vacation. Anyone interested in persuasion and how our minds work will find the read a useful one.
241 people found this helpful.
on August 14, 2007
Ready for a whirlwind tour through time and space, from the Crusades and the Holocaust to the war in Iraq, from recovered memories and the fallacies of clinical judgment to false confessions, wrongful convictions, and failed marriages? Then this is the book for you.

What ties these disparate topics together, according to tour guides Carol Tavris and Elliot Aronson, is the notion of "cognitive dissonance," which has been creeping into popular awareness in recent years. Cognitive dissonance is the uncomfortable feeling created when you experience a conflict between your behavior and your beliefs, most specifically about who you are as a person. ("I'm a good person, I couldn't do this bad thing.") To reduce dissonance, people engage in a variety of cognitive maneuvers, including self-serving justifications and confirmation bias (paying attention to information that confirms our beliefs while discounting contrary data).

Tavris and Aronson, both top social psychologists and excellent writers to boot, make their point through the repeated use of a pyramid image. Two people can be standing at the top of an imaginary pyramid and undergo the same dissonance-inducing experience. Person A processes the experience accurately, which leads him down one side of the pyramid. Person B engages in a series of defensive maneuvers to reduce cognitive dissonance that eventually land him at the opposite side of the pyramid. Once at these opposite poles, the two can no longer recognize their initial similarities, and see each other as unfathomable and even dangerous. A particularly compelling, real-life example is two men who experienced a terrifying episode of sleep paralysis in which they saw demons attacking them. One recognized it for what it was; the other became convinced that he had been abducted by aliens and had even fathered a set of twins with an alien partner.

The book could have been called, "Cognitive Dissonance: What It Is and How to Combat It," but then it wouldn't be selling like hotcakes. It provides a thorough overview of the social psychology research on this topic, much of it quite interesting and all of it engagingly presented.

The authors conclude by offering suggestions for reducing the impact of cognitive dissonance on individuals and cultures. One remedy is greater oversight, such as mandatory videotaping of all police interviews of suspects, independent commissions to investigate prosecutorial misconduct, and greater transparency in the academic review process. Another is attention to Americans' cultural fear of making mistakes. Intelligence is acquired, not innate, the authors argue, and mistakes are a necessary part of learning. I particularly enjoyed their examples of prominent individuals who forthrightly owned up to mistakes, including a therapist who had engaged in recovered memory treatment, a prosecutor who had obtained the conviction of an innocent man, and - last but not least - Oprah Winfrey.
56 people found this helpful.
on April 10, 2007
This page-turning read takes you through the myriad ways in which a human urge toward self-justification warps personal lives and contaminates public discourse. The authors ask: "Why do people dodge responsibility when things fall apart?" They explain, with abundant examples. Even more important, they draw readers painlessly through the evidence about self-justification, much of it based on research into the contours of memory distortion.

No one escapes the authors' withering gaze: political leaders who lie to cover up, bosses who kick downward and kiss upward, marriage partners who whine.

A book about the defenses that people erect for bad decisions and hurtful acts might easily turn into an exercise in "bubba psychology", or giving folk wisdom the patina of scholarship. But Tavris and Aronson are much better than that. They are serious, renowned psychologists with a knack for telling arresting stories. They have an eye for counter-intuitive and revealing details. Each chapter tells you things you didn't know, or illuminates experiences you thought you understood, but come to see in a fresh light.

In short, you'll see a bit of yourself as well as others in Mistakes Were Made. You'll be thankful for its insights.
56 people found this helpful.
on January 10, 2010
This book is amazing because as you read it you go through three distinct stages of understanding.

Stage 1 (50 pages in)
You say to yourself: "Wow, I know quite a few people who are making the mistakes described in this book."

Stage 2 (halfway through)
You say to yourself: "Wow, EVERY single person I know is making the mistakes described in this book."

Stage 3 (by the time you finish the book)
You say to yourself: "Wow, I myself have been making the mistakes described in this book, and I didn't even realize it.
31 people found this helpful.
on January 27, 2008
As a self-defense trainer, I'm puzzled by an apparent contradiction. I give a "pop quiz" at the beginning of every class I teach. The question students most consistently get right is whether women are more often assaulted by strangers or by acquaintances; most correctly answer acquaintances. Yet, when asked who they see as THEIR possible assailant, almost ALL students describe a stranger blitz attack. It was this discrepancy between what people told me they knew about assault and who they felt was likely to attack THEM that puzzled me.

Tavris and Aronson's book is all about "cognitive dissonance," a state of mental tension that arises when a person simultaneously holds two ideas, beliefs or opinions that are contradictory. Because holding two contradictory views is a mentally uncomfortable state, cognitive dissonance describes the process by which they become reconciled in the head of the beholder. Everyone over the age of 14 can recall a time when they made a decision, stubbornly stuck by it despite its obvious poor results, and only after enough time went by could acknowledge it as a mistake.

Tavris and Aronson have collected a wide range of examples. While their examples did not directly address my question, I'll infer their answer (here's the short, simplistic version). Acquaintance assault casts doubt on your ability to judge character. That is a weakness. Weakness is bad, and admitting to weakness is also bad. These are uncomfortable feelings. Therefore, even though you INTELLECTUALLY know better, you FEEL more threatened by those dark alleys you'd never walk down anyway.

While I do not consider this the entire explanation for students' contradiction, I believe it is part of a complex convergence of social and psychological factors. How does this information help my students learn better risk assessment?

One of the authors' points is that cognitive dissonance is everywhere because it is a normal activity of the human mind. However, the authors also point out that we can minimize it (and its harmful effects) with awareness and a measure of self-reflection mixed with honesty. Acknowledging mistakes is the first step in learning from them. Acknowledging your real risks is the first step in planning to reduce them.
19 people found this helpful.
on October 16, 2007
I have to admit: I read the book twice. The first time, I bogged down after every other chapter because I needed to reconcile what I was reading with what I regarded as true. Many times the book talked about me, and how I justified some aspects of my life. The book actually portrays scenarios very close to my own circumstances!

So, the first time I read it, I felt defensive, because the renowned psychologists and authors, Carol Tavris and Elliot Aronson, wrote chapters clearly explaining how I had self-justified my decisions in life, and how I made myself believe all the stories I told. I also read chapters filled with references to historical, as well as current events, supporting the authors' theories of cognitive dissonance, prejudice, and hypocrisies in our governments and societies. Every page was an eye-opener that required some serious reflection.

When I reached the end, when all the angles of self-justification and self-deception were finally exhausted, I took a long pause. Then I read the book again; this time I was much more open to a better understanding of the principles the authors shared. Only then did I appreciate the nuances of this mental phenomenon of believing only what we want to believe. "Believing is seeing."

Tavris and Aronson did a marvelous and professional job explaining the self-justifying mechanisms of memory, law enforcement, marriage, and war. How we manipulate our own memories to validate our bad decisions; how officers of the law are "testilying" to back up their preconceived notions; how husbands and wives rationalize divorce; and how heads of state convince the people, and themselves, that they never make mistakes.

What really impressed me about Mistakes Were Made (but not by me) are the countless quotes and references to the words and actions of well-known personalities, celebrities, and politicians. The allusions could be construed as bold and audacious, but they are all public knowledge--quoted from news items, scientific journals, and research papers--and serve well to prove the authors' theories.

So, if you're curious to know how crooks, criminals, and evildoers can sleep at night, and how bitter couples and warring nations can live with themselves, grab this book. And, yes, read it twice. - Ruby Bayan, OurSimpleJoys.com
16 people found this helpful.
on July 13, 2007
If stars were awarded based on the number of times one mentioned a book to friends and colleagues, "Mistakes Were Made" would rate an 11. This book, written in an accessible style, provides one of the most succinct and persuasive looks into the way human beings manage their self-image. The thread throughout the book's narrative is Cognitive Dissonance Theory, a psychological model positing that the human mind is incapable of holding two contradictory notions simultaneously. That's all the more true when one of the notions is tied to one's self-image. What do humans do when faced with the dissonance between "I am a very good person!" and "I just stabbed my co-worker in the back"? More often than not, the mind's self-justification software kicks into high gear, often to the detriment of accuracy. When abetted by "confirmation bias" -- the tendency to accept evidence that supports one's view and reject that which contradicts it -- the picture emerges of beings whose self-evident rightness is hard to dislodge.

Authors Tavris and Aronson provide scientific basis from the social sciences for their conjectures. The results are not flattering. For instance, those who endure a harsh initiation into a group will rate it more positively (sometimes wildly so!) than those who have paid a small psychic price for joining. The mind cannot tolerate the idea that it went through a difficult, expensive and/or embarrassing ordeal for nothing. By inflating the group's value, or by reducing the initiation's toll ("Naw, it wasn't so bad!") the mind reduces the dissonance between its self-valuation as competent and the merit of the choice it made.

The place where Tavris and Aronson really hit home is in their discussion of memory. Far from being a static and permanent record of our experiences, memory is far more malleable than we would like it to be. Research has shown that some "memories" are entirely false, based on post-hoc mental reconstructions. The most notable example was of the woman who had fond memories of her father reading her a book -- a book she later was shocked to discover had not been published until after his death! More notorious examples involve psychologists who spearheaded the memory recovery movement. In case after case, these professionals inadvertently created memories of childhood traumas -- including sexual abuse -- that destroyed lives and families. Not surprisingly, the psychologists, faced with the dissonant reality that they had been responsible for so much pain, reacted by justifying their behavior and/or claiming that the victims were still being controlled by their alleged abusers.

"Mistakes Were Made" may seem like a breezy little book about kooky human behavior. But the conclusions can have far-ranging and devastating consequences and can be applied to everything from marriage to politics. In a controversial approach that might turn off some readers, the authors discuss attitudes toward the current American war in Iraq. When 40% of a nation's citizen believe (after much press to the contrary) that Iraq's Saddam Hussein was behind 9/11, the consequences of Cognitive Dissonance Theory become sinister indeed.

If you are honest with yourself, you will be strongly affected by the implications of "Mistakes Were Made," since all of us (even book reviewers) are capable of self-deception. What Tavris and Aronson seem to indicate is the frightening conclusion that this self-deception is not merely a moral issue or a bad habit, but is rooted deep in our neurons -- making it at once difficult to discover on self-reflection (since self-reflection itself can be distorted by self-justification and shifting memory) and difficult to correct.

But there is good news, which I have experienced personally. Having been exposed to the ideas in the book, we can be more mindful of how our own minds can betray us. And mindfulness leads to correction, and then to a more accurate view of ourselves, our neighbors and the truth.
18 people found this helpful.
on September 14, 2007
Renowned social psychologists Carol Tavris and Elliot Aronson have written a truly fascinating book, MISTAKES WERE MADE (BUT NOT BY ME) . . . its subtitle made me want to read it even more: WHY WE JUSTIFY FOOLISH BELIEFS, BAD DECISIONS AND HURTFUL ACTS, because I have long observed this tendency--even in my own life.

The authors make what could be a dry subject come alive by the use of many examples . . . in addition, I liked how they incorporated much research--cited in nearly 40 pages of endnotes--but made it come alive via a lively writing style.

When they explained how our memories tell more about what we believe now than what really happened then, I had to laugh . . . and recall the story of how I once took Risa, my daughter, to my first home . . . from there, I proceeded to take her to my elementary school, which I could have sworn was nearly a mile away . . . in reality, it turned out to be less than two short blocks away!

MISTAKES WERE MADE further shows how couples can break out of the "he said, she said" spiral of blame and defensiveness, and perhaps most importantly, how all of us can learn to own up and let go of the need to be right.

There were many memorable passages in the book; among those that most caught my attention were the following:

* The same DNA that exonerates an innocent person can be used to identify the guilty one, but this rarely happens. Of all the convictions the Innocence Project has succeeded in overturning so far, there is not a single instance in which the police later tried to find the actual perpetrator of the crime. The police and prosecutors just close the books on the case completely, as if to obliterate its silent accusation of the mistake they made.

* De Klerk, who had been elected president in 1989, knew that a violent revolution was all but inevitable. The fight against apartheid was escalating; sanctions imposed by other countries were having a significant impact on the nation's economy; supporters of the banned African National Congress were becoming increasingly violent, killing and torturing people whom they believed were collaborating with the white regime. De Klerk could have tightened the noose by instituting even more repressive policies in the desperate hope of preserving white power. Instead, he revoked the ban on the ANC and freed Mandela from the prison in which he had spent twenty-seven years. For his part, Mandela could have held on to an anger that would have been entirely legitimate. Instead, he relinquished anger for the sake of the goal to which he had devoted his life. "If you want to make peace with your enemy, you have to work with your enemy," said Mandela. "Then he becomes your partner." In 1993, both men shared the Nobel Peace Prize, and the following year Mandela was elected president of South Africa.

* Making mistakes is central to the education of budding scientists and artists of all kinds, who must have the freedom to experiment, try this idea, flop, try another idea, take a risk, be willing to get the wrong answer. One classic example, once taught to American schoolchildren and still on many inspirational Web sites in various versions, is Thomas Edison's reply to his assistant (or to a reporter), who was lamenting Edison's ten thousand experimental failures in his effort to create the first incandescent light bulb. "I have not failed," he told the assistant (or reporter). "I successfully discovered 10,000 elements that don't work." Most American children, however, are denied the freedom to noodle around, experiment, and be wrong in ten ways, let alone ten thousand. The focus on constant testing, which grew out of a reasonable desire to measure and standardize children's accomplishments, has intensified their fear of failure. It is certainly important for children to learn to succeed; but it is just as important for them to learn not to fear failure. When children or adults fear failure, they fear risk. They can't afford to be wrong.

That said, you won't go wrong by reading MISTAKES WERE MADE . . . I was so impressed by it that I now plan to get copies of the book for many of my colleagues at my college, in that they will be able to relate to much of it . . . so will you.
11 people found this helpful.
This is a well written, snappy book that addresses an important issue, best described by the book's title and subtitle: "Mistakes Were Made (but not by me): Why we justify foolish beliefs, bad decisions, and hurtful acts."

The two authors, both well reputed psychologists, use the theory of cognitive dissonance as their starting point. Leon Festinger was one of the major theorists of this approach. The authors of this book simply define the perspective thus (page 13): "Cognitive dissonance is a state of tension that occurs whenever a person holds two cognitions (ideas, attitudes, beliefs, opinions) that are psychologically inconsistent, such as 'Smoking is a dumb thing to do because it could kill me' and 'I smoke two packs a day.'" How does one deal with this? By adopting one of the positions and then downgrading or rejecting the other. The end result is self-justification, self-deception, seeking out evidence to support the choice that we have made while rejecting evidence that does not fit with our choice.

The brain itself shows evidence of the operation of cognitive dissonance. The example on page 19 of functional Magnetic Resonance Imaging (fMRI) and processing information about presidential candidates is telling. The end result is "blind spots," in which people (page 42) "fail to notice vital events and information that might make them question their behavior or their convictions." As such, the authors note that cognitive dissonance makes mincemeat of such theoretical views as rational actor theory and psychoanalytic theory. One result of cognitive dissonance is what is called "confirmation bias," the attending to evidence that supports our views and the rejection/suppression of evidence that does not support our views.

Many examples are advanced to illustrate the case that the authors make. Issues include: moral lapses (e.g., Watergate participants), "made up" memories (raising serious questions about the whole idea of repressed memories), criminal justice system decisions on guilt or innocence, and so on. Much is at stake with cognitive dissonance as it operates.

In the closing chapter, the authors try to indicate how understanding cognitive dissonance might help us to limit the damage that may occur as a result of its operation. Convincing? I'm not so sure, but this discussion does get one thinking about how we might address the harmful side effects of cognitive dissonance.

A readable book that raises important issues. I think that more use of neuroscientific research could have strengthened this book even further. Also, the work by cognitive psychologists like Kahneman and Tversky could have spoken to key points as well. This book might also profitably be read in tandem with another recent book on a similar subject, Cordelia Fine's "A Mind of Its Own." In addition, Linden's "The Accidental Mind" provides a perspective on related issues from a neuroscience viewpoint.
8 people found this helpful.