
Will Your Next Mistake Be Fatal?: Avoiding the Chain of Mistakes That Can Destroy Your Organization Hardcover – October 8, 2004

ISBN-13: 978-0131913646 ISBN-10: 0131913646






Product Details

  • Hardcover: 336 pages
  • Publisher: Pearson Prentice Hall (October 8, 2004)
  • Language: English
  • ISBN-10: 0131913646
  • ISBN-13: 978-0131913646
  • Product Dimensions: 1.1 x 6 x 9.2 inches
  • Shipping Weight: 1.3 pounds
  • Average Customer Review: 4.6 out of 5 stars (17 customer reviews)
  • Amazon Best Sellers Rank: #1,375,970 in Books (See Top 100 in Books)

Editorial Reviews

About the Author

Robert E. Mittelstaedt Jr. is Dean and professor of the W. P. Carey School of Business, Arizona State University, and former Vice Dean and Director of the Aresty Institute of Executive Education, The Wharton School. He has consulted with organizations ranging from IBM to Weirton Steel, and from Pfizer to the U.S. Nuclear Regulatory Commission, and is a member of the boards of directors of three corporations in the electronics and healthcare services businesses.

Mittelstaedt's research interests have included executive learning, corporate governance, IT, and strategy. He formerly directed the Wharton Innovation Center and the Wharton Applied Research Center. Mittelstaedt founded Intellgo, Inc. He served as an officer in the U.S. Navy in nuclear submarines at the height of the Cold War. He is also a licensed commercial pilot with multi-engine and instrument ratings.


© Copyright Pearson Education. All rights reserved.

Excerpt. © Reprinted by permission. All rights reserved.

Introduction: What, me worry?

What do the failure of Enron, the Watergate scandal, Three Mile Island, and most airline crashes have in common? Quite simply, it would be almost impossible to make any of these things happen without a serious sequence of errors that goes unchecked. Whether it is a physical disaster, a political blunder, a corporate misstep, or a strategic mistake, as the investigation unfolds, we always find that it took a unique set of compounding errors to bring the crisis to front-page status.

In many cases these blunders are so complex and the impact so serious that we find ourselves saying, "You couldn't make that happen if you tried." The difference between organizations that end up on the front page of a national newspaper in a negative light, those you never hear about, and those that end up on the front page in a positive light, is the process of "Managing Multiple Mistakes (M3)."

It has long been known that most man-made physical disasters are the result of a series of mistakes. In most cases, if one can find a way to "break the chain," a major catastrophe can be avoided. This recognition of failure chains in operating aircraft, trains, nuclear power plants, chemical plants, and other mechanical devices has led to an emphasis on understanding causes and developing procedures, training, and safety systems to reduce the incidence of accidents and mitigate damage if one does occur. Strangely, there has been little emphasis on extending this process to help avoid business disasters – whether operational or strategic.

Enron, WorldCom, and HealthSouth are now widely known as major business disasters. Enron might even be classified as a major economic disaster, given the number of employees, pensions, and shareholders affected at Enron and at its accountants, Arthur Andersen. As investigations unfolded, we learned that none was the result of a single bad decision or action. Each involved a complicated web of mistakes that were either unnoticed, dismissed as unimportant, judged as minor, or purposely ignored in favor of a high-risk, high-payoff gamble.

This book is about the avoidable traps that we set for ourselves as business people that lead to disasters. It is about what we can learn from the patterns of action or inaction that preceded disasters (sometimes called "accidents") in a variety of business and non-business settings in order to avoid similar traps and patterns of mistakes. This goes beyond kaizen and six-sigma on the factory floor to M3 or "Managing Multiple Mistakes" in the executive suite and all operational levels of companies.

This is not a book about crisis management. It is not about managing public relations, the victims, the lawyers or the shareholders. It is about discipline, culture and learning from the experiences of others that will improve the odds you can avoid the things we label as accidents, disasters or crises altogether. Even if you do not totally avoid such a situation, knowledge of the typical patterns that occur should help you create an organization that is observant enough to intervene early and minimize damage. Learning and implementing the lessons described here will not mean that you throw away your plans for handling problem situations. But it could mean that you will never have to manage the aftermath of an unpleasant situation.

There are lessons to be learned from looking at the mistake patterns and commonalities in other organizations, especially since most organizations do not do a very good job of evaluating their own mistakes where they have the most information. We miss learning opportunities by not being curious enough to look deeply at our own failures, but we miss a very rich set of opportunities when we do not look at the mistakes others have made, especially when they have been well documented. We often miss these opportunities to learn from others because we believe "Their situation was different – we don't have much to learn from them."

The reality is very different: while the specifics may differ across industries and situations, the patterns of mistakes preceding accidents are quite similar. Learning does not always come from the sources you expect, such as your own experience, your own industry, or very similar companies. It takes a bit of extra effort, but you can often learn more by looking at examples in an industry or situation that is markedly different from your own and recognizing that there are great similarities in the patterns of actions and behaviors. This is because, without the burden of a set of assumptions around what you "know" is the right or wrong way to do something, it is easy to observe the salient facts, absent all the distracting details, and quickly say to yourself something like:

  • Didn't they know water would boil if they lowered the pressure? (Three Mile Island)
  • Why did they fail to follow the procedure and fly into the ground? (Korean Air)
  • Didn't they know customers would want a replacement for a defective chip? (Intel)
  • Don't they know that customers are often more loyal if you admit a mistake and fix it? (Firestone)
  • Didn't they know the leverage and/or fraud might kill the company? (Enron, WorldCom, HealthSouth)
  • Didn't NASA learn anything the first time? (Columbia)
  • Why is J&J legendary for its handling of the Tylenol crisis over 20 years ago?
  • How did a United Airlines crew minimize loss of life with a crash landing where "everything" went wrong? (UA-232 at Sioux City, Iowa)

In each case with crises that have adverse outcomes there is a very common pattern:

  • An initial problem, often minor in isolation, that goes uncorrected
  • A subsequent problem that compounds the effect of the initial problem
  • An inept corrective effort
  • Disbelief at the accelerating seriousness of the situation
  • Generally, an attempt to hide the truth about what is going on while an attempt is made at remediation
  • Sudden recognition that the situation is out of control, or "in extremis"1
  • Finally, the ultimate disaster scenario, involving significant loss of life, financial resources, or both, and ultimately the recriminations

We will explore a number of famous and not so famous disasters or near disasters from the perspective of the mistake sequence and where it might have been broken to change the outcome, or was broken to minimize the damage. We will call your attention to the mistakes so that you might think about the signals that were present and how you, in an ideal world, might have acted differently.

The mistakes identified are usually the result of direct action or inaction by humans. In many scenarios the mistake sequence was initiated with equipment malfunctions that were known but not taken into account in decision making. In other situations the mistakes may have been in the design of systems or business procedures that were based on faulty assumptions. Sometimes there were significant uncontrollable initiating or contributing factors, such as equipment failure, a natural weather occurrence or other "act of God." These initiating factors must be considered in decision-making when they are present because while they are not always human in origin, they are a part of the chain of causes that lead to disasters where humans have an opportunity to intervene effectively or ineffectively.

In the past you may have looked at the occurrence of disasters or recovery from near-disasters as a matter of passing interest in the news. We are suggesting that you look a little deeper, learn a little more and stretch a little further for the implications that you can use:

  • Is there a disaster waiting to happen in my organization?
  • Will we see the signs?
  • Will we stop it soon enough?
  • Do we have the skills to see the signals and the culture to "break the chain?"
  • Are we smart enough to realize that it makes economic sense to care about reducing or stopping mistakes?

Learn from the mistakes of others and envision business success without mistakes, because your future may depend on your ability to do just that. To aid in this quest we will identify some "Insights" linking common themes that come out of the study of mistakes across industries and situations. These will appear appropriately in each chapter and be summarized in a broad way again in Chapter 10.

1 A term from the nautical rules of the road indicating that a collision is virtually unavoidable and the most extreme actions possible must be taken to minimize or avoid damage.




Customer Reviews

4.6 out of 5 stars (17 customer reviews)
  • 5 star: 15
  • 4 star: 0
  • 3 star: 0
  • 2 star: 1
  • 1 star: 1
  • John G. Hilliard: "Overall I found the book to be informative and interesting."
  • Charles Ashbacher: "It amazed me to read how in most cases, the correction of even one of the small mistakes in the chain could have led to a minimization or avoidance of the catastrophe."
  • John Matlock: "In each chapter, the author has a section called Insights; they number one through 38 throughout the book."

Most Helpful Customer Reviews

57 of 57 people found the following review helpful. By Thomas Duff on October 25, 2004
Format: Hardcover
As I was part of one of the biggest corporate mistakes in U.S. history, I thought it would be interesting to read Will Your Next Mistake Be Fatal? by Robert E. Mittelstaedt, Jr. I wasn't disappointed... This should be required reading in all organizations.

Chapter list: The Power of M3 and the Need to Understand Mistakes; Execution Mistakes; Execution Mistakes and Successes as Catalysts for Change; Strategy - How Do You Know It's a Mistake?; Physical Disasters with Cultural Foundations and Business Implications; Cultures that Create "Accidents"; Mistakes as Catalysts for Cultural Change; Economics at Work: Watching Entire Industries Lose It; Mistakes Aren't Just for Big Companies - Small Company Chains; Making M3 Part of Your Culture for Success; Summary of Insights; References; Index

I started working for Enron Broadband back in 1998 when it first got going. I was laid off on September 1st, 2001 when the Portland office closed. Little did any of us know that the entire company would melt down just three months later. Did they plan for that to be the culmination of all their actions? No, but no one in power saw the signs and had enough courage to stop the string of mistakes that ultimately doomed the company. Mittelstaedt uses Enron and numerous other companies to show how a culture of tolerating mistakes can lead a company to the brink of disaster (and many times right over the edge). But instead of just concentrating on "when bad things happen to good companies", he also covers how strong leadership can allow a company to survive and prosper in adverse conditions (like J&J's handling of the Tylenol tampering case). There's a lot of material here that is excellent reading and should cause organizations to ponder their ways.
66 of 67 people found the following review helpful. By John G. Hilliard on May 24, 2005
Format: Hardcover
The one comforting realization this author brings out in his book is that significant or catastrophic events are usually caused by a long trail of mistakes. Thus, unlike, say, a car crash that can be caused by one simple mistake, the business version of a car crash is preventable if you can identify the mistakes. The author calls this the process of managing multiple mistakes. If you can find a way to break a chain of mistakes, a major issue can be avoided. This book attempts to educate the reader about the avoidable traps that business people tend to set for themselves. The author details many of the higher-profile blunders of the past 30 years to show the reader the patterns and to act as case studies. It attempts to help the reader learn from the experiences of others to improve the odds that the reader can also avoid disasters.

The author believes that there are lessons to be learned from looking at the mistake patterns and common themes that have taken place in other organizations. He believes that most organizations, and people for that matter, lack the drive to truly investigate and examine the mistakes they make individually and in business, and thus learning opportunities are lost. By looking at others' mistakes, then, we can at least learn something. Overall I found the book to be informative and interesting. It was sort of like watching a series of train wrecks in slow motion. It also provides an interesting way to view behavior to help the reader identify the chain of mistakes before becoming one of the footnotes of history. The book is well worth your time.
40 of 41 people found the following review helpful. By Rolf Dobelli on July 14, 2005
Format: Hardcover
Robert Mittelstaedt has written a rare, commendable book. He manages to address a significant business topic - the phenomenon of major corporate blunders - in an original, insightful and entertaining way. In this intriguing volume, he cites case after fascinating case where a series of seemingly small errors went uncorrected until a whole house of cards marked "faulty assumptions" came crashing down. The biggest mistake you can make is assuming that a fatal blunder just couldn't happen in your organization, and the second biggest is ignoring the warning signs that disaster is just ahead. The key, Mittelstaedt advises, is to learn to admit that something has gone wrong before the situation spirals out of control. We encourage every thoughtful business professional to read this substantive contribution to the field of risk management and disaster prevention.
10 of 10 people found the following review helpful. By Charles Ashbacher on December 14, 2005
Format: Hardcover
I think so highly of this book that I listed it as the best book of the year in my annual "Best Books of the Year" column that I write for the online "Journal of Object Technology" (JOT). JOT is a journal written for people who use object-oriented technologies to teach computer science and develop large software projects. The process of creating a program that has millions of lines of source code is a very complex one; there are millions of ways you can fail and only a few ways that you can succeed. The main point is that catastrophic mistakes are generally not due to one big error, but are the consequence of a series of small errors that reinforce each other.

Major historical failures such as the sinking of the Titanic, the meltdown at Three Mile Island, plane crashes, and business failures such as the collapse of Enron and the associated destruction of Arthur Andersen are all examined in detail. The failures can generally be placed in one of the following categories:

  • Failure to question an authority figure.
  • Not trusting hard data.
  • Trusting "hard" data too much.
  • Believing only the data that agrees with your beliefs.
  • Tolerating violations of proven procedure.
  • Lack of adequate training and experience.
  • Arrogance or a feeling of infallibility.

It amazed me to read how in most cases, the correction of even one of the small mistakes in the chain could have led to a minimization or avoidance of the catastrophe.

There are also examples of successful recoveries from catastrophic events. In 1982, several people died due to tampering with Tylenol after it left the factories.
