Robert E. Mittelstaedt Jr. is Dean and professor of the W. P. Carey School of Business, Arizona State University, and former Vice Dean and Director of the Aresty Institute of Executive Education, The Wharton School. He has consulted with organizations ranging from IBM to Weirton Steel, Pfizer to the U.S. Nuclear Regulatory Commission, and is a member of the board of directors of three corporations in the electronics and healthcare services businesses.
Mittelstaedt's research interests have included executive learning, corporate governance, IT, and strategy. He formerly directed the Wharton Innovation Center and the Wharton Applied Research Center. Mittelstaedt founded Intellgo, Inc. He served as an officer in the U.S. Navy in nuclear submarines at the height of the Cold War. He is also a licensed commercial pilot with multi-engine and instrument ratings.
© Copyright Pearson Education. All rights reserved.
What do the failure of Enron, the Watergate scandal, Three Mile Island, and most airline crashes have in common? Quite simply, it would be almost impossible to make any of these things happen without a serious sequence of errors that goes unchecked. Whether it is a physical disaster, a political blunder, a corporate misstep, or a strategic mistake, as the investigation unfolds, we always find that it took a unique set of compounding errors to bring the crisis to front-page status.
In many cases these blunders are so complex and the impact so serious that we find ourselves saying, "You couldn't make that happen if you tried." The difference between organizations that end up on the front page of a national newspaper in a negative light, those you never hear about, and those that end up on the front page in a positive light, is the process of "Managing Multiple Mistakes (M3)."
It has long been known that most man-made physical disasters are the result of a series of mistakes. In most cases, if one can find a way to "break the chain" a major catastrophe can be avoided. This recognition of failure chains in operating aircraft, trains, nuclear power plants, chemical plants, and other mechanical devices has led to an emphasis on understanding causes and developing procedures, training and safety systems to reduce the incidence of accidents and mitigate damage if one does occur. Strangely, there has been little emphasis on extending this process to help avoid business disasters whether operational or strategic.
Enron, WorldCom, and HealthSouth are now widely known as major business disasters. Enron might even be classified as a major economic disaster, given the number of employees, pensions, and shareholders affected at both Enron and its accounting firm, Arthur Andersen. As investigations unfolded, we learned that none was the result of a single bad decision or action. Each involved a complicated web of mistakes that were either unnoticed, dismissed as unimportant, judged as minor, or purposely ignored in favor of a high-risk, high-payoff gamble.
This book is about the avoidable traps that we set for ourselves as business people that lead to disasters. It is about what we can learn from the patterns of action or inaction that preceded disasters (sometimes called "accidents") in a variety of business and non-business settings in order to avoid similar traps and patterns of mistakes. This goes beyond kaizen and six-sigma on the factory floor to M3 or "Managing Multiple Mistakes" in the executive suite and all operational levels of companies.
This is not a book about crisis management. It is not about managing public relations, the victims, the lawyers or the shareholders. It is about discipline, culture and learning from the experiences of others that will improve the odds you can avoid the things we label as accidents, disasters or crises altogether. Even if you do not totally avoid such a situation, knowledge of the typical patterns that occur should help you create an organization that is observant enough to intervene early and minimize damage. Learning and implementing the lessons described here will not mean that you throw away your plans for handling problem situations. But it could mean that you will never have to manage the aftermath of an unpleasant situation.
There are lessons to be learned from looking at the mistake patterns and commonalities in other organizations, especially since most organizations do not do a very good job of evaluating their own mistakes, where they have the most information. We miss learning opportunities by not being curious enough to look deeply at our own failures, but we miss a very rich set of opportunities when we do not look at the mistakes others have made, especially when they have been well documented. We often miss these opportunities to learn from others because we believe, "Their situation was different; we don't have much to learn from them."
The reality is very different: study shows that while the specifics may differ across industries and situations, the patterns of mistakes preceding accidents are quite similar. Learning does not always come from the sources you expect, such as your own experience, your own industry, or very similar companies. It takes a bit of extra effort, but you can often learn more by looking at examples in an industry or situation that is markedly different from your own and recognizing that there are great similarities in the patterns of actions and behaviors. Without the burden of a set of assumptions about what you "know" is the right or wrong way to do something, it is easy to observe the salient facts, absent all the distracting details, and quickly say to yourself something like:
In each crisis with an adverse outcome, there is a very common pattern:
We will explore a number of famous and not so famous disasters or near disasters from the perspective of the mistake sequence and where it might have been broken to change the outcome, or was broken to minimize the damage. We will call your attention to the mistakes so that you might think about the signals that were present and how you, in an ideal world, might have acted differently.
The mistakes identified are usually the result of direct action or inaction by humans. In many scenarios the mistake sequence was initiated by equipment malfunctions that were known but not taken into account in decision making. In other situations the mistakes lay in the design of systems or business procedures that were based on faulty assumptions. Sometimes there were significant uncontrollable initiating or contributing factors, such as an equipment failure, a natural weather occurrence, or other "act of God." These initiating factors must be considered in decision making when they are present because, while they are not always human in origin, they are part of the chain of causes leading to disaster, where humans have an opportunity to intervene effectively or ineffectively.
In the past you may have looked at the occurrence of disasters or recovery from near-disasters as a matter of passing interest in the news. We are suggesting that you look a little deeper, learn a little more and stretch a little further for the implications that you can use:
Learn from the mistakes of others and envision business success without mistakes, because your future may depend on your ability to do just that. To aid in this quest, we will identify "Insights" linking common themes that emerge from the study of mistakes across industries and situations. These will appear in each chapter and be summarized in a broad way again in Chapter 10.
1 A term from the nautical rules of the road indicating that a collision is virtually unavoidable and the most extreme actions possible must be taken to minimize or avoid damage.
"This is the book that I would have liked to write. It gets to the heart of what makes the difference between success and failure." (Carolyn Thornlow, September 21, 2006)

"It may be one of the most overlooked but great business books. A likely explanation is that Wharton School Publishing is far less famous than HBR. Who knows!" (ServantofGod, September 8, 2006)

"Another valuable source book for organizational (and personal) success from the Wharton School. Every disaster is the result of a mistake or a series of mistakes that..."

"Win or lose is a stark concept. At one time, many acted as if there was enough to go around. Yet, in today's worldwide competitive marketplace that is no longer the case." (Craig L. Howe, January 10, 2006)

"If you think 'fatal' is hyperbolic, consider these statistics which Michael Gerber shares in E-Myth Mastery: 'Of the 1 million U.S." (Robert Morris, January 6, 2006)

"'This is not a book about crisis management. It is not about managing public relations, the victims, the lawyers, or the shareholders.'" (Turgay BUGDACIGIL, December 4, 2005)

"When a business fails, a plane crashes, or some other catastrophic event occurs, it is rarely the result of a single mistake." (Harold McFarland, November 23, 2005)

"In 1980, one year after the Three Mile Island nuclear power plant disaster, Robert E. Mittelstaedt, Jr. served as a consultant to the Nuclear Regulatory Commission." (Wharton School Book Reports, October 22, 2005)

"The book was disappointing considering the author's credentials. If you want to know details about why the Titanic sank, airline crashes, Three Mile Island, etc., the book is fine." (Alex Z, July 10, 2005)