The Logic of Failure: Recognizing and Avoiding Error in Complex Situations (Revised Edition)
Top customer reviews
Complex systems are everywhere in the real world: interconnected networks with time delays, buffering units, hidden keystone variables, and unclear indicators. Unfortunately, human minds tend to think linearly and concretely. Dörner documents several pathological thinking styles he encounters in his experiments. Some people over-correct, making dramatic changes while chasing an indicator whose self-induced oscillations drown out any real data. Some get lost chasing irrelevant details, asking for more information rather than acting. And some get trapped in methodism, following a predetermined course of action in complete disregard of the information coming in.
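To make the over-correction trap concrete, here is a minimal sketch of my own (not code or data from the book): a controller keeps pushing a variable toward a target but only sees readings that arrive a few steps late. The gain values, delay, and target are illustrative assumptions, not Dörner's experimental parameters.

```python
# Sketch: proportional corrections applied to a time-delayed reading.
# Aggressive corrections chase the stale signal and oscillate with growing
# amplitude; small, patient corrections converge toward the target.
# All parameter values below are illustrative assumptions.

def simulate(gain, delay=3, steps=30, target=100.0):
    """Drive a variable toward `target` using corrections based on a delayed reading."""
    history = [0.0] * (delay + 1)   # readings arrive `delay` steps late
    value = 0.0
    trace = []
    for _ in range(steps):
        observed = history[0]               # stale reading of the variable
        correction = gain * (target - observed)
        value += correction                 # act on the stale information
        history = history[1:] + [value]     # current value enters the delay line
        trace.append(value)
    return trace

print([round(v) for v in simulate(gain=0.9)][-5:])   # over-correction: wild swings
print([round(v) for v in simulate(gain=0.2)][-5:])   # small steps: settles near 100
```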
Against this, Dörner advocates having a clear mental model of the system, discrete objectives, and a holistic sense of possible higher-order effects. Make small changes, seek steady states, and do not try to race a chaotic system. He points towards 'wisdom' with maddening vagueness. If there's a major problem with this book, it's that it has been overtaken by the zeitgeist. Dörner's methods are now children's toys rather than cutting-edge science. We all 'get' networks and complexity, but we still lack the language to truly understand them.
A strength of this work is that Dörner takes examples from so many areas, including his own computer simulations, which shows the near-universal applicability of his concepts. One of Dörner's main themes is the failure to think in temporal configurations (page 198): humans are good at dealing with problems they currently have, but avoid dealing with, and tend to ignore, problems they don't yet have (page 189). Potential outcomes of decisions are not foreseen, sometimes with tragic consequences. In one computer simulation (page 18) Dörner had a group of hypereducated academics attempt to manage farmland in Africa: they failed miserably. In this experiment Dörner's observations of the decision makers revealed that they had "acted without prior analysis of the situation; failed to anticipate side effects and long-term repercussions; assumed the absence of immediately negative effects meant that correct measures had been taken; and let overinvolvement in 'projects' blind them to emerging needs and changes in the situation." (How many governmental bodies the world over does this remind you of?)
I am a safety professional, and am especially interested in time-critical decision making skills. Dörner's treatment of the Chernobyl accident is the most insightful summation I have seen. He makes the point that the entire accident was due to human failings, and points out the lack of risk analysis (and the managerial pressure), the fundamental lack of appreciation for the reactivity instability at low power levels, and, more importantly, how grossly the operators underestimated the danger posed by changes in production levels (page 30). Dörner's grasp here meshes the psychology and engineering disciplines (engineers like stasis; any change in reactivity increases hazards). Another vital point Dörner makes is that the Chernobyl operators knowingly violated safety regulations, but that such violations are normally positively reinforced, i.e. you normally "get away with it" (page 31). The discussion of operating techniques on pages 33 and 34 is insightful: the operators were running the number four reactor at Chernobyl intuitively rather than analytically. While there is room for experiential decision making in complex systems, analysis of future potential problems is vital.
In most complex situations the nature of the problems is intransparent (page 37): not all the information we would like to see is available. Dörner's explanation of the interactions between complexity, intransparence, internal dynamics (and developmental tendencies), and incomplete (or incorrect) understanding of the system involved exposes many potential pitfalls in dynamic decision making. One of the most important decision making criteria Dörner discusses is the importance of setting well-defined goals. He is especially critical of negative goal setting (the intention to avoid something) and has chosen a perfect illustrative quote from Georg Christoph Lichtenberg on page 50: "Whether things will be better if they are different I do not know, but that they will have to be different if they are to become better, that I do know." A bigger problem regarding goals occurs when "we don't even know that we don't understand," a situation that is alarmingly common in upper management charged with supervising technical matters (page 60).
Fortunately Dörner does have some practical solutions to these problems, most of them in chapter six, "Planning." One of the basics (page 154) is the three-step model behind any planning decision (a condition element, an action element, and a result element) and how such steps fit into large, dynamic systems. This is extremely well formulated and should be required reading for every politician and engineer. These concepts are discussed in conjunction with "reverse planning" (page 155), in which plans are contrived backwards from the goal. I have always found this a very useful method of planning or design, but Dörner finds that it is rare.

Dörner argues that in extremely complex systems (Apollo 13 is a perfect example) intermediate goals are sometimes required because the decision trees are enormous. This sometimes relies on history and analogies (what has happened in similar situations before), but it may be required to stabilize a situation so that further critical actions become possible. This leads back to the quote that titles this review: 'adaptability of thought' (my term) is vital to actions taken in extremely complex situations. Rigid operating procedures and historical precedents may not always work: a full understanding of the choices being made is vital, although no one person is likely to have this understanding. For this reason Dörner recommends a "redundancy of potential command" (page 161), which is to say a group of highly trained leaders able to carry out leadership tasks within their areas of specialty (again, NASA during Apollo 13), reporting within a clear leadership structure that values their input. Dörner then points out that nonexperts may hold key answers (page 168), though he notes that experts should be in charge, as they best understand the thought processes applicable in a given scenario (pages 190-193). This ultimately argues for more oversight by technicians and less by politicians: I believe (and I am guessing Dörner would concur) that we need more inter- and intra-industry safety monitoring, and fewer congressional investigations and grandstanding.
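As a small illustration of what the three-part planning element and "reverse planning" look like in practice, here is a sketch of my own (not code from the book): each step is a (condition, action, result) triple, and the plan is chained backwards from the goal until a condition we already satisfy is reached. The step names are invented, loosely echoing the Apollo 13 example above.

```python
# Sketch: Dörner-style planning steps as (condition, action, result) triples,
# assembled by "reverse planning" from the goal back to the current state.
# The states and actions below are hypothetical, for illustration only.

STEPS = [
    # (condition required, action to take, result produced)
    ("power restored",   "restart cooling pumps",  "core temperature stable"),
    ("batteries online", "switch to backup bus",   "power restored"),
    ("crew in position", "bring batteries online", "batteries online"),
]

def plan_backwards(goal, current_state, steps):
    """Chain (condition, action, result) triples backwards from the goal."""
    plan = []
    needed = goal
    while needed != current_state:
        match = next((s for s in steps if s[2] == needed), None)
        if match is None:
            raise ValueError(f"no step produces '{needed}'")
        plan.append(match[1])       # remember the action
        needed = match[0]           # its condition becomes the new subgoal
    return list(reversed(plan))     # read the plan forwards

print(plan_backwards("core temperature stable", "crew in position", STEPS))
# ['bring batteries online', 'switch to backup bus', 'restart cooling pumps']
```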
This is a superb book; I recommend it highly to any safety professional as mandatory reading, and to the general public for an interesting discussion of decision making skills.