The Logic Of Failure: Recognizing And Avoiding Error In Complex Situations Paperback – August 4, 1997
What do customers buy after viewing this item?
- Lowest price in this set of products: Sources of Power, 20th Anniversary Edition: How People Make Decisions (The MIT Press), Paperback, $16.42 shipping
- Highest rated in this set of products: Leaders Eat Last: Why Some Teams Pull Together and Others Don't, Paperback, $15.68 shipping
- This item: The Logic Of Failure: Recognizing And Avoiding Error In Complex Situations, Dietrich Dörner, Paperback, $15.47 shipping. Only 3 left in stock (more on the way).
- Red Team: How to Succeed By Thinking Like the Enemy, Hardcover, $16.42 shipping. Only 7 left in stock (more on the way).
- The Myth of the Strong Leader: Political Leadership in the Modern Age, Hardcover, $17.36 shipping. Get it as soon as Monday, Jul 24. Only 1 left in stock - order soon.
Product details
- Publisher : Basic Books; Revised ed. edition (August 4, 1997)
- Language : English
- Paperback : 240 pages
- ISBN-10 : 0201479486
- ISBN-13 : 978-0201479485
- Item Weight : 9.5 ounces
- Dimensions : 9.42 x 6.04 x 0.61 inches
- Best Sellers Rank: #88,039 in Books
- #248 in Decision-Making & Problem Solving
- #298 in Cognitive Psychology (Books)
- #752 in Business Management (Books)
Customer reviews
Top reviews from the United States
A strength of this work is that Dörner takes examples from so many areas including his own computer simulations which show the near-universal applicability of his concepts. One of Dörner's main themes is the failure to think in temporal configurations (page 198): in other words, humans are good at dealing with problems they currently have, but avoid dealing with and tend to ignore problems they don't have (page 189): potential outcomes of decisions are not foreseen, sometimes with tragic consequences. In one computer simulation (page 18) Dörner had a group of hypereducated academics attempt to manage farmland in Africa: they failed miserably. In this experiment Dörner made observations about the decision makers which revealed that they had: "acted without prior analysis of the situation; failed to anticipate side effects and long-term repercussions; assumed the absence of immediately negative effects meant that correct measures had been taken; and let overinvolvement in 'projects' blind them to emerging needs and changes in the situation." (How many governmental bodies the world over does this remind you of?)
I am a safety professional, and am especially interested in time-critical decision making skills. Dörner's treatment of the Chernobyl accident is the most insightful summation I have seen. He makes the point that the entire accident was due to human failings, and points out the lack of risk analysis (and managerial pressure) and fundamental lack of appreciation for the reactivity instability at low power levels (and more importantly how operators grossly underestimated the danger that changes in production levels made, page 30.) Dörner's grasp here meshes the psychology and engineering disciplines (engineers like stasis; any change in reactivity increases hazards.) Another vital point Dörner makes is that the Chernobyl operators knowingly violated safety regulations, but that violations are normally positively reinforced (i.e. you normally "get away with it," page 31.) The discussion about operating techniques on pages 33 and 34 is insightful: the operators were operating the Chernobyl Four reactor intuitively and not analytically. While there is room for experiential decision making in complex systems, analysis of future potential problems is vital.
In most complex situations the nature of the problems is intransparent (page 37): not all the information we would like to see is available. Dörner's explanation of the interactions between complexity, intransparence, internal dynamics (and developmental tendencies), and incomplete (or incorrect) understanding of the system involved shows many potential pitfalls in dynamic decision making. One of the most important decision making criteria Dörner discusses is setting well defined goals. He is especially critical of negative goal setting (the intention to avoid something) and has chosen a perfect illustrative quote from Georg Christoph Lichtenberg on page 50: "Whether things will be better if they are different I do not know, but that they will have to be different if they are to become better, that I do know." A bigger problem regarding goals occurs when "we don't even know that we don't understand," a situation that is alarmingly common in upper management charged with supervising technical matters (page 60).
Fortunately Dörner does have some practical solutions to these problems, most in chapter six, "Planning." One of the basics (page 154) is the three step model in any planning decision (condition element, action element, and result element) and how these steps fit into large, dynamic systems. This is extremely well formulated and should be required reading for every politician and engineer. These concepts are discussed in conjunction with "reverse planning" (page 155), in which plans are contrived backwards from the goal (a toy sketch of this backward chaining appears after this review). I have always found this a very useful method of planning or design, but Dörner finds that it is rare. Dörner argues that in extremely complex systems (Apollo 13 is a perfect example) intermediate goals are sometimes required because decision trees are enormous. This sometimes relies on history and analogies (what has happened in similar situations before), but it may be required to stabilize a situation to enable further critical actions. This leads back to the quote that titles this review: 'adaptability of thought' (my term) is vital to actions taken in extremely complex situations. Rigid operating procedures and historical analogies may not always work: a full understanding of the choices being made is vital, although no one person is likely to have this understanding; for this reason Dörner recommends there be a "redundancy of potential command" (page 161), which is to say a group of highly trained leaders able to carry out leadership tasks within their areas of specialty (again, NASA during Apollo 13), reporting within a clear leadership structure which values their input. Dörner then points out that nonexperts may hold key answers (page 168), though he notes that experts should be in charge as they best understand the thought processes applicable in a given scenario (pages 190-193). This ultimately argues for more oversight by technicians and less by politicians: I believe (and I am guessing Dörner would concur) that we need more inter- and intra-industry safety monitoring, and fewer congressional investigations and grandstanding.
This is a superb book; I recommend it highly to any safety professional as mandatory reading, and to the general public for an interesting discussion of decision making skills.
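The condition/action/result triple and the "reverse planning" highlighted in the review above can be pictured as a tiny backward-chaining routine. The sketch below is my own toy construction, not Dörner's formalism; the step names are invented and only loosely echo the review's Apollo 13 example.

```python
# Toy sketch of "reverse planning" over (condition, action, result) steps:
# walk backwards from the goal, repeatedly asking which action produces the
# condition currently needed. Step names are made up for illustration.

STEPS = [
    # (condition element,  action element,          result element)
    ("power restored",     "run CO2 scrubber swap", "breathable air"),
    ("crew in LEM",        "power up LEM systems",  "power restored"),
    ("docking complete",   "move crew to LEM",      "crew in LEM"),
]

def reverse_plan(goal, current_state):
    """Chain backwards from the goal to a condition we already satisfy."""
    plan, needed = [], goal
    while needed != current_state:
        step = next((s for s in STEPS if s[2] == needed), None)
        if step is None:
            raise ValueError(f"no known action produces: {needed}")
        condition, action, _result = step
        plan.append(action)
        needed = condition            # the condition becomes the new subgoal
    return list(reversed(plan))       # forward order of execution

print(reverse_plan(goal="breathable air", current_state="docking complete"))
# -> ['move crew to LEM', 'power up LEM systems', 'run CO2 scrubber swap']
```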
As a point of comparison to my own decision making, as well as that of those around me, this book has been very insightful.
Complex systems (interconnected networks with time delays, buffering units, hidden keystone variables, and unclear indicators) are everywhere in the real world. Unfortunately, human minds tend to think linearly and concretely. Dorner documents several pathological thinking styles he encounters in his experiments. Some people over-correct, making dramatic changes while chasing a pointer whose induced oscillations drowned out any real data (a toy version of this feedback loop is sketched after this review). Some people get lost chasing irrelevant details, asking for more information rather than acting. And some people get trapped in methodism, following a predetermined course of action in complete disregard of the information coming in.
Against this, Dorner advocates for having a clear mental model of a system, discrete objectives, and a holistic sense of possible higher-order effects. Make small changes, seek steady states, and do not try and race a chaotic system. He points towards 'wisdom' with maddening vagueness. If there's a major problem with this book, it's that it's been overtaken by the zeitgeist. Dorner's methods are now children's toys rather than cutting edge science. We all 'get' networks and complexity, but we still lack the language to truly understand them.
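The over-correction pattern described in the review above is easy to reproduce in a delayed-feedback toy model. The sketch below is an illustrative construction, not one of Dörner's experiments; the gains, delay, and target values are arbitrary assumptions.

```python
# Toy feedback loop (illustrative numbers only): we steer a quantity toward a
# target but only see a delayed measurement. A gentle correction settles;
# an aggressive one produces ever-wider swings.

def run(gain, steps=40, delay=3, target=100.0, start=60.0):
    history = [start] * (delay + 1)        # past values of the quantity
    level = start
    for _ in range(steps):
        observed = history[-(delay + 1)]   # we react to stale information
        level += gain * (target - observed)
        history.append(level)
    return history

for gain in (0.2, 0.9):                    # modest vs. over-correcting
    trace = run(gain)
    print(f"gain={gain}: last levels {[round(x) for x in trace[-5:]]}")
```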
Top reviews from other countries
In one of the simulation examples, the measure "there is a water shortage, so dig wells" produces the ironic result that water is pumped out faster than the source can recharge, and the temporary abundance of water makes the eventual crisis all the more severe (a toy sketch of this dynamic follows this review).
It drives home that it is important to look at how everything is interrelated, and that a simple stopgap measure can bring about a situation that is beyond control.
I think the author is right when he says, in effect, "You could dismiss this as merely the result of a simulation game, but is it not the same as what is actually happening in Africa?"
An epidemic spreads in Africa → medical teams are dispatched and achieve results → a population explosion follows → civil war breaks out over food → health conditions worsen → back to the beginning.
Who sets such loops in motion, and how can they be prevented? This book offers hints, so I encourage you to read it.
Incidentally, Chernobyl was apparently a man-made disaster caused by Soviet experts who, under time constraints, mishandled the controls while trying to carry out an experiment ordered from Moscow. Under pressure, even experts make the same mistakes as laymen.
Perhaps because it is a translation from German, the English is very easy. There are many graphs of simulation results; there are that many different experiments, and through them the book brings out the psychological factors that produce crises. I think it is a valuable book for business as well.
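The well-digging dynamic described at the start of the review above lends itself to a tiny stock-and-flow sketch. The model below uses invented rates and is not the book's actual simulation; it only illustrates how extraction above the recharge rate turns a temporary surplus into a deeper crisis.

```python
# Toy stock-and-flow sketch (made-up numbers, not the book's simulation):
# pumping groundwater faster than it recharges buys a short boom in the herd
# it waters, then a crash when the reservoir can no longer keep up.

def simulate(wells, years=30, groundwater=1000.0, recharge=50.0,
             pump_per_well=20.0, herd=100.0):
    for year in range(years):
        pumped = min(groundwater, wells * pump_per_well)
        groundwater += recharge - pumped
        # the herd grows while water is plentiful, shrinks when it is scarce
        herd *= 1.10 if pumped >= 0.5 * herd else 0.70
        if year % 5 == 0:
            print(f"wells={wells}  year={year:2d}  "
                  f"groundwater={groundwater:7.1f}  herd={herd:6.1f}")
    print()

simulate(wells=2)   # pumping 40/yr stays below recharge 50/yr:
                    # the herd hovers near what the water supports
simulate(wells=5)   # pumping 100/yr exceeds recharge: boom, then collapse
```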
That said, it should be noted that the book ranges from trivialities, or very familiar material of the kind found in popularizers like Peter Senge, to flashes of brilliance, as when the author describes the mechanisms that lead us into error when diagnosing systems, or our differing capacity to deal with matters of space versus matters of time.
The book has a multitude of gems scattered throughout, and their presence alone would be enough to justify buying it. At the same time, anyone familiar with systems models will find much that is thoroughly well known, and may even quarrel with the merits of simulation, using an argument the author himself gives on page 88:
"If we have no idea how the variables in a system influence one another, we cannot take these influences into account".
This is precisely the weak point of simulation: it is useful for forecasting the development of variables whose relationships are already known. If those relationships are not known, which is often the case in complex systems, process simulation is not much help.