Customer Reviews


72 Reviews
5 star: (42)
4 star: (18)
3 star: (8)
2 star: (2)
1 star: (2)


116 of 121 people found the following review helpful
5.0 out of 5 stars What makes people poor problem solvers?, September 23, 2002
By Ronald Scheer
Dietrich Dörner is an authority on cognitive behavior and a psychology professor at the University of Bamberg, Germany. His research shows that our habits as problem solvers are typically counterproductive.

Probably our main shortcoming is that we like to oversimplify problems. Dörner offers a long list of self-defeating behaviors, but common to all of them is our reluctance to see that any problem is part of a whole system of interacting factors. Any problem is much more complex than we like to believe. And failure doesn't have to come from incompetence. The operators of the Chernobyl reactor, as Dörner points out, were "experts." And as experts, they ignored safety standards because they "knew what they were doing."

Dörner identifies four habits of mind and characteristics of thought that account for the frequency of our failures:
1. The slowness of our thinking: we streamline the process of problem solving to save time and energy.
2. Our wish to feel confident and competent in our problem-solving abilities: we try to repeat past successes.
3. Our inability to quickly absorb and retain large amounts of information: we prefer static mental models, which cannot capture a dynamic, ever-changing process.
4. Our tendency to focus on immediately pressing problems: we ignore the problems our solutions will create.

Successful problem solving is so complex that there are no hard-and-fast rules that work all the time. The best take-away from the book (and this is my favorite quote): "An individual's reality model can be right or wrong, complete or incomplete. As a rule it will be both incomplete and wrong, and one would do well to keep that probability in mind." The book is 199 easy-to-read pages, and Dörner gives lots of interesting examples from lab tests illustrating people's actual behavior in problem-solving situations.
It's a thought-provoking book for anyone whose job is to tackle complex problems. In one way or another that includes anyone in just about any profession.


34 of 34 people found the following review helpful
5.0 out of 5 stars "On S'engage Et Puis On Voit!", December 21, 2006
Verified Purchase
Napoleon said "On s'engage et puis on voit!" Loosely translated, that means "One jumps into the fray, then figures out what to do next," a common human approach to planning. This discussion (page 161) takes on the adaptability of thought and cautions decision makers about the risks of overplanning in a dynamic, multivariate system. Using examples from Napoleon as well as more concrete ones, such as the quotation about soccer strategy (also on page 161), Dietrich Dörner, the brilliant German behavioral psychologist (University of Bamberg), has created a masterwork on decision-making skills in complex systems. I find it highly complementary to Charles Perrow's work, and I also highly recommend Perrow's equally brilliant "Normal Accidents."

A strength of this work is that Dörner takes examples from so many areas including his own computer simulations which show the near-universal applicability of his concepts. One of Dörner's main themes is the failure to think in temporal configurations (page 198): in other words, humans are good at dealing with problems they currently have, but avoid dealing with and tend to ignore problems they don't have (page 189): potential outcomes of decisions are not foreseen, sometimes with tragic consequences. In one computer simulation (page 18) Dörner had a group of hypereducated academics attempt to manage farmland in Africa: they failed miserably. In this experiment Dörner made observations about the decision makers which revealed that they had: "acted without prior analysis of the situation; failed to anticipate side effects and long-term repercussions; assumed the absence of immediately negative effects meant that correct measures had been taken; and let overinvolvement in 'projects' blind them to emerging needs and changes in the situation." (How many governmental bodies the world over does this remind you of?)

I am a safety professional and am especially interested in time-critical decision-making skills. Dörner's treatment of the Chernobyl accident is the most insightful summation I have seen. He makes the point that the entire accident was due to human failings, and points out the lack of risk analysis (and the managerial pressure) and the fundamental lack of appreciation for the reactivity instability at low power levels (and, more importantly, how grossly the operators underestimated the danger posed by changes in production levels; page 30). Dörner's grasp here meshes the psychology and engineering disciplines (engineers like stasis; any change in reactivity increases hazards). Another vital point Dörner makes is that the Chernobyl operators knowingly violated safety regulations, but that violations are normally positively reinforced (i.e., you normally "get away with it"; page 31). The discussion about operating techniques on pages 33 and 34 is insightful: the operators were running the Chernobyl Four reactor intuitively, not analytically. While there is room for experiential decision making in complex systems, analysis of future potential problems is vital.

In most complex situations the nature of the problems is intransparent (page 37): not all the information we would like to see is available. Dörner's explanation of the interactions among complexity, intransparence, internal dynamics (and developmental tendencies), and incomplete (or incorrect) understanding of the system involved shows many potential pitfalls in dynamic decision making. One of the most important decision-making criteria Dörner discusses is the importance of setting well-defined goals. He is especially critical of negative goal setting (the intention to avoid something) and has chosen a perfect illustrative quote from Georg Christoph Lichtenberg on page 50: "Whether things will be better if they are different I do not know, but that they will have to be different if they are to become better, that I do know." A bigger problem regarding goals occurs when "we don't even know that we don't understand," a situation that is alarmingly common in upper management charged with supervising technical matters (page 60).

Fortunately, Dörner does have some practical solutions to these problems, most of them in chapter six, "Planning." One of the basics (page 154) is the three-step model in any planning decision (condition element, action element, and result element) and how the steps fit into large, dynamic systems. This is extremely well formulated and should be required reading for every politician and engineer. These concepts are discussed in conjunction with "reverse planning" (page 155), in which plans are contrived backwards from the goal; a small sketch of the idea follows below. I have always found this a very useful method of planning or design, but Dörner finds that it is rare.

Dörner argues that in extremely complex systems (Apollo 13 is a perfect example) intermediate goals are sometimes required, as the decision trees are enormous. This sometimes relies on history and analogies (what has happened in similar situations before), but it may be required to stabilize a situation and enable further critical actions. This leads back to the quote that titles this review: 'adaptability of thought' (my term) is vital to actions taken in extremely complex situations. Rigid operating procedures and historical parallels may not always work: a full understanding of the choices being made is vital, although no one person is likely to have this understanding. For this reason Dörner recommends there be a "redundancy of potential command" (page 161), which is to say a group of highly trained leaders able to carry out leadership tasks within their areas of specialty (again, NASA during Apollo 13), reporting within a clear leadership structure that values their input. Dörner then points out that nonexperts may hold key answers (page 168), though he notes that experts should be in charge, as they best understand the thought processes applicable in a given scenario (pages 190-193). This ultimately argues for more oversight by technicians and less by politicians: I believe (and I am guessing Dörner would concur) that we need more inter- and intra-industry safety monitoring, and fewer congressional investigations and grandstanding.
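
As a concrete (and deliberately trivial) illustration of "reverse planning," here is a minimal sketch of my own; it is not from the book, and the task names are hypothetical:

    # Reverse planning: start from the goal and chain backwards through
    # each step's precondition; reversing that chain yields the forward plan.
    needs = {                            # step -> precondition that must come first
        "serve dinner": "cook dinner",
        "cook dinner": "buy groceries",
        "buy groceries": "write shopping list",
        "write shopping list": None,     # no precondition: a valid starting point
    }

    plan, step = [], "serve dinner"      # begin at the goal...
    while step is not None:
        plan.append(step)
        step = needs[step]
    print(" -> ".join(reversed(plan)))   # ...then read the plan forwards
    # write shopping list -> buy groceries -> cook dinner -> serve dinner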

This is a superb book; I recommend it highly to any safety professional as mandatory reading, and to the general public for an interesting discussion of decision making skills.


52 of 57 people found the following review helpful
5.0 out of 5 stars Truly a five-star masterpiece!, February 16, 2002
By Stephen L. Nelson (Redmond, WA, United States)
I picked this book up after hearing a book editor I really respect say he re-reads this great book every few years. And, wow, am I glad I did. What this book talks about is decision making in situations of complexity, uncertainty, and intransparence. The author, Dörner, recounts the results of computer simulations that explore how people succeed and fail in decision making and planning. This is one of those "keepers" that I'll read again and again... and get more from the book each time.
Tangential comment: I'm also a writer (my best-selling books have been Quicken for Dummies and QuickBooks for Dummies) and so I have to say that this book is really, really well-written and edited. Wonderful craftsmanship!


37 of 40 people found the following review helpful
3.0 out of 5 stars Over-promised and under-delivered, January 24, 2008
By William H. Franklin Jr.
Verified Purchase
The book got my juices flowing in the first chapter, especially with the reference to human interaction with dynamic systems and the tendency to "oversteer." I wrote my doctoral dissertation over 30 years ago on just such a phenomenon as applied to the broiler industry (yes, chickens), which behaves as an underdamped servomechanism. (I'm an engineer.)
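
To make the "oversteer" point concrete, here is a minimal sketch of my own (not from the book or this review; the lag and gain values are assumptions chosen to show the effect):

    # A controller acting on stale readings. With an observation lag of two
    # steps and an aggressive gain, each correction overshoots and the
    # system oscillates ever more wildly instead of settling at the target.
    DELAY = 2        # observation lag in steps (assumed)
    GAIN = 1.5       # overly aggressive correction gain (assumed)
    TARGET = 20.0

    pending = [0.0] * DELAY    # readings not yet visible to the controller
    value = 0.0
    for step in range(12):
        observed = pending[0]                  # the controller sees old data
        value += GAIN * (TARGET - observed)    # correct based on stale information
        pending = pending[1:] + [value]        # reading surfaces DELAY steps later
        print(f"step {step:2d}: value = {value:9.1f}")
    # Dropping GAIN to about 0.3 lets the value settle near TARGET; at 1.5
    # it diverges, an underdamped response to overcorrection.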

However, the early promise of the book didn't bloom as I'd hoped. Rather than using real-world examples, the author draws all of his principles from simulated experiments. As a doctoral student I was subjected to many simulated business-game situations, and while they can be made complex out to third- and fourth-generation consequences, life is more complex than that (think The Tipping Point and James Burke's The Pinball Effect).

The effort to draw principles together in the last chapter suffers from two defects: there are too many of them, and they are only shallowly explained in terms of real-world usefulness.

While I think the book is worth reading, it over-promised and under-delivered. I'd recommend speed-reading it for the high-level content and avoiding getting bogged down in the simulations. I highlight as I read, and my highlighting became sparser and sparser as the book wore on; reviewing my highlights and notes after finishing is the best evidence I have of a book's value to me.

A much more practical book (for me) was Managing the Unexpected by Karl Weick and Kathleen Sutcliffe.


15 of 15 people found the following review helpful
5.0 out of 5 stars Overcoming Complexity, July 28, 2005
THE LOGIC OF FAILURE: RECOGNIZING AND AVOIDING ERROR IN COMPLEX SITUATIONS by Dietrich Dörner is a great read that has changed my thinking about organizations, decision making, and complexity since I read it about a month ago. I recommend it unequivocally to anyone who makes decisions, and particularly to those in organizations and in assessment and management positions.

Dörner, a cognitive psychologist at the University of Bamberg, writes about tendencies in decision making, basing many of his conclusions on experiments and simulations he has run with human subjects. For example, he has had subjects manage a mythical area of Africa, Tanaland, to help the nomadic peoples there. The managers can control all areas of life: public works funding, pesticide control, legal issues of grazing, housing matters, and so on. In another simulation, Dörner makes his research subjects the mayor of a mythical town, Greenvale. Again, they can address many issues related to life in this community: education, employment, taxes, public services, business development, tourism, social programming, and so on. He also gives a brief study of Chernobyl to analyze how that disaster happened through logical failure.

Dealing with how people make decisions, the real and root processes of decision making in humans, Dörner makes the point early on that decision making is not an isolated process. "But I am not concerned with thinking alone, for thinking is always rooted in the total process of psychic activity. There is no thinking without emotion" (p. 8). He also makes the point that thought is rooted in value systems, and that our decisions are generally made to bring us closer to goals based on these values. Given this basis, Dörner shows throughout the book various tendencies individuals have when faced with ambiguity in decision-making situations and with complexity that overwhelms them. Sometimes people will focus tightly on an area in which they are comfortable, or will allow themselves to be distracted by small items, in order to avoid coping with complexity.

Often, we cannot know all we need to know about a topic. Dörner writes about the success rates of participants who utilize planning procedures for a limited time and then go to work, but who revisit their issues and are willing to change, versus those who just dive right in without situational analysis, get confused by unintended consequences, and start blaming outside factors or creating "myths" or superstitions about why they are experiencing the problems they are.

Comfortingly, Dörner's research results show that some planning that is moved into action, with frequent analysis of the results of the decisions, is effective. He also writes about other effective methodologies, such as thinking by analogy, to generate better understanding of a process.

The entire book, by enlightening the reader on the factors involved in decision making (emotion, values, fears), helps create greater perception of what we do in our own processes of making decisions and following up on them. Divided into chapters called "Some Examples," "The Demands" (which deals with complexity, dynamics, intransparence, ignorance and mistaken hypotheses, and steps in planning and action), "Setting Goals," "Information and Models," "Time Sequences," "Planning," and "So Now What Do We Do," the book effectively dissects the processes and issues in complexity and in understanding it (how hard it is to really understand exponential growth, for example, or time's effects on variables).
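
On the exponential-growth point, here is a minimal sketch of my own (the doubling pond weed is a classic illustration, not an example taken from the book):

    # Why exponential growth defeats intuition: a weed that doubles daily
    # covers the whole pond on day 30, yet is barely noticeable until the
    # final week, and the pond is still half clear the day before the end.
    coverage = 1 / 2 ** 30       # fraction of the pond covered on day 0
    for day in range(31):
        if coverage >= 0.01:     # only report once coverage is even visible
            print(f"day {day}: {coverage:.1%} covered")
        coverage *= 2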

The book is conversational, with great examples and fabulous data charts that illustrate the concepts about which he is writing. And it stays at the top of one's mind. I'm going to keep it handy at every workplace I ever inhabit, as it is a fabulous reference book. I recommend it for everyone!


13 of 14 people found the following review helpful
5.0 out of 5 stars Should be required reading in business and engineering schools!, May 31, 2006
Verified Purchase
The author deals directly with the limits of human beings under the stressful situation of solving dynamic problems, ones with a life of their own. Here's something from the text:

"We don't neglect the "implicit" problems of a situation because thinking about the possible side effects of the measures we are planning would overburden us terribly. Rather we neglect them because we don't have those problems at the moment and therefore are not suffering from their ill effects. In short, we are captives of the moment.

The slowness of our thinking and the small amount of information we can process at any one time, our tendency to protect our sense of competence, the limited inflow capacity of our memory, and our tendency to focus only on immediate pressing problems---these are the simple causes of the mistakes we make in dealing with complex systems."

That sums up the author's points nicely. He presents numerous results of computer trials of people solving problems. The scientific analysis is intriguing, and he pursues it persistently: the points excerpted above are brought home time and again.

If you want to improve your troubleshooting skills, whether in business or engineering, or day-to-day, read this book.

I hope you will take the time to rate this review.


9 of 9 people found the following review helpful
3.0 out of 5 stars Interesting, dense, convoluted..., July 4, 2007
Verified Purchase
Dense, detailed, and often fascinating, with fluid writing despite the translation from German. Those who are used to jumping to conclusions will find much to learn here regardless of the book's overall shortcomings. Unfortunately, the promise shown initially is not fully delivered, for several reasons. The author makes frequent use of his simulations, tests, and models, and asks much of a reader who has read only a brief introduction to them. In the later chapters it becomes tedious to follow along with his constant references; furthermore, the book essentially becomes an argument for the use of computer-simulated research rather than a distilled analysis of failure. The last chapter feigns a comprehensive summary but drifts away as the author ponders the process of determining the point of failure. It could be taken, then, as either (i) a technical, interesting, but inconclusive study of the reasons for failure, which focuses on a mere handful of examples; or (ii) an outdated, abstract examination of the process for examining failure, which does not consider alternative approaches and isn't thorough enough to be worthwhile.

If you're interested in taking notes and then drawing your own conclusions, this book should be an excellent read, as it's filled with detail. For others, it delivers a few very fascinating chapters about our cognitive biases and then fails to draw them together cogently.


11 of 12 people found the following review helpful
5.0 out of 5 stars One just wonders how often we can make *such* mistakes, November 2, 1998
By A Customer
This book influenced me in a great way. I became aware of things I was never able to explain before. I got an idea of how to manage complex systems. Its great examples help you agree that "problems around us are complex; otherwise they would have been solved."
The graphic data and their analysis are very accurate.
One of the greatest chapters in the book is about time measurement. Never before reading it did I understand that, in general, people cannot manage time; they don't feel it. We measure it by looking at a watch, we make charts, we write histories and mark milestones, just because our minds are "current." It is a fascinating book to read. The somewhat academic style is very appropriate. I very strongly recommend it, especially for beginners in business, like me.
With best regards, OLEG.


7 of 7 people found the following review helpful
4.0 out of 5 stars Pleasant introduction to the psychology of decisions, September 17, 1997
By A Customer
Let's admit it: it is fun to look at other people's mistakes - especially when they happen in inconsequential computer simulations.

Besides having fun, the reader of this book will also learn about the psychological forces behind the way we think and act on complex systems (i.e., in situations we do not fully understand).

Even though I am no expert in the field, I believe the author does an excellent job of using a variety of experimental evidence (the simulations range from complex development-aid projects to simple temperature regulation) in an entertaining fashion to illustrate serious analyses.

The author refuses to hand out illusory recipes for good decision-making - and gives good reasons for it. That may be a good recipe by itself.

Finally, a minor criticism: there are too few references to the literature, and too many of those are to works in German. Oh, well, I guess you can always use Amazon's search engine.


10 of 11 people found the following review helpful
5.0 out of 5 stars Essential Reading for Business and Govt Leaders, January 22, 2004
Wow - a superb analysis of why we fail even when doing things right! The lessons contained herein are invaluable to every professional, and more so for those in critical decision-making and leadership roles. The fallibility of our thinking is something we don't like to admit or understand; this book reveals its pitfalls. You will need some guts to read and understand this book, since it will uncover flaws that you will probably hate to admit.
In some places the translation could have been better; however, that should not prevent anyone from reading this book. I suggest "Don't just read - grasp the lessons." It may take more than one reading to get a better understanding, but that investment of time will be well worth the effort. Equally important: reflect on what you read as you try to assimilate the material discussed.


