Customer Reviews


53 Reviews
5 star: (24)
4 star: (10)
3 star: (13)
2 star: (4)
1 star: (2)


127 of 147 people found the following review helpful
3.0 out of 5 stars Living With High-Risk Conclusions, January 29, 2004
By Robert I. Hedges
Verified Purchase
This review is from: Normal Accidents: Living with High-Risk Technologies (Paperback)
I have been mulling over this review for a while now, and am still undecided on the correct rating to award this book. On the one hand, Perrow offers some genuine insight into systems safety, but frequently does not understand the technicalities of the systems (or occasionally their operators) well enough to make informed decisions and recommendations. In more egregious cases he comes to conclusions that are guaranteed to reduce safety (as when he argues that supertankers should be run by committee, and that a Captain is no longer needed) or are merely the cherished liberal opinions of an Ivy League sociologist (he teaches at Yale), as when he argues for unilateral nuclear disarmament, government guaranteed income plans, and heroin maintenance (distribution) plans for addicts "to reduce crime." In the case of disarmament, remember this was written during the early 1980s while the Soviet Union was still a huge threat... complete nuclear disarmament would have resulted in fewer US nuclear accidents, but would NOT have made us safer, as we would have been totally vulnerable to intentional nuclear attack. He has great personal animosity toward Ronald Reagan, and makes inflammatory statements in the mining section that mining safety regulations would surely be weakened by Reagan, causing many more accidents and deaths. Later in the same section, though, he concludes that mining is inherently dangerous, and no amount of regulation can make it safe. So which is it? All of this is, at very best, folly, and regardless of political bent (he is a self-avowed "leftist liberal") it has absolutely no place in a book ostensibly on safety systems. As such, I think portions of this book show what is so wrong in American academia today: even genuinely excellent research can be easily spoiled when the conclusions are known before the research is started. This is one of the many reasons that physical scientists scorn the social sciences, and it doesn't have to be this way.
Having said all that, there IS a wealth of good information and insight in this book when Perrow sticks to systems and their interactions. The book contains the finest analysis commercially available of the Three Mile Island near-disaster, and his insight about how to improve safety in nuclear plants was timely when the book was written in 1984, though many improvements have been made since then.
Speaking as a commercial airline pilot, I feel his conclusions and observations about aircraft safety were generally true at the time of printing in 1984, but are now miserably out of date. (The same is true of the Air Traffic Control section.) I believe that he generally has a good layman's grasp of aviation, so I am willing to take it as a given that he has a knowledgeable layman's comprehension of the other systems discussed. As an aside, he never quite gets some of the technicalities right. For instance, he constantly uses the term 'coupling' incorrectly in the engineering sense; this is particularly objectionable in the aviation system, where it has a very specific meaning to aeronautical engineers and pilots.
The section on maritime accidents and safety is superbly written. Here I am not an expert, but there seems to be a high degree of correlation with the aviation section. His section on "Non Collision Course Collisions" by itself makes this book a worthwhile read. He presents very compelling information and reasoning until the very end of the section, at which point he suggests that since ships are now so big, large ships (especially supertankers) essentially should have no Captain, but should be run by committee. This is an invalid conclusion, and he offers no evidence or substantial argument to support that idea. Clearly, it is an idea hatched in his office and not on a ship (or plane.) There always needs to be a person in a place of ultimate authority in fast moving, dynamic systems, or the potential exists to have crew members begin to work at direct odds with each other, making a marginal situation dangerous. Ironically, in the very same part of the discussion where he concludes that there should be no Captain, he has hit upon the key to the problem. He mentions that he was pleased to see that some European shippers were now training their crews together as a team, and that he expected this to lower accident rates. He is, in fact, exactly right about that. Airlines now have to train crews in Crew Resource Management (CRM) in which each member of the crew has the right and obligation to speak up if they notice anything awry in the operation of their aircraft, and the Captain makes it a priority to listen to the input of others, as everyone has a different set of concerns and knowledge. In this way, the Captain becomes much less dictatorial, and becomes more of a final decision maker after everyone has had their say. It IS critical, though, to maintain someone in command, as there is no time to assemble a staff meeting when a ship is about to run aground, or a mid-air collision is about to occur. Many other well documented studies and books have come to this conclusion, and in the airline industry since CRM was introduced the accident rate has decreased dramatically.
Overall, if you have a desire to understand high-risk systems, this book has a lot of good information in it; however, it is woefully out of date, and for that reason, among others, I can only recommend it with reservations. A better and much more contemporary introductory book on the subject is 'Inviting Disaster' by James R. Chiles. Remember, this book was written over twenty years ago, and much has changed since then. There is knowledge to be gleaned here, but you have to be prepared to sort the wheat from the chaff.


25 of 28 people found the following review helpful
5.0 out of 5 stars Of Lasting Value, Relevant to Today's Technical Maze, January 27, 2003
By Robert David STEELE Vivas
This review is from: Normal Accidents: Living with High-Risk Technologies (Paperback)
Edit of 2 April 2007 to add link and better summary.

I read this book when it was assigned in the 1980s as a mainstream text for graduate courses in public policy and public administration, and I still use it. It is relevant, for example, to the matter of whether we should try to use nuclear bombs on Iraq--most Americans do not realize that there has never (ever) been an operational test of a US nuclear missile from a working missile silo. Everything has been tested by the vendors or by operational test authorities that have a proven track record of falsifying test results or making the tests so unrealistic as to be meaningless.

Edit: my long-standing summary of the author's key point: Simple systems have single points of failure that are easy to diagnose and fix. Complex systems have multiple points of failure that interact in unpredictable and often undetectable ways, and are very difficult to diagnose and fix. We live in a constellation of complex systems (and do not practice the precautionary principle!).

This book is also relevant to the world of software. As the Y2K panic suggested, the "maze" of software upon which vital national life support systems depend--including financial, power, communications, and transportation software--has become very obscure as well as vulnerable. Had those creating this software been more conscious of the warnings and suggestions that the author provides in this book, America as well as other nations would be much less vulnerable to terrorism and other "acts of man" for which our insurance industry has not planned.

I agree with another reviewer who notes that this book is long overdue for a reprint--it should be updated. I recommend it "as is," but believe an updated version would be 20% more valuable.

Edit: this book is still valuable, but the author has given us the following in 2007:
The Next Catastrophe: Reducing Our Vulnerabilities to Natural, Industrial, and Terrorist Disasters


19 of 22 people found the following review helpful
5.0 out of 5 stars Altogether a fascinating and informative book, March 21, 2003
This review is from: Normal Accidents: Living with High-Risk Technologies (Paperback)
Wow. This is an incredible book. I have to admit, though, that I had some difficulty getting into Normal Accidents. There seemed to be an overabundance of detail, particularly on the nuclear industry's case history of calamity. This lost me, since I'm not familiar with the particulars of equipment function and malfunction. The book was mentioned, however, in two others of a similar nature, and with such reverence, that after I had finished both I returned to Perrow's book, this time with more success.
Professor Perrow is a PhD in sociology (1960) who has taught in the Yale University Department of Sociology since 1981 and whose research focus has been human/technology interactions and the effects of complexity in organizations. (His most recent publication is The AIDS Disaster: The Failure of Organizations in New York and the Nation, 1990.)
In Normal Accidents, he describes the failures that can arise "normally" in systems, i.e., those problems that are expected to arise and can be planned for by engineers, but which, by virtue of those planned fail-safe devices, immeasurably complicate and endanger the system they are designed to protect. He describes a variety of these interactions, clarifying his definitions by means of a table (p. 88) and a matrix illustration (p. 97). Examples include systems that are linear vs. complex, and loosely vs. tightly controlled. These generally arise through the interactive nature of the various components of the system itself. According to the matrix, an illustration of a highly linear, tightly controlled system would be a dam. A complex, tightly controlled system would be a nuclear plant, etc.
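For readers who think in code, the two axes described above map onto a small classification sketch. This is only an illustration: the dam and nuclear plant placements come from the matrix as summarized here, and the helper names are invented for the example.

from enum import Enum

class Interactions(Enum):
    LINEAR = "linear"
    COMPLEX = "complex"

class Coupling(Enum):
    LOOSE = "loose"
    TIGHT = "tight"

# Placements taken from the matrix (p. 97) as described in this review;
# other systems would have to be classified by the reader.
SYSTEMS = {
    "dam": (Interactions.LINEAR, Coupling.TIGHT),
    "nuclear plant": (Interactions.COMPLEX, Coupling.TIGHT),
}

def system_accident_prone(interactions, coupling):
    # Perrow's danger quadrant: complex interactions combined with tight coupling.
    return interactions is Interactions.COMPLEX and coupling is Coupling.TIGHT

for name, (i, c) in SYSTEMS.items():
    label = "system-accident prone" if system_accident_prone(i, c) else "less prone"
    print(f"{name}: {i.value} interactions, {c.value} coupling -> {label}")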
The degree to which failures may occur varies with each type of organization, as does the degree to which a recovery from such a failure is possible. As illustrations, the author describes failures which have, or could have, arisen in a variety of settings: the nuclear industry, maritime activities, the petrochemical industry, space exploration, DNA research and so on.
The exciting character of the stories themselves is worth the reading; my favorite, and one I had heard before, is the loss of an entire lake into a salt mine. More important still is the knowledge that each imparts. Perrow makes abundantly apparent by his illustrations the ease with which complex systems involving humans can fail catastrophically. (And if Per Bak and others are correct, almost inevitably.)
Probably the most significant part of the work is the last chapter. After discussing the fallibility of systems that have grown increasingly complex, he discusses living with high-risk systems, particularly why we are and why it should change. In a significant statement he writes, "Above all, I will argue, sensible living with risky systems means keeping the controversies alive, listening to the public, and recognizing the essentially political nature of risk assessment. Unfortunately, the issue is not risk, but power; the power to impose risks on the many for the benefit of the few (p. 306)," and further on, "Risks from risky technologies are not borne equally by the different social classes [and I would add, countries]; risk assessments ignore the social class distribution of risk (p. 310)." How true. "Cui bono?" as the murder mystery writers might say; "Who benefits?" More to the point, and again with that issue in mind, he writes, "The risks that made our country great were not industrial risks such as unsafe coal mines or chemical pollution, but social and political risks associated with democratic institutions, decentralized political structures, religious freedom and plurality, and universal suffrage (p. 311)." Again, very true.
Professor Perrow examines the degrees of potential danger from different types of system and suggests ways of deciding which are worth it to society to support and which might not be. These include categorizing the degree and the extent of danger of a given system to society, defining the way these technologies conflict with the values of that society, determining the likelihood that changes can be made to effectively alter the dangerous factors through technology or training of operators, and the possibility of placing the burden of spill-over costs on the shoulders of the institutions responsible. The latter might conceivably lead to corrective changes, either by the institutions themselves in order to remain profitable or by consumers through purchasing decisions.
The bibliography for the book is quite extensive and includes a variety of sources. These include not only popular books and publications on the topics of individual disasters, but government documents, research journals, and industry reports as well. I did not find any reference to the Johnstown flood, my particular favorite dam burst story, but there are a wide variety of references to choose from should someone wish to do their own research on the topic.
Altogether a fascinating and informative book.


14 of 16 people found the following review helpful
5.0 out of 5 stars Insightful perspective on serious industrial accidents., July 16, 1998
By A Customer
Normal Accidents is the best summary of major industrial accidents in the USA that I have encountered. It is written in a factual and technically complete style that is particularly attractive to anyone with a technical background or interest. I was able to read a borrowed copy from a colleague a few years ago when I was appointed as chairman of the safety committee at a manufacturing facility where workers had potential for exposure to toxic gases, high voltage, x-radiation, and other more everyday industrial hazards. The author's insight is right on target for achieving a workable understanding of the cause and prevention of disaster events. I wanted to buy copies for all our engineering managers and safety committee members, but the book is out of print. It is my fond hope that the author will write an updated version with analysis of more recent events as well as the well-chosen accidents in the previous edition. For any safety-related product or process designer, this book is a must read! For any technically cognizant reader, this book is a delight to read, even if it is a little scary in its implications. For everyone else, it has some really interesting historical stories.


15 of 18 people found the following review helpful
4.0 out of 5 stars Cool water for hot-headed analysts of complex systems, July 7, 1998
By A Customer
I'm dismayed to discover that 'Normal Accidents' is so difficult to find.
Like all voters, I'm sometimes asked to make choices about the use of potentially devastating technology, despite having no training in engineering and only a sketchy idea of statistical risk analysis. 'Normal Accidents' doesn't reduce my reliance on experts, but it does provide a common language for us to discuss the issues.
Perrow's accident descriptions are masterly, and should disturb anyone who lightly dismisses accidents in complex systems as "simple human error", or assumes that all systems can be made safe by a technological fix. I've used Perrow's complexity / coupling matrix as a tool for thinking about and discussing the risks involved in decisions about many systems in addition to those Perrow actually discusses, not least software systems.
I think this book still has a lot to offer anyone interested in public debate about complex technological issues, and I hope it will be reprinted. A new edition would be even better.


11 of 13 people found the following review helpful
3.0 out of 5 stars Why we don't hire sociology professors to run nuclear power plants, October 22, 2011
By Aaron C. Brown (New York, New York United States)
Verified Purchase
This review is from: Normal Accidents: Living with High-Risk Technologies (Paperback)
Like Robert I. Hedges, whose comments I generally agree with, I had trouble assigning a rating to this book. I settled on his compromise of three stars. This book has been extremely influential; in fact, it probably represents the dominant academic and regulatory attitude toward complex system accidents. One of the most important reasons to read it is to understand that the message is quite different from the popular conception.

The book consists of vivid accounts of disasters in many fields. The author is a talented writer and he conveys a strong sense of overwhelming danger and confusion. He leaves you in no doubt that he would be uselessly panicked in any of these situations. Unfortunately, he mixes routine but scary-sounding elements in jargon and political opinions with the genuine problems. It sounds something like, "Joe was driving to the store at 3:41 PM when his GPS lost satellite reception (satellites launched by the military-industrial complex). He didn't notice because he was distracted by the incessant alarm of the turn signal (TS) he forgot to turn off (system-generated alarm interaction syndrome). While he was attending to that, his hand came within 18 inches of scalding hot 'coffee' restrained only by thin and weak Styrofoam (sold by a soulless corporation indifferent to customer welfare), which unknown to him had a small tear in the rim. Under the right conditions that tear could propagate into total beverage container failure (TBCF), leading to second degree burns and worse, driver incapacitation (2DB&WDI). His engine was producing literally hundreds of powerful explosions every second, possibly because the manufacturer was engaged in union-busting, and was connected to a 'tank' containing enough highly combustible material (supplied at the cost of shocking environmental damage and political corruption) to incinerate Joe in milliseconds." Five pages later, Joe is going to hit a branch fallen into the road and you will be relieved that the world doesn't end.

The disaster accounts are clear and valuable summaries for people interested in interactions of humans and complex machinery. The author claims you don't need to understand the technical details; however, he is not qualified to make that statement. You have to understand the details to know what is irrelevant. In several cases where I do know a bit about the technology, his "investigating committee" level of explanation is significantly misleading.

The biggest problem in the book is the logical error of studying only disasters. For example, one of its most famous conclusions is that safety devices usually make things worse. In a disaster, the safety systems by definition failed, so they were at best useless and may have caused harm by distracting people or inducing people to rely on them. But you should weigh this versus all the disasters that didn't happen because the safety systems worked.
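The point about studying only disasters can be made concrete with a back-of-the-envelope calculation. Every number below is invented purely for illustration; nothing here comes from the book.

# All figures are hypothetical, chosen only to show the selection effect.
incidents = 10_000            # upset conditions per year across an industry
p_disaster_no_device = 0.02   # chance an upset becomes a disaster with no safety device
prevention_rate = 0.90        # fraction of would-be disasters the device stops

disasters_without_device = incidents * p_disaster_no_device               # 200
disasters_with_device = disasters_without_device * (1 - prevention_rate)  # 20
prevented = disasters_without_device - disasters_with_device              # 180

# A study that samples only disasters sees a failed (or harmful) safety device
# in every case it examines, and never sees the disasters that were prevented.
print(f"Disasters visible to a disaster-only study: {disasters_with_device:.0f}")
print(f"Disasters prevented (invisible to that study): {prevented:.0f}")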

The other well-known conclusion is that complex and tightly-coupled systems are inherently dangerous. The problem here is the context. A two-lane rural road with light traffic is simpler than the Los Angeles freeway system. Of course, it will have fewer accidents each year. The question is whether it has fewer accidents per unit of transportation it facilitates. If you spread out Los Angeles to the point it could be connected efficiently by simple two-lane roads with few intersections, you might get fewer accidents, or you might not. But it doesn't matter because it's entirely impractical. You might as well say Los Angeles shouldn't exist. Unsurprisingly, this is the author's conclusion about technologies he dislikes including weapons, nuclear power and genetic engineering.
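A toy calculation shows why the normalization matters; the counts below are made up solely to illustrate the argument that fewer accidents per year is not the same as fewer accidents per unit of transportation.

# Hypothetical counts, normalized by exposure rather than by calendar year.
rural_accidents, rural_vehicle_miles = 5, 2_000_000
freeway_accidents, freeway_vehicle_miles = 400, 400_000_000

rural_rate = rural_accidents / rural_vehicle_miles * 1_000_000
freeway_rate = freeway_accidents / freeway_vehicle_miles * 1_000_000
print(f"rural road:  {rural_rate:.2f} accidents per million vehicle-miles")
print(f"LA freeways: {freeway_rate:.2f} accidents per million vehicle-miles")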

In fact, a single complex system has advantages and disadvantages over many simple systems to accomplish the same end. The book presents no evidence that complex systems are generally less safe. It's true that a single disaster in a complex system is likely to be worse than a single disaster in a simple system, and therefore can attract the attention of the author. But there might be more total damage from many smaller accidents in dispersed simple systems (the author defines his way out of this by calling those things only "failures," not "accidents").

Similarly, tight coupling has advantages and disadvantages. If process A is continuously supplying product to process B, any change in either system must cause an immediate response in the other. If there is an intermediate storage facility for the output of process A, A and B can operate more independently. Problems in one system are less likely to induce problems in others. On the other hand, you now have a store of intermediate product that might create its own dangers. And since immediate communication between A and B is no longer enforced, problems may develop over time that would have to be addressed immediately under tight coupling. Or perhaps the loosely-coupled system is less efficient, and the money saved by tight coupling could more than pay for safety improvements that matter more than the difference between tight and loose coupling. These issues are not addressed in the book; it describes only the problems of tight coupling.
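The trade-off described here is essentially the producer/consumer buffering decision familiar from software. A minimal sketch, assuming a toy producer (process A) and consumer (process B); the buffer size and timings are arbitrary choices for the example.

import queue
import random
import threading
import time

# BUFFER = 1 approximates tight coupling: any hiccup in B immediately stalls A.
# A larger BUFFER is the "intermediate storage facility": A and B run more
# independently, at the price of holding inventory that must itself be managed.
BUFFER = 50          # try 1 versus 50 and compare how often A stalls
ITEMS = 200

store = queue.Queue(maxsize=BUFFER)
stalls = 0

def process_a():
    global stalls
    for i in range(ITEMS):
        if store.full():
            stalls += 1              # A must wait because B has fallen behind
        store.put(i)                 # blocks while the buffer is full
        time.sleep(0.001)

def process_b():
    for _ in range(ITEMS):
        store.get()
        time.sleep(0.001 + random.random() * 0.002)   # B is slower and jittery

a = threading.Thread(target=process_a)
b = threading.Thread(target=process_b)
a.start(); b.start(); a.join(); b.join()
print(f"buffer={BUFFER}: producer stalled {stalls} times")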

The book does make some powerful points. The title claim is that accidents in tightly-coupled complex systems are normal, like a light bulb that is expected to fail after an average of 1,500 hours in use. However much money and engineering talent are devoted to making things safe, we have to accept that accidents will happen. If we have nuclear power plants, we will have nuclear accidents. If a nuclear accident is unacceptable, we cannot have nuclear power plants. It is not within human power to design nuclear power plants with negligible probability of accident. Designer claims and regulatory requirements to the contrary do great harm.
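The light-bulb analogy can be restated as simple probability: if each unit of operation carries a small but irreducible accident probability, a large enough fleet operated long enough will almost certainly see an accident. The per-reactor-year probability below is an assumed placeholder, not a figure from the book.

# Assumed placeholder rate, for illustration only.
p_per_reactor_year = 1e-4    # chance of a serious accident in one reactor-year
reactors = 400
years = 40

reactor_years = reactors * years
p_at_least_one = 1 - (1 - p_per_reactor_year) ** reactor_years
print(f"P(at least one serious accident in {reactor_years} reactor-years) = {p_at_least_one:.0%}")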

Another important point is that humans are part of any system. Operators make mistakes. Operators can also demonstrate extraordinary skill and courage to save the day. Designers, builders, owners and regulators respond to their own complex incentives, which are not perfectly aligned with safety considerations and sometimes are completely contrary to safety. How people react to accidents is complicated. Sometimes great losses of life and property are shrugged off; in other cases, much smaller damages inspire great outcry. Accidents cannot be measured only in lives lost, and certainly not only in money. This is the author's sweet spot of professional expertise, and his discussion is deeply insightful and illuminating.

Overall the book is a mixture of sense and nonsense, with a lot of valuable material mixed with errors and bad interpretations. It's a book to read to inspire thinking, not one to accept uncritically, and definitely not one for which you can rely on summaries from others.


18 of 23 people found the following review helpful
3.0 out of 5 stars Where are all the catastrophes he talks about?, June 26, 2002
This review is from: Normal Accidents: Living with High-Risk Technologies (Paperback)
It's a good and interesting theory motivated by a desire to do away with nuclear power. Fine and dandy, but the problem is that while we can tick off a few major disasters (Chernobyl, Bhopal, possibly the intelligence failure leading up to September 11th), the truth is, all his doomsday prophecies have yet to come true. Maybe Perrow is a Cassandra, unlistened to and unappreciated until his prophecies come true. There are a variety of managers who have taken up the task of managing their complex, tightly coupled systems for reliability. Perrow addresses some of them in the afterword to the 2000 edition, but he gives them short shrift. No doubt Perrow touched off a flurry of new research on organizations and this book has, thus, become a classic. But some classics have to be updated, rethought, and evaluated once again with the searching spotlight of new knowledge. The fact is, at a major weapons lab in the southwest there will be on the order of 1000 fatalities in 1000 years from traffic accidents and only 5 in 1000 years from nuclear-related disaster. While traffic movement is complex and tightly coupled, it is probably not a system as he defines it. My suggestion is to read Perrow with Sagan, grasp the underlying theory behind Normal Accident Theory, and then read the work of the Berkeley High Reliability Organizations group (Rochlin, LaPorte, Roberts, Weick--at Michigan), etc. This will give a more well-rounded picture.


4 of 4 people found the following review helpful
3.0 out of 5 stars A sociologist ponders industrial accidents, December 27, 2010
This review is from: Normal Accidents: Living with High-Risk Technologies (Paperback)
This book was recommended to me by a colleague as an excellent study of industrial accidents and accident prevention. My own assessment of it is mixed. Perrow makes some very good observations and conclusions, including the following:

* Devices and systems added to a larger system for the purpose of safety can often become liabilities in themselves.

* Operator training is not a panacea for managing complex process interactions.

* Near-misses are abundant in industry, and the worst accidents we see are really nothing more than just the right combination of near-misses happening on the same day.

* Murphy was wrong: in most cases, what actually does go wrong is a small subset of what COULD HAVE gone wrong.

* In the aftermath of an industrial accident, people are usually a little too cheerful about the "valuable lessons learned" and overlook just how bad things could have been.

The book opens with an analysis of the Three Mile Island (TMI) nuclear reactor partial-meltdown, and that accident is used as a reference throughout the book. In Perrow's view, TMI really encapsulates the essence of "Normal Accidents" in its combination of complexity, process interactions, and "tight coupling" (little or no "slack" inherent to a system to absorb upsets). Perrow advances his own theory of industrial accidents with a Complexity/Coupling grid: rating systems according to how complex they are, and how tightly coupled their interdependent parts are to each other. In systems ranking high on both scales, he argues, accidents become "normal" in the sense that no human being(s) can possibly manage the vast combinations of faults that are bound to occur. These are "System Accidents" rather than specific faults with training, management, or technology.

It is easy to criticize Perrow for his work here. He is a sociologist -- not an engineer -- and so his analyses tend to lack the depth that engineers and technicians want in an analysis of an accident. His work is also "risky" in that he makes some rather bold predictions about future industrial accidents. His conclusions about the nuclear power industry, for example, are pretty far off the mark when you consider that nothing even resembling the TMI accident has happened in the years following the book's publication. You've got to hand it to him, though, for daring to step out of his formal discipline (sociology) to take a fresh look at industrial accidents and draw fresh conclusions. If you read Perrow with this in mind, I think you will get a lot more out of his work.

One final note: the book is frequently hilarious. Despite the serious nature of the subject, Perrow manages to bring a tactful sense of levity to his depictions of incidents, showing the absurdity of human behavior in these events. I could easily picture Perrow reading over industrial accident reports, saying to himself, "My word, do people actually do these kinds of things on a regular basis?!" For a while I was reading this book as I rode the city bus to work, and one morning someone caught me laughing out loud as I read Perrow's recounting of the Lake Peigneur accident (pp. 251-253). I figured this was not the normal response of a person reading a book with a picture of the Challenger explosion on the cover, so I stifled my laughter and sank further down in my seat.


5 of 6 people found the following review helpful
4.0 out of 5 stars Recommend but there are some errors in the details, April 25, 2006
By Robert Arnold (Albany, NY United States)
This review is from: Normal Accidents: Living with High-Risk Technologies (Paperback)
I purchased and read the book for the concepts but enjoyed the examples. However, I did find several errors in the events with which I am familiar, specifically those of space flight. In just four pages (pp. 267-270) he makes three errors. It makes me wonder about the accuracy of the rest of the book's examples.

Just for the record:

1) He attributes the second orbital flight of Mercury to Scott Crossfield when in fact it was Scott Carpenter. Crossfield was an X-15 test pilot of great skill, while Carpenter has been noted as the "worst astronaut in the program" by Chris Kraft. There are many accounts of Mr. Kraft openly wondering how Carpenter got to be an astronaut in the first place, let alone being allowed to fly into space. This might better explain the events instead of it being a process/system problem.

2) It was almost certain then, and is now, after the recent recovery of Gus Grissom's Mercury capsule, absolutely confirmed, that Gus did NOT blow the hatch. This would have been known at the time of my copy's printing.

3) It was Glenn's heat shield, not the landing pack, which the status light indicated had come loose. This would have explained the reluctance of ground control to inform the astronaut, since not much could have been done if true.

Of course these are minor and don't lead the author to any significantly different conclusions than if they were corrected. But the sloppiness does make me lower it to a 4 instead of 5 stars.


5 of 6 people found the following review helpful
4.0 out of 5 stars Fascinating insight into what goes wrong in complex systems, August 9, 2004
By M@ (Mississippi)
This review is from: Normal Accidents: Living with High-Risk Technologies (Paperback)
Being a software developer, I am very interested in large systems and why they go bad. The thing that struck me most about this book was that many systems we rely on every day are so complex it is amazing that they ever work at all.

If you have read and enjoyed the book "Systemantics" by John Gall, then you will very likely enjoy this one. Although less lighthearted than Systemantics, the subject is very similar. The more complex a system is, the higher the chance of it failing. The thing that seems counterintuitive but holds true is that any attempt to make a complex system safer adds to its complexity, which makes it more likely to fail. It is the paradox of the information age we now live in.
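The "more complexity, more failure" intuition is the familiar series-reliability effect: if a system needs all of its parts to work, overall reliability falls geometrically with the part count, so a safety add-on that can itself fail does not automatically help. The per-component reliability below is an assumed number, used only for illustration.

# Assumed per-component reliability; not a figure from either book.
r = 0.999    # probability that any single component works when needed

for n in (10, 100, 1_000, 10_000):
    print(f"{n:>6} components in series: P(all work) = {r ** n:.4f}")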

From his humorous telling of a nuclear disaster avoidance test which ended up in an infinite loop of operators doing the same few actions over and over, to some much more serious accounts of chemical plant and maritime disasters, Perrow opens the reader's eyes to the complexity and dangers of the systems that we surround ourselves with.

While this is not "light" reading, it is very informative and, I feel, timely. The only reason I did not give it five stars is that it tends to drift off topic occasionally.

This is not a "Ludditian" book warning us of the evils of technology. Instead it is a well-written review of what we have learned about system failures and a cautionary text of what we (as system designers and users) should keep in mind when deciding to rely on new technologies.



Details

Normal Accidents: Living with High-Risk Technologies by Charles Perrow (Paperback - September 27, 1999)
$45.00 $40.32
In Stock