Normal Accidents: Living with High-Risk Technologies, Revised Edition
Charles Perrow (Author)
Normal Accidents analyzes the social side of technological risk. Charles Perrow argues that the conventional engineering approach to ensuring safety--building in more warnings and safeguards--fails because systems complexity makes failures inevitable. He asserts that typical precautions, by adding to complexity, may help create new categories of accidents. (At Chernobyl, tests of a new safety system helped produce the meltdown and subsequent fire.) By recognizing two dimensions of risk--complex versus linear interactions, and tight versus loose coupling--this book provides a powerful framework for analyzing risks and the organizations that insist we run them.
The first edition fulfilled one reviewer's prediction that it "may mark the beginning of accident research." In the new afterword to this edition Perrow reviews the extensive work on the major accidents of the last fifteen years, including Bhopal, Chernobyl, and the Challenger disaster. The new postscript probes what the author considers to be the "quintessential 'Normal Accident'" of our time: the Y2K computer problem.
Editorial Reviews
Amazon.com Review
These are matters of common sense, applied to simple questions of cause and effect. But what happens, asks systems-behavior expert Charles Perrow, when common sense runs up against the complex systems, electrical and mechanical, with which we have surrounded ourselves? Plenty of mayhem can ensue, he replies. The Chernobyl nuclear accident, to name one recent disaster, was partially brought about by the failure of a safety system that was being brought on line, a failure that touched off an unforeseeable and irreversible chain of disruptions; the less severe but still frightening accident at Three Mile Island, similarly, came about as the result of small errors that, taken by themselves, were insignificant, but that snowballed to near-catastrophic result.
Only through such failures, Perrow suggests, can designers improve the safety of complex systems. But, he adds, those improvements may introduce new opportunities for disaster. Looking at an array of real and potential technological mishaps--including the Bhopal chemical-plant accident of 1984, the Challenger explosion of 1986, and the possible disruptions of Y2K and genetic engineering--Perrow concludes that as our technologies become more complex, the odds of tragic results increase. His treatise makes for sobering and provocative reading. --Gregory McNamee
Review
"[Perrow's] research undermines promises that `better management' and `more operator training' can eliminate catastrophic accidents. In doing so, he challenges us to ponder what could happen to justice, community, liberty, and hope in a society where such events are normal."---Deborah A. Stone, Technology Review
"Normal Accidents is a testament to the value of rigorous thinking when applied to a critical problem."---Nick Pidgeon, Nature
Product details
- Publisher : Princeton University Press; Revised edition (September 27, 1999)
- Language : English
- Paperback : 464 pages
- ISBN-10 : 0691004129
- ISBN-13 : 978-0691004129
- Item Weight : 1.38 pounds
- Dimensions : 6.1 x 1.18 x 9.2 inches
- Best Sellers Rank: #213,628 in Books
- #38 in Technology Safety & Health
- #38 in Industrial Health & Safety
- #186 in Safety & First Aid (Books)
About the author

Charles Perrow is professor emeritus of sociology at Yale University and visiting professor at Stanford University. His interests include the development of bureaucracy in the 19th century, protecting the nation’s critical infrastructure, the prospects for democratic work organizations, and the origins of American capitalism.
Customer reviews
Top reviews from the United States
The topic may seem difficult to understand, but with multiple examples Perrow makes it clear as day. The historical record of certain industries, and what they have done for protection, explains why some dangerous operations have a good safety record while other complex operations break down often despite layers of “protection” - the “normal accident”.
If you wonder why some companies making “dangerous products” have a good safety record, while other heavily regulated industries have more accidents than expected, then this book is for you.
The Challenger explosion, for example, has been blamed on a number of lapses in human judgment. But once one thing on the shuttle failed - in this case the O-rings on the solid rocket booster - an unanticipated cascade of failures followed. Similarly, the accident at Three Mile Island resulted from an alarm that was known to be faulty and was therefore ignored even as temperatures inside the containment dome were skyrocketing. The failure of a component, coupled with human misinterpretation of that failure, led to a cascade of bad events and wrong decisions.
Perrow emphasizes that these events might be attributed to human error, but the humans involved had never seen this sequence of events before; such a cascade of related failures was never anticipated, and the operators had no way to understand what was happening. So "operator error" is an illusion.
This book was written before the astounding disaster of Fukushima, but that series of catastrophes certainly fits Perrow's paradigm. Perrow is pessimistic about nuclear power. The fact that we haven't had more reactor disasters, Perrow says, is because the devices haven't yet had time to express themselves.
Any social scientist, physical scientist, or policymaker should read this book.
I highly recommend nearly all of this book for anyone who will be designing, operating, or criticizing complex systems. Aside from poorly explained comments on nuclear criticality accidents and naval nuclear reactors, the earlier chapters of the book are technically quite sound, and Perrow's framework is a good starting point for thinking about how to make complex systems as safe and resilient as possible.
Top reviews from other countries
And beyond the "prescription", I really appreciate the stories and the reminder of some definitions (like "complexity" vs "analysability"), which are often forgotten when you are full of eagerness to create your "system".