- Paperback: 464 pages
- Publisher: Princeton University Press; Revised edition (September 27, 1999)
- Language: English
- ISBN-10: 0691004129
- ISBN-13: 978-0691004129
- Product Dimensions: 6.2 x 1 x 9.5 inches
- Shipping Weight: 1.4 pounds
- Average Customer Review: 62 customer reviews
- Amazon Best Sellers Rank: #59,879 in Books
- #14 in Books > Engineering & Transportation > Engineering > Industrial, Manufacturing & Operational Systems > Health & Safety
- #16 in Books > Science & Math > Technology > Safety & Health
- #44 in Books > Politics & Social Sciences > Politics & Government > Public Affairs & Policy > Social Services & Welfare
Normal Accidents: Living with High-Risk Technologies, Revised Edition
Hang a curtain too close to a fireplace and you run the risk of setting your house ablaze. Drive a car on a pitch-black night without headlights, and you dramatically increase the odds of smacking into a tree.
These are matters of common sense, applied to simple questions of cause and effect. But what happens, asks systems-behavior expert Charles Perrow, when common sense runs up against the complex systems, electrical and mechanical, with which we have surrounded ourselves? Plenty of mayhem can ensue, he replies. The Chernobyl nuclear accident, to name one recent disaster, was partially brought about by the failure of a safety system that was being brought on line, a failure that touched off an unforeseeable and irreversible chain of disruptions; the less severe but still frightening accident at Three Mile Island, similarly, came about as the result of small errors that, taken by themselves, were insignificant, but that snowballed to near-catastrophic result.
Only through such failures, Perrow suggests, can designers improve the safety of complex systems. But, he adds, those improvements may introduce new opportunities for disaster. Looking at an array of real and potential technological mishaps--including the Bhopal chemical-plant accident of 1984, the Challenger explosion of 1986, and the possible disruptions of Y2K and genetic engineering--Perrow concludes that as our technologies become more complex, the odds of tragic results increase. His treatise makes for sobering and provocative reading. --Gregory McNamee
[Normal Accidents is] a penetrating study of catastrophes and near catastrophes in several high-risk industries. Mr. Perrow ... writes lucidly and makes it clear that 'normal' accidents are the inevitable consequences of the way we launch industrial ventures.... An outstanding analysis of organizational complexity.---John Pfeiffer, The New York Times
[Perrow's] research undermines promises that 'better management' and 'more operator training' can eliminate catastrophic accidents. In doing so, he challenges us to ponder what could happen to justice, community, liberty, and hope in a society where such events are normal.---Deborah A. Stone, Technology Review
Normal Accidents is a testament to the value of rigorous thinking when applied to a critical problem.---Nick Pidgeon, Nature
Showing 1-5 of 62 reviews
The Challenger explosion, for example, has been blamed on a number of lapses in human judgment. But once one thing on the shuttle failed - in this case the O-rings on the solid rocket booster - an unanticipated cascade of failures followed. Similarly, the accident at Three Mile Island resulted from a faulty alarm which was known to be faulty, and was therefore ignored even as temperatures inside the containment dome were skyrocketing. The failure of a component, coupled with human misinterpretation of that failure, led to a cascade of bad events and wrong decisions.
Perrow emphasizes that although these events might be attributed to human error, the humans involved had never seen this sequence of events before; such a cascade of related failures was never anticipated, and the operators had no way to understand what was happening. So "operator error" is an illusion.
This book was written before the astounding disaster of Fukushima, but that series of catastrophes certainly fits Perrow's paradigm. Perrow is pessimistic about nuclear power. The fact that we haven't had more reactor disasters, Perrow says, is because the devices haven't yet had time to express themselves.
Any social scientist, physical scientist, or policymaker should read this book.
Certainly not a page turner. Only managed to get through the first 65%. Glad I rented, as it certainly wasn't worth owning. Initially got the book after finding it in the footnotes of "Command and Control."
I highly recommend nearly all of this book for anyone who will be designing, operating, or criticizing complex systems. Aside from some poorly explained comments on nuclear criticality accidents and naval nuclear reactors, the earlier chapters of the book are technically quite sound, and Perrow's framework is a good starting point for thinking about how to make complex systems as safe and resilient as possible.