Normal Accidents: Living with High-Risk Technologies [Kindle Edition]

Charles Perrow
4.0 out of 5 stars (50 customer reviews)



Amazon Price
  • Kindle Edition: $23.99
  • Hardcover: --
  • Paperback: $25.38

Book Description

Normal Accidents analyzes the social side of technological risk. Charles Perrow argues that the conventional engineering approach to ensuring safety--building in more warnings and safeguards--fails because system complexity makes failures inevitable. He asserts that typical precautions, by adding to complexity, may help create new categories of accidents. (At Chernobyl, tests of a new safety system helped produce the meltdown and subsequent fire.) By recognizing two dimensions of risk--complex versus linear interactions, and tight versus loose coupling--this book provides a powerful framework for analyzing risks and the organizations that insist we run them.

The first edition fulfilled one reviewer's prediction that it "may mark the beginning of accident research." In the new afterword to this edition Perrow reviews the extensive work on the major accidents of the last fifteen years, including Bhopal, Chernobyl, and the Challenger disaster. The new postscript probes what the author considers to be the "quintessential 'Normal Accident'" of our time: the Y2K computer problem.

Editorial Reviews

Hang a curtain too close to a fireplace and you run the risk of setting your house ablaze. Drive a car on a pitch-black night without headlights, and you dramatically increase the odds of smacking into a tree.

These are matters of common sense, applied to simple questions of cause and effect. But what happens, asks systems-behavior expert Charles Perrow, when common sense runs up against the complex systems, electrical and mechanical, with which we have surrounded ourselves? Plenty of mayhem can ensue, he replies. The Chernobyl nuclear accident, to name one recent disaster, was partially brought about by the failure of a safety system that was being brought on line, a failure that touched off an unforeseeable and irreversible chain of disruptions; the less severe but still frightening accident at Three Mile Island, similarly, came about as the result of small errors that, taken by themselves, were insignificant, but that snowballed to near-catastrophic result.

Only through such failures, Perrow suggests, can designers improve the safety of complex systems. But, he adds, those improvements may introduce new opportunities for disaster. Looking at an array of real and potential technological mishaps--including the Bhopal chemical-plant accident of 1984, the Challenger explosion of 1986, and the possible disruptions of Y2K and genetic engineering--Perrow concludes that as our technologies become more complex, the odds of tragic results increase. His treatise makes for sobering and provocative reading. --Gregory McNamee


"[Normal Accidents is] a penetrating study of catastrophes and near catastrophes in several high-risk industries. Mr. Perrow ... writes lucidly and makes it clear that 'normal' accidents are the inevitable consequences of the way we launch industrial ventures.... An outstanding analysis of organizational complexity."--John Pfeiffer, The New York Times

"[Perrow's] research undermines promises that 'better management' and 'more operator training' can eliminate catastrophic accidents. In doing so, he challenges us to ponder what could happen to justice, community, liberty, and hope in a society where such events are normal."--Deborah A. Stone, Technology Review

"Normal Accidents is a testament to the value of rigorous thinking when applied to a critical problem."--Nick Pidgeon, Nature

Product Details

  • File Size: 5760 KB
  • Print Length: 386 pages
  • Publisher: Princeton University Press; Updated edition (September 27, 1999)
  • Language: English
  • ASIN: B004P1JTI4
  • Text-to-Speech: Enabled
  • X-Ray:
  • Lending: Not Enabled
  • Amazon Best Sellers Rank: #498,539 Paid in Kindle Store

Customer Reviews

Most Helpful Customer Reviews
124 of 143 people found the following review helpful
3.0 out of 5 stars Living With High-Risk Conclusions January 29, 2004
Format: Paperback | Amazon Verified Purchase
I have been mulling over this review for a while now, and am still undecided on the correct rating to award this book. On the one hand, Perrow offers some genuine insight into systems safety, but he frequently does not understand the technicalities of the systems (or occasionally their operators) well enough to make informed decisions and recommendations. In more egregious cases he comes to conclusions that are guaranteed to reduce safety (as when he argues that supertankers should be run by committee, and that the Captain's usefulness is at an end) or that are merely the cherished liberal opinions of an Ivy League sociologist (he teaches at Yale), as when he argues for unilateral nuclear disarmament, government-guaranteed income plans, and heroin maintenance (distribution) plans for addicts "to reduce crime."

In the case of disarmament, remember this was written during the early 1980s, while the Soviet Union was still a huge threat: complete nuclear disarmament would have resulted in fewer US nuclear accidents, but would NOT have made us safer, as we would have been totally vulnerable to intentional nuclear attack.

He has great personal animosity toward Ronald Reagan, and makes inflammatory statements in the mining section that mining safety regulations would surely be weakened by Reagan, causing many more accidents and deaths. Later in the same section, though, he concludes that mining is inherently dangerous and no amount of regulation can make it safe. So which is it? Any of this is, at very best, folly, but regardless of political bent (he is a self-avowed "leftist liberal") it has absolutely no place in a book ostensibly on safety systems.
24 of 26 people found the following review helpful
5.0 out of 5 stars Of Lasting Value, Relevant to Today's Technical Maze January 27, 2003
Edit of 2 April 2007 to add link and better summary.

I read this book when it was assigned in the 1980s as a mainstream text for graduate courses in public policy and public administration, and I still use it. It is relevant, for example, to the matter of whether we should try to use nuclear bombs on Iraq--most Americans do not realize that there has never (ever) been an operational test of a US nuclear missile from a working missile silo. Everything has been tested by the vendors or by operational test authorities that have a proven track record of falsifying test results or making the tests so unrealistic as to be meaningless.

Edit: my long-standing summary of the author's key point: simple systems have single points of failure that are easy to diagnose and fix. Complex systems have multiple points of failure that interact in unpredictable and often undetectable ways, and are very difficult to diagnose and fix. We live in a constellation of complex systems (and do not practice the precautionary principle!).

This book is also relevant to the world of software. As the Y2K panic suggested, the "maze" of software upon which vital national life-support systems depend--including financial, power, communications, and transportation software--has become very obscure as well as vulnerable. Had those creating this software been more conscious of the warnings and suggestions that the author provides in this book, America as well as other nations would be much less vulnerable to terrorism and other "acts of man" for which our insurance industry has not planned.

I agree with another reviewer who notes that this book is long overdue for a reprint--it should be updated. I recommend it "as is," but believe an updated version would be 20% more valuable.
19 of 22 people found the following review helpful
5.0 out of 5 stars Altogether a fascinating and informative book March 21, 2003
Wow. This is an incredible book. I have to admit, though, that I had some difficulty getting into Normal Accidents. There seemed an overabundance of detail, particularly in the nuclear industry's case history of calamity. This lost me, since I'm not familiar with the particulars of equipment function and malfunction. The book was mentioned, however, by two others of a similar nature, and with such reverence, that after I had finished both I returned to Perrow's book, this time with more success.
Professor Perrow is a PhD in sociology (1960) who has taught at the Yale University Department of Sociology since 1981 and whose research focus has been human/technology interactions and the effects of complexity in organizations. (His most recent publication is The AIDS Disaster: The Failure of Organizations in New York and the Nation, 1990.)
In Normal Accidents, he describes the failures that can arise "normally" in systems, i.e., those problems that are expected to arise and can be planned for by engineers, but which, by virtue of those planned fail-safe devices, immeasurably complicate and endanger the system they are designed to protect. He describes a variety of these interactions, clarifying his definitions by means of a table (p. 88) and a matrix illustration (p. 97). Examples include systems that are linear vs. complex, and loosely vs. tightly coupled. These interactions generally arise through the interplay of the various components of the system itself. According to the matrix, an illustration of a highly linear, tightly coupled system would be a dam; a complex, tightly coupled system would be a nuclear plant, etc.
The degree to which failures may occur varies with each type of organization, as does the degree to which a recovery from such a failure is possible.
Most Recent Customer Reviews
4.0 out of 5 stars tough read, very technical
This guy obviously knows his stuff. I was frankly hoping just to read about accidents, but this is much more an academic view of the subject.
Published 3 months ago by fourdegreesc
5.0 out of 5 stars Must read.
This is a riveting book. Having witnessed many of the incidents described in the book, it is refreshing to read descriptions and accounts that are not driven by the need to cover up...
Published 3 months ago by J. S. White
5.0 out of 5 stars Too big to fail, will still fail
This book paints a good picture of why failure can be inevitable in complex systems.
Nuke plants seem to blow up every decade or so regardless of complex safety systems.
Published 6 months ago by Peter Ashley
1.0 out of 5 stars Pure Dreadfulness
This book was inaccurate, wrong, and useless on every front. The author made so many technical errors in subjects that I understood that I could only conclude that he made them...
Published 8 months ago by Stuart Steele
3.0 out of 5 stars Seminal but outdated
The review by Robert I. Hedges is smack on; nothing much to add.
The work by Perrow certainly was seminal and important; however, things have actually progressed since it was...
Published 9 months ago by Arent Arntzen
5.0 out of 5 stars Rethinking Risk Analysis
Risk analysis has been an integral part of the last few decades. Certainly its application to technology has been a concentration.
Published 9 months ago by G. Strauss
2.0 out of 5 stars Lack of technical knowledge ruins a good narrative
This book looked intriguing. I have some experience in aircraft reliability design and nuclear reactor safety, and was curious about Perrow's concept of accident inevitability in...
Published 10 months ago by Josh
3.0 out of 5 stars Unforeseen interactions in complex systems explained
A bit dry and academic, but nevertheless this book explains that "crap will happen" no matter how hard we try. The author also makes a strong case against nuclear power.
Published 12 months ago by P. T. Murphy
5.0 out of 5 stars Good book
I needed this as a textbook. It came fast and was delivered as promised. There were no marks or any missing pages.
Published 13 months ago by Michael Jadooram
5.0 out of 5 stars WOW.
I was never able to link these things together. It is amazing to understand risks and their consequences by understanding processes.
Published 17 months ago by Jurij Kobal