
Buy new:
-19% $33.99
Ships from: Amazon.com Sold by: Amazon.com
Save with Used - Good
$25.95
Ships from: Amazon Sold by: designcentral
Normal Accidents: Living with High-Risk Technologies, Updated Edition
Purchase options and add-ons
Normal Accidents analyzes the social side of technological risk. Charles Perrow argues that the conventional engineering approach to ensuring safety--building in more warnings and safeguards--fails because systems complexity makes failures inevitable. He asserts that typical precautions, by adding to complexity, may help create new categories of accidents. (At Chernobyl, tests of a new safety system helped produce the meltdown and subsequent fire.) By recognizing two dimensions of risk--complex versus linear interactions, and tight versus loose coupling--this book provides a powerful framework for analyzing risks and the organizations that insist we run them.
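The interaction/coupling framework lends itself to a compact illustration. The sketch below is not from the book; the Python names and the quadrant placements are editorial assumptions, loosely paraphrasing examples Perrow discusses, and are meant only to show how the two dimensions combine.

```python
# Illustrative sketch of Perrow's two risk dimensions (not from the book).
# The example systems and their placements are assumptions for illustration.

from dataclasses import dataclass

@dataclass
class System:
    name: str
    complex_interactions: bool  # True = complex interactions, False = linear
    tight_coupling: bool        # True = tightly coupled, False = loosely coupled

def quadrant(s: System) -> str:
    interaction = "complex" if s.complex_interactions else "linear"
    coupling = "tight" if s.tight_coupling else "loose"
    return f"{interaction} interactions, {coupling} coupling"

examples = [
    System("nuclear power plant", complex_interactions=True, tight_coupling=True),
    System("dam", complex_interactions=False, tight_coupling=True),
    System("university", complex_interactions=True, tight_coupling=False),
    System("assembly-line manufacturing", complex_interactions=False, tight_coupling=False),
]

for s in examples:
    print(f"{s.name}: {quadrant(s)}")
```

On this reading, it is the upper-right quadrant, complex interactions combined with tight coupling, where Perrow locates the systems whose accidents are "normal."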
The first edition fulfilled one reviewer's prediction that it "may mark the beginning of accident research." In the new afterword to this edition Perrow reviews the extensive work on the major accidents of the last fifteen years, including Bhopal, Chernobyl, and the Challenger disaster. The new postscript probes what the author considers to be the "quintessential 'Normal Accident'" of our time: the Y2K computer problem.
- ISBN-10: 0691004129
- ISBN-13: 978-0691004129
- Edition: Updated
- Publisher: Princeton University Press
- Publication date: September 27, 1999
- Language: English
- Dimensions: 6.25 x 1 x 9.5 inches
- Print length: 464 pages
Editorial Reviews
Amazon.com Review
These are matters of common sense, applied to simple questions of cause and effect. But what happens, asks systems-behavior expert Charles Perrow, when common sense runs up against the complex systems, electrical and mechanical, with which we have surrounded ourselves? Plenty of mayhem can ensue, he replies. The Chernobyl nuclear accident, to name one recent disaster, was partially brought about by the failure of a safety system that was being brought on line, a failure that touched off an unforeseeable and irreversible chain of disruptions; the less severe but still frightening accident at Three Mile Island, similarly, came about as the result of small errors that, taken by themselves, were insignificant, but that snowballed to near-catastrophic result.
Only through such failures, Perrow suggests, can designers improve the safety of complex systems. But, he adds, those improvements may introduce new opportunities for disaster. Looking at an array of real and potential technological mishaps--including the Bhopal chemical-plant accident of 1984, the Challenger explosion of 1986, and the possible disruptions of Y2K and genetic engineering--Perrow concludes that as our technologies become more complex, the odds of tragic results increase. His treatise makes for sobering and provocative reading. --Gregory McNamee
Review
"[Perrow's] research undermines promises that 'better management' and 'more operator training' can eliminate catastrophic accidents. In doing so, he challenges us to ponder what could happen to justice, community, liberty, and hope in a society where such events are normal."---Deborah A. Stone, Technology Review
"Normal Accidents is a testament to the value of rigorous thinking when applied to a critical problem."---Nick Pidgeon, Nature
Excerpt. © Reprinted by permission. All rights reserved.
NORMAL ACCIDENTS
Living with High-Risk Technologies
By CHARLES PERROW
Princeton University Press
Copyright © 1999 Princeton University Press. All rights reserved.
ISBN: 978-0-691-00412-9
Contents
Abnormal Blessings
Introduction
1. Normal Accident at Three Mile Island
2. Nuclear Power as a High-Risk System: Why We Have Not Had More TMIs—But Will Soon
3. Complexity, Coupling, and Catastrophe
4. Petrochemical Plants
5. Aircraft and Airways
6. Marine Accidents
7. Earthbound Systems: Dams, Quakes, Mines, and Lakes
8. Exotics: Space, Weapons, and DNA
9. Living with High-Risk Systems
Afterword
Postscript: The Y2K Problem
List of Acronyms
Notes
Bibliography
Index
Chapter One
Normal Accident at Three Mile Island
Our first example of the accident potential of complex systems is the accident at the Three Mile Island Unit 2 nuclear plant near Harrisburg, Pennsylvania, on March 28, 1979. I have simplified the technical details a great deal and have not tried to define all of the terms. It is not necessary to understand the technology in any depth. What I wish to convey is the interconnectedness of the system, and the occasions for baffling interactions. This will be the most demanding technological account in the book, but even a general sense of the complexity will suffice if one wishes to merely follow the drama rather than the technical evolution of the accident.
TMI is clearly our most serious nuclear power plant accident to date. The high drama of the event gripped the nation for a fortnight, as reassurance gave way to near panic, and we learned of a massive hydrogen bubble and releases that sent pregnant women and others fleeing the area. The President of the United States toured the plant while two feeble pumps, designed for quite other duties, labored to keep the core from melting further. (One of them soon failed, but fortunately by the time the second pump failed the system had cooled sufficiently to allow for natural circulation.) The subsequent investigations and lawsuits disclosed a seemingly endless story of incompetence, dishonesty, and cover-ups before, during, and after the event; indeed, new disclosures were appearing as this book went to press. Yet, as we shall see in chapter 2 when we examine other accidents, the performance of all concerned—utility, manufacturer, regulatory agency, and industry—was about average. Rather sizeable bits and pieces of the TMI disaster can be found elsewhere in the industry; they had just never been put together so dramatically before.
Unit 2 at Three Mile Island (TMI) had a hard time getting underway at the end of 1978. Nuclear plants are always plagued with start-up problems because the system is so complex, and the technology so new. Many processes are still not well understood, and the tolerances are frightfully small for some components. A nuclear plant is also a hybrid creation—the reactor itself being complex and new and carefully engineered by one company, while the system for drawing off the heat and using it to turn turbines is a rather conventional, old, and comparatively unsophisticated system built by another company. Unit 2 may have had more than the usual problems. The maintenance force was overworked at the time of the accident and had been reduced in size during an economizing drive. There were many shutdowns, and a variety of things turned out, in retrospect, to be out of order. But one suspects that it was not all that different from other plants; after a plant sustains an accident, a thorough investigation will turn up numerous problems that would have gone unnoticed or undocumented had the accident been avoided. Indeed, in the 1982 court case where the utility, Metropolitan Edison, sued the builder of the reactor, Babcock and Wilcox, the utility charged the builder with an embarrassing number of errors and failures, and the vendor returned the favor by charging that the utility was incompetent to run their machine. But Metropolitan Edison runs other machines, and Babcock and Wilcox have built many reactors that have not had such a serious accident. We know so much about the problems of Unit 2 only because the accident at Three Mile Island made it a subject for intense study; it is probably the most well-documented examination of organizational performance in the public record. At last count I found ten published technical volumes or books on the accident alone, perhaps one hundred articles, and many volumes of testimony.
The accident started in the cooling system. There are two cooling systems. The primary cooling system contains water under high pressure and at high temperature that circulates through the core where the nuclear reaction is taking place. This water goes into a steam generator, where it bathes small tubes circulating water in a quite separate system, the secondary cooling system, and heats this water in the secondary system. This transfer of heat from the primary to the secondary system keeps the core from overheating, and uses the heat to make steam. Water in the secondary system is also under high pressure until it is called upon to turn into steam, which drives the turbines that generate the electric power. The accident started in the secondary cooling system.
The water in the secondary system is not radioactive (as is the water in the primary system), but it must be very pure because its steam drives the finely precisioned turbine blades. Resins get into the water and have to be removed by the condensate polisher system, which removes particles that are precipitated out.
The polisher is a balky system, and it had failed three times in the few months the new unit had been in operation. After about eleven hours of work on the system, at 4:00 A.M. on March 28, 1979, the turbine tripped (stopped). Though the operators did not know why at the time, it is believed that some water leaked out of the polisher system—perhaps a cupful—through a leaky seal.
Seals are always in danger of leaking, but normally it is not a problem. In this case, however, the moisture got into the instrument air system of the plant. This is a pneumatic system that drives some of the instruments. The moisture interrupted the air pressure applied to two valves on two feedwater pumps. This interruption "told" the pumps that something was amiss (though it wasn't) and that they should stop. They did. Without the pumps, the cold water was no longer flowing into the steam generator, where the heat of the primary system could be transferred to the cool water in the secondary system. When this flow is interrupted, the turbine shuts down, automatically—an automatic safety device, or ASD.
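The chain just described, from a cupful of leaked water to an automatic turbine trip, is the kind of tight coupling the book is about. As a purely illustrative sketch (the component names and dependency edges below are simplified assumptions, not the plant's actual control logic), the propagation can be written out like this:

```python
# Toy propagation of one small fault through tightly coupled components.
# Component names and dependencies are simplified assumptions for illustration.

failure_propagates_to = {
    "polisher seal leak": ["moisture in instrument air"],
    "moisture in instrument air": ["spurious 'stop' signal to feedwater valves"],
    "spurious 'stop' signal to feedwater valves": ["feedwater pumps stop"],
    "feedwater pumps stop": ["no cold water to the steam generator"],
    "no cold water to the steam generator": ["turbine trips (automatic safety device)"],
}

def cascade(initial_fault: str) -> list[str]:
    """Return the ordered chain of effects triggered by one initial fault."""
    chain, frontier = [], [initial_fault]
    while frontier:
        current = frontier.pop(0)
        if current in chain:
            continue
        chain.append(current)
        frontier.extend(failure_propagates_to.get(current, []))
    return chain

print("\n".join(cascade("polisher seal leak")))
```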
But stopping the turbine is not enough to render the plant safe. Somehow, the heat in the core, which makes the primary cooling system water so hot, has to be removed. If you take a whistling tea kettle off the stove and plug its opening, the heat in the metal and water will continue to produce steam, and if it cannot get out, it may explode. Therefore, the emergency feedwater pumps came on (they are at H in Figure 1.1; the regular feedwater pumps which just stopped are above them in the figure). They are designed to pull water from an emergency storage tank and run it through the secondary cooling system, compensating for the water in that system that will boil off now that it is not circulating. (It is like pouring cold water over your plugged tea kettle.) However, these two pipes were unfortunately blocked; a valve in each pipe had been accidentally left in a closed position after maintenance two days before. The pumps came on and the operator verified that they did, but he did not know that they were pumping water into a closed pipe.
The President's Commission on the Accident at Three Mile Island (the Kemeny Commission) spent a lot of time trying to find out just who was responsible for leaving the valves closed, but they were unsuccessful. Three operators testified that it was a mystery to them how the valves had gotten closed, because they distinctly remembered opening them after the testing. You probably have had the same problem with closing the freezer door or locking the front door; you are sure you did, because you have done it many times. Operators testified at the Commission's hearings that with hundreds of valves being opened or closed in a nuclear plant, it is not unusual to find some in the wrong position—even when locks are put on them and a "lock sheet" is maintained so the operators can make an entry every time a special valve is opened or closed.
Accidents often involve such mysteries. A safety hatch on a Mercury spacecraft prematurely blew open (it had an explosive charge for opening it) as the recovery helicopter was about to pick it up out of the water after splashdown. Gus Grissom, the astronaut, insisted afterwards that he hadn't fired it prematurely or hit it accidentally. It just blew by itself. (He almost drowned.) It is the old war between operators and the equipment others have designed and built. The operators say it wasn't their fault; the designers say it wasn't the fault of the equipment or design. Ironically, the astronauts had insisted upon the escape hatch being put in as a safety device in case they had to exit rapidly; it is not the only example we shall uncover of safety devices increasing the chances of accidents. The Three Mile Island operators finally had to concede reluctantly that large valves do not close by themselves, so someone must have goofed.
There were two indicators on TMI's gigantic control panel that showed that the valves were closed instead of open. One was obscured by a repair tag hanging on the switch above it. But at this point the operators were unaware of any problem with emergency feedwater and had no occasion to make sure those valves, which are always open except during tests, were indeed open. Eight minutes later, when they were baffled by the performance of the plant, they discovered it. By then much of the initial damage had been done. Apparently our knowledge of these plants is quite incomplete, for while some experts thought the closed valves constituted an important operator error, other experts held that it did not make much difference whether the valves were closed or not, since the supply of emergency feedwater is limited and worse problems were appearing anyway.
With no circulation of coolant in the secondary system, a number of complications were bound to occur. The steam generator boiled dry. Since no heat was being removed from the core, the reactor "scrammed." In a scram the graphite control rods, 80 percent silver, drop into the core and absorb the neutrons, stopping the chain reaction. (In the first experiments with chain reactions, the procedure was the same—"drop the rods and scram"; thus the graphic term scram for stopping the chain reaction.) But that isn't enough. The decaying radioactive materials still produce some heat, enough to generate electricity for 18,000 homes. The "decay heat" in this 40-foot-high stainless steel vessel, taller than a three-story building, builds up enormous temperature and pressure. Normally there are thousands of gallons of water in the primary and secondary cooling systems to draw off the intense heat of the reactor core. In a few days this cooling system should cool down the core. But the cooling system was not working.
There are, of course, ASDs to handle the problem. The first ASD is the pilot-operated relief valve (PORV), which will relieve the pressure in the core by channeling the water from the core through a big vessel called a pressurizer, and out the top of it into a drain pipe (called the "hot leg"), and down into a sump. It is radioactive water and is very hot, so the valve is a nuisance. Also, it should only be open long enough to relieve the pressure; if too much water comes through it, the pressure will drop so much that the water can flash into steam, creating bubbles of steam, called steam voids, in the core and the primary cooling pipes. These bubbles will restrict the flow of coolant, and allow certain spots to get much hotter than others—in particular, spots by the uranium rods, allowing them to start fissioning again.
The PORV is also known by its Dresser Industries' trade name of "electromatic relief valve." (Dresser Industries is the firm that sponsored ads shortly after the accident saying that actress Jane Fonda was more dangerous than nuclear plants. She was starring in the China Syndrome, a popular movie playing at the time that depicted a near meltdown in a nuclear plant.) It is expected to fail once in every fifty usages, but on the other hand, it is seldom needed. The President's Commission turned up at least eleven instances of it failing in other nuclear plants (to the surprise of the Nuclear Regulatory Commission and the builder of the reactor, Babcock and Wilcox, who only knew of four) and there had been two earlier failures in the short life of TMI-Unit 2. Unfortunately, it just so happened that this time, with the block valves closed and one indicator hidden, and with the condensate pumps out of order, the PORV failed to reseat, or close, after the core had relieved itself sufficiently of pressure.
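Perrow's figure of roughly one failure in fifty usages invites a quick back-of-the-envelope calculation. The sketch below assumes each demand on the valve is independent, which is itself a simplification, and simply shows how fast a 2 percent per-demand failure rate accumulates:

```python
# Back-of-the-envelope arithmetic for a valve that fails about once per fifty demands.
# Treating demands as independent is a simplifying assumption.

p_fail = 1 / 50  # per-demand failure probability cited in the text

for demands in (1, 10, 25, 50):
    p_at_least_one = 1 - (1 - p_fail) ** demands
    print(f"{demands:>2} demands: P(at least one failure) = {p_at_least_one:.1%}")
```

Even under these generous assumptions, the chance of at least one failure passes 60 percent by the fiftieth demand, which is why a "seldom needed" valve can still be a weak link.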
This meant that the reactor core, where the heat was building up because the coolant was not moving, had a sizeable hole in it—the stuck-open relief valve. The coolant in the core, the primary coolant system, was under high pressure, and was ejecting out through the stuck valve into a long curved pipe, the "hot leg," which went down to a drain tank. Thirty-two thousand gallons, one third of the capacity of the core, would eventually stream out. This was no small pipe break someplace as the operators originally thought; the thing was simply uncorked, relieving itself when it shouldn't.
Since there had been problems with this relief valve before (and it is a difficult engineering job to make a highly reliable valve under the conditions in which it must operate), an indicator had recently been added to the valve to warn operators if it did not reseat. The watchword is "safety" in nuclear plants. But, since nothing is perfect, it just so happened that this time the indicator itself failed, probably because of a faulty solenoid, a kind of electromagnetic toggle switch. Actually, it wasn't much of an indicator, and the utility and supplier would have been better off to have had none at all. Safety systems, such as warning lights, are necessary, but they have the potential for deception. If there had been no light assuring them the valve had closed, the operators would have taken other steps to check the status of the valve, as operators did in a similar accident at another plant a year and a half before. But if you can't believe the lights on your control panel, an army of operators would be necessary to check every part of the system that might be relevant. And one of the lessons of complex systems and TMI is that any part of the system might be interacting with other parts in unanticipated ways.
The indicator sent a signal to the control board that the valve had received the impulse to shut down. (It was not an indication that the valve had actually shut down; that would be much harder to provide.) So the operators noted that all was fine with the PORV, and waited for reactor pressure to rise again, since it had dropped quickly when the valve opened for a second. The cork stayed off the vessel for two hours and twenty minutes before a new shift supervisor, taking a fresh look at the problems, discovered it.
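The subtlety here, a light that confirms the command rather than the valve's actual position, is easy to state in code. What follows is a minimal sketch with invented class, method, and field names, under the assumption of a single valve and a single panel light:

```python
# Illustrative sketch of an indicator that reports the command, not the state.
# Names are invented for illustration; this is not the plant's actual logic.

class ReliefValve:
    def __init__(self) -> None:
        self.close_impulse_sent = False
        self.actually_closed = False  # the mechanism may fail to follow the impulse

    def command_close(self, mechanism_works: bool) -> None:
        self.close_impulse_sent = True
        if mechanism_works:
            self.actually_closed = True

    def panel_light_says_closed(self) -> bool:
        # The TMI-style indicator confirmed only that the impulse was sent.
        return self.close_impulse_sent

valve = ReliefValve()
valve.command_close(mechanism_works=False)  # the valve sticks open
print(valve.panel_light_says_closed())      # True: operators see "closed"
print(valve.actually_closed)                # False: coolant keeps escaping
```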
We are now, incredibly enough, only thirteen seconds into the "transient," as engineers call it. (It is not a perversely optimistic term meaning something quite temporary or transient, but rather it means a rapid change in some parameter, in this case, temperature.) In these few seconds there was a false signal causing the condensate pumps to fail, two valves for emergency cooling out of position and the indicator obscured, a PORV that failed to reseat, and a failed indicator of its position. The operators could have been aware of none of these.
(Continues...)
Excerpted from NORMAL ACCIDENTS by CHARLES PERROW. Copyright © 1999 by Princeton University Press. Excerpted by permission of Princeton University Press. All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.
Excerpts are provided by Dial-A-Book Inc. solely for the personal use of visitors to this web site.
Product details
- Publisher : Princeton University Press; Updated edition (September 27, 1999)
- Language : English
- Paperback : 464 pages
- ISBN-10 : 0691004129
- ISBN-13 : 978-0691004129
- Item Weight : 1.45 pounds
- Dimensions : 6.25 x 1 x 9.5 inches
- Best Sellers Rank: #300,993 in Books (See Top 100 in Books)
- #34 in Technology Safety & Health
- #34 in Industrial Health & Safety
- #157 in Safety & First Aid (Books)
About the author

Charles Perrow is professor emeritus of sociology at Yale University and visiting professor at Stanford University. His interests include the development of bureaucracy in the 19th century, protecting the nation’s critical infrastructure, the prospects for democratic work organizations, and the origins of American capitalism.
Customer reviews
Customers say
Customers find the book interesting and informative, providing a good introduction to complex systems. They appreciate the insightful framework and analysis. However, some readers feel the content is slightly dated. Opinions differ on the pacing - some find it well-written and edited, while others consider it a tough read with typographical errors.
AI-generated from the text of customer reviews
Customers find the book interesting and useful. They say it provides good information and clear summaries on complex systems. The book is a landmark that explains exactly why disasters occur. It's a must-read for engineers, and some of the lessons are timeless.
"The material is a bit dated, but the lessons are timeless. Perrow references an old adage that "Man's reach always exceeds his grasp"...."
"...a desire to understand high risk systems, this book has a lot of good information in it; however it is woefully out of date and for that reason..."
"...A must read for any Engineer."
"...I settled on his compromise of three stars. This book has been extremely influential, in fact it probably represents the dominant academic and..."
Customers find the book provides an insightful framework and analysis of complex systems. They appreciate the refreshing descriptions and accounts that are not biased. The book provides a good overview of modern technology, how it's used, and its risks. Readers also mention that the Aviation and Space treatment is good in variety and detail presented. Overall, customers find the book instructive and entertaining.
"This book introduces a theory of complex systems and why they may be more accident prone than non-complex ("linear") systems...."
"...Perrow presents a wide variety of remarkably well researched accidents across many industries to illustrate the problems with our complex modern..."
"...The book contains the finest analysis commercially available of the Three Mile Island near-disaster, and his insight about how to improve safety in..."
"...With that in mind I thought the Aviation and Space treatment was good in variety and detail presented...."
Customers have different views on the pacing of the book. Some find it well-written and edited, with a strong sense of danger. Others describe it as a tough read, very technical, and not a page-turner.
"...The section on maritime accidents and safety is superbly written...."
"...Certainly not a page turner. Only managed to get through the first 65%. Glad I rented, as it certainly wasn't worth owning...."
"Well written and edited. Really highlights just what a flimsy house of cards we have built...."
"...this is not light reading and is not a novel, it is non fiction but if you are interested in the one original review about why planes crash, nuclear..."
Customers find the book's content somewhat dated, but still relevant.
"The material is a bit dated, but the lessons are timeless. Perrow references an old adage that "Man's reach always exceeds his grasp"...."
"...this book has a lot of good information in it; however it is woefully out of date and for that reason among others, I can only recommend it with..."
"...The issue I see with the book is that the information seems to be dated...."
"The book is dated...."
Top reviews from the United States
- Reviewed in the United States on June 20, 2023
This book introduces a theory of complex systems and why they may be more accident prone than non-complex ("linear") systems. Perrow then explores how the theory plays out in multiple industries, which all vary in interesting ways: nuclear power, chemical plants/refineries, air travel, marine transport, mining/dams, and more.
I work in high tech (large scale online systems, think Big Tech internet/cloud computing) and the theory described in this book is extremely prescient in anticipating the sort of technical problems we sometimes encounter today -- despite having been written 40 years ago. I wish I had read this years ago, to be honest.
My one minor complaint with the book: the 1999 reprint, which includes a pretty hastily assembled and speculative Y2K treatment (written before the fact) features a photo of the Challenger accident on the cover. In fact the Challenger is not a good example of a system accident (it was just a bad component design, plus institutional failure/pressure to keep flying it). A much better example would be a nuclear meltdown (e.g. Chernobyl), which Perrow predicted multiple times; again his text being written in 1982-1983 mainly. Anyhow, the publisher can be forgiven for insufficiently thought-through cover photo selection.
- Reviewed in the United States on August 13, 2018
Perrow provides an insightful framework for understanding the complex systems we live with every day and the ways they fail. Unfortunately, he spends his conclusion trying to make policy recommendations that aren't actually well supported by his own framework.
I highly recommend nearly all of this book for anyone that will be designing, operating, or criticizing complex systems. Aside from poorly explained comments on nuclear criticality accidents and naval nuclear reactors, the earlier chapters of the book are technically quite sound, and Perrow's framework is a good starting point to think about how to make complex systems as safe and resilient as possible.
- Reviewed in the United States on May 4, 2011
The material is a bit dated, but the lessons are timeless. Perrow references an old adage that "Man's reach always exceeds his grasp". That seems to be so true when we look at today's catastrophes such as the Gulf oil spill or the financial meltdown. Perrow presents a wide variety of remarkably well researched accidents across many industries to illustrate the problems with our complex modern systems. He makes reasonable recommendations as to how to deal with these problems, and along the way, he points out the major flaws of accident investigation boards and the risk assessment profession. Read this book! It's both instructional and (morbidly) entertaining.
- Reviewed in the United States on January 29, 2004
I have been mulling over this review for a while now, and am still undecided on the correct rating to award this book. On the one hand Perrow offers some genuine insight into systems safety, but frequently does not understand the technicalities of the systems (or occasionally their operators) well enough to make informed decisions and recommendations. In more egregious cases he comes to conclusions that are guaranteed to reduce safety (as when he argues that supertankers should be run by committee, and the usefulness of the Captain is no more) or are merely the cherished liberal opinions of an Ivy League sociologist (he teaches at Yale) as when he argues for unilateral nuclear disarmament, government guaranteed income plans, and heroin maintenance (distribution) plans for addicts "to reduce crime." In the case of disarmament, remember this was written during the early 1980s while the Soviet Union was still a huge threat...complete nuclear disarmament would have resulted in fewer US nuclear accidents, but would NOT have made us safer as we would have been totally vulnerable to intentional nuclear attack. He has great personal animosity toward Ronald Reagan, and makes inflammatory statements in the mining section that mining safety regulations would surely be weakened by Reagan, causing many more accidents and deaths. Later in the same section, though, he concludes that mining is inherently dangerous, and no amount of regulation can make it safe. So which is it? Any of this is, at very best, folly, but regardless of political bent (he is a self avowed "leftist liberal") has absolutely no place in a book ostensibly on safety systems. As such I think portions of this book show what is so wrong in American academia today: even genuinely excellent research can be easily spoiled when the conclusions are known before the research is started. This is one of the many reasons that physical scientists scorn the social sciences, and it doesn't have to be this way.
Having said all that there IS a wealth of good information and insight in this book when Perrow sticks to systems and their interactions. The book contains the finest analysis commercially available of the Three Mile Island near-disaster, and his insight about how to improve safety in nuclear plants was timely when the book was written in 1984, though many improvements have been made since then.
Speaking as a commercial airline pilot, I feel his conclusions and observations about aircraft safety were generally true at the time of printing in 1984, but now are miserably out of date. (The same is true of the Air Traffic Control section.) I believe that he generally has a good layman's grasp of aviation, so I am willing to take it as a given that he has a knowledgeable layman's comprehension of the other systems discussed. As an aside, he never gets some of the technicalities quite right. For instance, he constantly uses the term 'coupling' incorrectly in the engineering sense; this is particularly objectionable in the aviation system where it has a very specific meaning to aeronautical engineers and pilots.
The section on maritime accidents and safety is superbly written. Here I am not an expert, but there seems to be a high degree of correlation with the aviation section. His section on "Non Collision Course Collisions" by itself makes this book a worthwhile read. He presents very compelling information and reasoning until the very end of the section, at which point he suggests that since ships are now so big, large ships (especially supertankers) essentially should have no Captain, but should be run by committee. This is an invalid conclusion, and he offers no evidence or substantial argument to support that idea. Clearly, it is an idea hatched in his office and not on a ship (or plane.) There always needs to be a person in a place of ultimate authority in fast moving, dynamic systems, or the potential exists to have crew members begin to work at direct odds with each other, making a marginal situation dangerous. Ironically, in the very same part of the discussion where he concludes that there should be no Captain, he has hit upon the key to the problem. He mentions that he was pleased to see that some European shippers were now training their crews together as a team, and that he expected this to lower accident rates. He is, in fact, exactly right about that. Airlines now have to train crews in Crew Resource Management (CRM) in which each member of the crew has the right and obligation to speak up if they notice anything awry in the operation of their aircraft, and the Captain makes it a priority to listen to the input of others, as everyone has a different set of concerns and knowledge. In this way, the Captain becomes much less dictatorial, and becomes more of a final decision maker after everyone has had their say. It IS critical, though, to maintain someone in command, as there is no time to assemble a staff meeting when a ship is about to run aground, or a mid-air collision is about to occur. Many other well documented studies and books have come to this conclusion, and in the airline industry since CRM was introduced the accident rate has decreased dramatically.
Overall, if you have a desire to understand high risk systems, this book has a lot of good information in it; however it is woefully out of date and for that reason among others, I can only recommend it with reservations. A better and much more contemporary introductory book on the subject is 'Inviting Disaster' by James R. Chiles. Remember, this book was written over twenty years ago, and much has changed since then. There is knowledge to be gleaned here, but you have to be prepared to sort the wheat from the chaff.
Top reviews from other countries
Seajay - Reviewed in the United Kingdom on September 14, 2021
5.0 out of 5 stars: An accident theory classic
I'm a safety engineer, so this stuff is my bread and butter. I can tell you that this is a highly respected book in the field. A little dated now so don't take everything it says as gospel but it's essential reading to understand where modern safety systems came from.
Charles Reid - Reviewed in Canada on July 19, 2017
5.0 out of 5 stars: Dated but worthwhile!
A fine old book. Glad I could connect with a great quality used copy. It will make you think in different ways.
Chris Dunn - Reviewed in Australia on August 31, 2023
3.0 out of 5 stars: Content Excellent
Great read, and as valid today as it was in 1979. Printing quality in the book is poor, especially for the price.
ISGB - Reviewed in Italy on December 3, 2015
5.0 out of 5 stars: Illuminating
Interesting analysis.
N&M - Reviewed in Japan on October 17, 2013
5.0 out of 5 stars: Useful for studying TEPCO's Fukushima Daiichi nuclear accident
A work by an author who was once a bestseller in organizational sociology. It reads as if it foresaw TEPCO's nuclear accident and the JR West Fukuchiyama Line derailment. Reading it makes it hard to keep living and working in the Tokyo metropolitan area. It also reveals the limits of Toyota's rapid growth beyond its means, and of the urban functions of Nagoya, which strove to modernize to excess. Japanese companies that have expanded into Thailand and China will probably come to regret it. Essential reading for anyone working at the companies, schools, public facilities, and transportation networks of a metropolitan region giddy over its Olympic bid. People living and doing business along the Chukyo, Kansai, and Shikoku coasts, where earthquake damage is expected, should also read it so they do not make the "wrong" preparations.
