Edit of 2 April 2007 to add link and better summary.
I read this book when it was assigned in the 1980s as a mainstream text for graduate courses in public policy and public administration, and I still use it. It is relevant, for example, to the matter of whether we should try to use nuclear bombs on Iraq--most Americans do not realize that there has never (ever) been an operational test of a US nuclear missile from a working missile silo. Everything has been tested by the vendors or by operational test authorities with a proven track record of falsifying test results or making the tests so unrealistic as to be meaningless.
Edit: my long-standing summary of the author's key point: Simple systems have single points of failure that are easy to diagnose and fix. Complex systems have multiple points of failure that interact in unpredictable and often undetectable ways, and are very difficult to diagnose and fix. We live in a constellation of complex systems (and do not practice the precautionary principle!).
This book is also relevant to the world of software. As the Y2K panic suggested, the "maze" of software upon which vital national life support systems depend--including financial, power, communications, and transportation software--has become very obscure as well as vulnerable. Had those creating this software been more conscious of the warnings and suggestions that the author provides in this book, America as well as other nations would be much less vulnerable to terrorism and other "acts of man" for which our insurance industry has not planned.
I agree with another reviewer who notes that this book is long overdue for a reprint--it should be updated. I recommend it "as is," but believe an updated version would be 20% more valuable.
Edit: this book is still valuable, but the author has given us the following in 2007: The Next Catastrophe: Reducing Our Vulnerabilities to Natural, Industrial, and Terrorist Disasters