- Hardcover: 384 pages
- Publisher: Addison-Wesley Professional; 1 edition (December 9, 2007)
- Language: English
- ISBN-10: 0321477898
- ISBN-13: 978-0321477897
- Product Dimensions: 6.3 x 1.2 x 9.3 inches
- Shipping Weight: 1.3 pounds
- Average Customer Review: 17 customer reviews
- Amazon Best Sellers Rank: #1,714,234 in Books
Geekonomics: The Real Cost of Insecure Software 1st Edition
About the Author
David Rice is an internationally recognized information security professional and an accomplished educator and visionary. For a decade he has advised, counseled, and defended global IT networks for government and private industry. David has been recognized by the U.S. Department of Defense for "significant contributions" advancing the security of critical national infrastructure and global networks. Additionally, David has authored numerous IT security courses and publications, teaches for the prestigious SANS Institute, and has served as adjunct faculty at James Madison University. He is a frequent speaker at information security conferences and is currently Director of The Monterey Group.
Excerpt. © Reprinted by permission. All rights reserved.
You may or may not have an inkling of what insecure software is, how it impacts your life, or why you should be concerned. That is OK. This book attempts to introduce you to the full scope and consequence of software's impact on modern society without baffling you with jargon only experts understand or minutiae only experts care about. The prerequisite for this book is merely a hint of curiosity.
Although we interact with software on a daily basis, carry it on our mobile phones, drive with it in our cars, fly with it in our planes, and use it in our home and business computers, software itself remains essentially shrouded: a ghost in the machine; a mystery that functions, but only part of the time. And therein lies our problem.
Software is the stuff of modern infrastructure. Not only is software infused into a growing number of commercial products we purchase and services we use, but government increasingly uses software to manage the details of our lives, to allocate benefits and public services we enjoy as citizens, and to administer and defend the state as a whole. How and when we touch software and how and when it touches us is less our choice every day. The quality of this software matters greatly; the level of protection this software affords us from harm and exploitation matters even more.
As a case in point, in mid-2007 the country of Estonia, dubbed "the most wired nation in Europe" because of its pervasive use of computer networks for a wide array of private and public activities, had a significant portion of its national infrastructure crippled for over two weeks by cyber attacks launched from hundreds of thousands of individual computers that had been previously hijacked by Russian hackers. Estonia was so overwhelmed by the attacks that Estonian leaders literally severed the country's connection to the Internet, and with it the country's economic and communications lifeline to the rest of the world. As one Estonian official lamented, "We are back to the stone age." The reason for the cyber attack? The Russian government objected to Estonia's relocation of a Soviet-era war memorial from the center of its capital, Tallinn, to a military cemetery.
The hundreds of thousands of individual computers that took part in the attack belonged to innocents: businesses, governments, and home users located around the world, unaware their computers were being used as weapons against another nation and another people. Such widespread hijacking was made possible in large part by insecure software: software that, due to insufficient manufacturing practices, contains defects that allow, among other things, hackers to hijack and remotely control computer systems. Traditional defensive measures employed by software buyers, such as firewalls, anti-virus, and software patches, did little to help Estonia and nothing to correct the software manufacturing practices that enabled the attacks in the first place.
During the same year, an experienced "security researcher" (a euphemism for a hacker) from IBM's Internet Security Systems was able to remotely break into and hijack computer systems controlling a nuclear power plant in the United States. The plant's owners claimed their computer systems could not be accessed from the Internet. The owners were wrong. As the security researcher later stated after completing the exercise, "It turned out to be the easiest penetration test I'd ever done. By the first day, we had penetrated the network. Within a week, we were controlling a nuclear power plant. I thought, 'Gosh, this is a big problem.'"
Indeed it is.
According to IDC, a global market intelligence firm, 75 percent of computers with access to the Internet have been infected and are actively being used without their owners' knowledge to conduct cyber attacks, distribute unwanted email (spam), and support criminal and terrorist activities. To solely blame hackers, or hundreds of thousands of innocent computer users, or misinformed (and some might say "sloppy") power plant owners for the deplorable state of cyber security is shortsighted and distracts from the deeper issue. The proverbial butterfly that flaps its wings in Brazil, causing a storm somewhere far away, is no match for the consequences brought about by the seemingly innocuous foibles of software manufacturers. As one analyst commented regarding insecure software as it related to the hijacking of the nuclear reactor's computer systems, "These are simple bugs, mistakes in software, but very dangerous ones."
The story of Estonia, the nuclear reactor, and thousands of similar news stories merely hint at the underlying problem of modern infrastructure. The "big problem" is insecure software, and insecure software is everywhere. From the iPhone (in which a critical weakness was discovered merely two weeks after its release) to our laptops, from the Xbox to public utilities, from home computers to financial systems, insecure software is interconnected and woven more tightly into the fabric of civilization with each passing day, and with it comes, as former U.S. Secretary of Defense William Cohen observed, an unprecedented level of vulnerability. Insecure software is making us fragile, vulnerable, and weak.
The threat of global warming might be on everyone's lips, and the polar ice caps might indeed melt, but not for some time. What is happening right now because of the worldwide interconnection of insecure software gives social problems once limited by geography a new destructive range. Cyber criminals, terrorists, and even nation states are currently preying on millions upon millions of computer systems (and their owners) and using the proceeds to underwrite further crime, economic espionage, warfare, and terror. We are only now beginning to realize the enormity of the storm set upon us by the tiny fluttering of software manufacturing mistakes and the economic and social costs such mistakes impose. In 2007, "bad" software cost the United States roughly $180 billion; this amount represents nearly 40 percent of the U.S. military defense budget for the same year ($439 billion), or 80 percent more than the estimated cost to the U.S. economy ($100 billion) of Hurricane Katrina, the costliest storm to hit the United States since Hurricane Andrew.1
Since the 1960s, individuals both within and outside the software community have worked hard to improve the quality, reliability, and security of software. Smart people have been looking out for you. For this, they should be commended. But the results of their efforts are mixed.
After 40 years of collaborative effort with software manufacturers to improve software quality, reliability, and security, Carnegie Mellon's Software Engineering Institute (SEI), an important contributor to software research and improvement, declared in the year 2000 that software was getting worse, not better. Such an announcement by SEI is tantamount to the U.S. Food and Drug Administration warning that food quality in the twenty-first century is poorer now than when Upton Sinclair wrote The Jungle in 1906.2 Unlike progress in the vast majority of areas related to consumer protection and national security, progress against "bad" software has been fitful at best.
While technical complications in software manufacturing might be partly to blame for the sorry state of software, this book argues that even if effective technical solutions were widely available, market incentives do not work for, but against, better, more secure software. This has worrisome consequences for us all.
Incentives matter. Human beings are notoriously complex and fickle creatures who will do whatever it takes to make themselves better off. There is nothing intrinsically wrong with this behavior. Looking out for one's own best interests is what normal, rational human beings are wont to do. However, the complication is that society is a morass of competing, misaligned, and conflicting incentives that leads to all manner of situations where one individual...
Top customer reviews
This is an easily readable book, no deep tech knowledge necessary, written from a well-researched, mostly legal point of view, addressing those concerns. A must-read wake-up call for anyone thinking about such matters.
I add this line to my previous review after buying 2 more of these books as gifts for policy makers.
I wish that the author would improve future editions of the book with more examples. The comments are insightful, but part of me wishes to see statistics to back up what the author is saying.
While the examples the author uses to relate standards of other industries help with understanding the fragile state of software and why this is so alarming, it does make the book drag on at times. There is a good possibility the reader will finish the book knowing more about several topics besides software.
The title of the book is completely misleading and has a very tenuous connection to the actual contents. "Geekonomics" would seem to imply some extent of actual economic analysis of insecure software, but there is exactly zero in the book along those lines. There is nothing to educate or assist a person attempting to perform a software security cost-benefit analysis or to calculate the residual risk of a vulnerability, which would have been the expectation based on the title. Additionally, the subtitle "The Real Cost of Insecure Software" would seem to indicate that the book is focused on software security, but the primary examples and arguments presented actually concern software safety rather than security. These are two distinct and separate fields of study, and the author does not seem to understand or appreciate the difference. This is a serious flaw.
The primary thesis of the book is that software is bad, that the people and companies who provide software are negligent, and that licensing of software developers and unrestrained liability in the courts are the solution. Clearly the author is passionate along these lines, but this simplistic solution is presented without any actual supporting evidence that it would solve the problems outlined in the book. The author fails to identify or cite any examples of software done "correctly" and does not establish any credibility that he has experience either developing or marketing software products.
The ironic thing to me is that I really wanted to like the book, and I actually appreciate some of the out-of-the-box "Big Ideas" presented. For example, the concept of recognizing and mitigating "Software of Unknown Pedigree" (SOUP) is of particular concern to software security engineers, and I believe the author may deserve credit for transitioning the nomenclature from the FDA document where it was coined to the broader security community; that acronym has been popping up at more and more security conferences since the book was published. Other ideas fell flat for me, such as comparing the licensing of prostitutes in Nevada to the lack of licensing of software engineers, which was entertaining but never actually seemed to make a salient point.
I can see how the book would have great appeal to presenters at security conferences, as many of the outlandish analogies and examples in the book would make for entertaining presentation source material. Indeed, I purchased the book on the recommendation of one such speaker and it clearly has its fans.