The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution Hardcover – October 7, 2014
Top customer reviews
The bad news: this book will serve as the definitive history of computing through the first decade of the 21st century. It is in places technically wrong, misses some of the key threads in computing history, and starts with a premise (that innovation comes from collaboration) and then attempts to write history to fit it.
The difference between a reporter and a historian is that one does a superficial run-through of a Rolodex of contacts and the other tries to find the truth. Unfortunately, Isaacson's background as a reporter for Time and CNN makes this "history" feel like he was comfortable going through his Rolodex of "Silicon Valley" sources, connecting interviews, and calling it history.
I'm sure Isaacson would claim, "more details get in the way of a good story," but that is exactly the difference between a throwaway story on CNN and a well-written history. The same epic sweep could have embraced and acknowledged the other threads that Isaacson discarded. The gold standard for a technical history is Richard Rhodes' "The Making of the Atomic Bomb."
(Other reviewers have pointed out several critical missing parts of computing history. I'll add one more. While perpetuating the "Intel invented the microprocessor" story makes great business press copy, it's simply wrong. Intel commercialized something they knew someone else had already done. Lee Boysel at Four Phase invented the first microprocessor. If Isaacson had done his homework he would have found out that Bob Noyce was on the Four Phase board, knew about the chip, and encouraged Intel to commercialize the concept.)
Finally, one of the "facts" in this book that differentiates reporting from history is the garbled bio of Donald Davies, one of the key inventors of packet switching. Davies is described as "during the war he worked at Birmingham University creating alloys for nuclear weapons tubes..." I started laughing when I read that sentence. It's clear Isaacson had no idea what Davies did in WWII. He obviously found a description of Davies' war work, didn't understand it, and re-edited it into something accidentally amusing - and revealing. What Davies actually did during the war was work on the British nuclear weapons program - codenamed "Tube Alloys."
Understanding the distinction is the difference between a reporter and a historian.
Isaacson, always interested in what makes some people truly significant and others merely dreamers or money makers, focuses on the ability of computers to complement, rather than replace, human intelligence. He also observes that the major figures in computing were able to blend insights from the humanities and sciences and tended to work in close collaboration with others. The lonely creative genius turns out, at least in computing, to be mostly a myth.
The book travels rather well-trodden ground and is not a book for those who want an understanding of the development of computer science. But if you are interested in sketches--almost universally positive, as is Isaacson's style--of the major figures in computing, along with a simple explanation of why they're important, this book is a good purchase.
Isaacson's prose is easy to read--I read the whole book in less than a day--which means that the book is not only a worthy exercise in lifetime learning but a pleasurable experience as well. I would have preferred more technical descriptions of computer science, but I work in data analytics, so my opinion may not accord with the majority of readers.
Somewhat simplistic and too universally positive, but still an interesting survey of the major figures in computing. Not life changing, but I can think of worse ways to spend nine hours than reading a work with a subject this interesting and prose this polished.