Amazon.com: Customer Reviews: Turing's Cathedral: The Origins of the Digital Universe

The physicist John Wheeler, famous for his neologisms, once remarked that the essence of the universe could be boiled down to the phrase "it from bit", signifying the creation of matter from information. This description encompasses the digital universe which now so completely pervades our existence. Many moments in history could lay claim to being the origin of this universe, but as George Dyson marvelously documents in "Turing's Cathedral", the period between 1945 and 1957 at the Institute for Advanced Study (IAS) in Princeton is as good a candidate as any.

Dyson's book focuses on the pioneering development of computing during the decade after World War II and essentially centers on one man: John von Neumann. Von Neumann is one of the very few people in history to whom the label "genius" can authentically be applied. The sheer diversity of fields to which he made important contributions beggars belief; Wikipedia lists at least twenty, ranging from quantum mechanics to game theory to biology. Von Neumann's mind ranged across a staggeringly wide expanse of thought, from the purest of mathematics to the most applied nuclear weapons physics. The book recounts the pathbreaking efforts of von Neumann and his team to build a novel computer at the IAS in the late 1940s. Today, when we are immersed in a sea of computer-generated information, it is easy to take the essential idea of a computer for granted. That idea was not the transistor or the integrated circuit or even the programming language, but the groundbreaking notion that you could have a machine where both data AND the instructions for manipulating that data could be stored in the same place, encoded in a common binary language. That was von Neumann's great insight, which built upon Alan Turing's basic abstract idea of a computing machine. The resulting concept of a stored program is at the foundation of every single computer in the world. The IAS computer practically validated this concept and breathed life into our modern digital universe. By present standards its computing power was vanishingly small, but the technological future it unleashed has been limitless.
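
(To make the stored-program notion concrete, here is a minimal illustrative sketch in Python. The three-cell instruction format is invented for this example and bears no relation to the IAS machine's actual order code; the point is only that instructions and data are indistinguishable numbers held in one shared memory.)

    def run(memory):
        """Execute a program stored in the same list that holds its data."""
        pc = 0  # program counter: the memory address of the next instruction
        while True:
            op, a, b = memory[pc], memory[pc + 1], memory[pc + 2]
            if op == 0:                 # HALT
                return memory
            elif op == 1:               # ADD: memory[b] += memory[a]
                memory[b] += memory[a]
            elif op == 2:               # JUMP: set the program counter to a
                pc = a
                continue
            pc += 3

    # Code (cells 0-5) and data (cells 6-7) share one memory of plain numbers,
    # so a program could in principle even read or rewrite its own instructions.
    memory = [1, 6, 7,    # ADD: memory[7] += memory[6]
              0, 0, 0,    # HALT
              5, 10]      # data
    print(run(memory)[7])  # -> 15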

Dyson's book excels mainly in three ways. Firstly, it presents a lively history of the IAS, the brilliant minds who worked there and the culture of pure thought that often looked down on von Neumann's practical computational tinkering. Secondly, it discusses the provenance of von Neumann's ideas which partly arose from his need to perform complex calculations of the events occurring in a thermonuclear explosion. These top-secret calculations were quietly run at night on the IAS computer and in turn were used to tweak the computer's workings; as Dyson pithily puts it, "computers built bombs, and bombs built computers". Von Neumann also significantly contributed to the ENIAC computer project at the University of Pennsylvania. Thirdly, Dyson brings us evocative profiles of a variety of colorful and brilliant characters clustered around von Neumann who contributed to the intersection of computing with a constellation of key scientific fields that are now at the cutting edge.

There was the fascinating Stan Ulam, who came up with a novel method for calculating complex processes - the Monte Carlo technique - that is used in everything from economic analysis to biology. Ulam, who was one of the inventors of thermonuclear weapons, originally used the technique to calculate the multiplication of neutrons in a hydrogen bomb. Then there was Jule Charney, who set up some of the first weather pattern calculations, early forerunners of modern climate models. Charney was trying to implement von Neumann's grand dream of controlling the weather, but neither he nor von Neumann could anticipate chaos and the fundamental sensitivity of weather to tiny fluctuations. Dyson's book also pays due homage to an under-appreciated character, Nils Barricelli, who used the IAS computer to embark on a remarkable set of early experiments that sought to duplicate evolution and artificial life. In the process Barricelli discovered fascinating properties of code, including replication and parasitism, that mirrored some of the great discoveries taking place in molecular biology at the time. As Dyson tells us, there were clear parallels between biology and computing: both depended on sequences of code, although biology thrived on error-prone duplication (leading to variation) while computing actively sought to avoid it. Working on computing and thinking about biology, von Neumann anticipated the genesis of self-reproducing machines, which have fueled the imagination of both science fiction fans and leading researchers in nanotechnology.
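
(The Monte Carlo idea is easy to show in miniature. The Python toy below estimates pi by random sampling; it is not Ulam's neutron calculation, of course, but it rests on the same trick of replacing an intractable computation with many random trials.)

    import random

    def estimate_pi(samples=1_000_000):
        """Fraction of random points in the unit square that land inside
        the quarter circle, scaled by 4, approximates pi."""
        inside = sum(1 for _ in range(samples)
                     if random.random() ** 2 + random.random() ** 2 <= 1.0)
        return 4 * inside / samples

    print(estimate_pi())  # ~3.14; the estimate sharpens as samples grow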

Finally, Dyson introduces us to the remarkable engineers who were at the heart of the computing projects. Foremost among them was Julian Bigelow, a versatile man who could both understand code and fix a car. Bigelow's indispensable role in building the IAS computer brings up an important point: while von Neumann may have represented the very pinnacle of abstract thought, his computer wouldn't have gotten off the ground had Bigelow and his group of bright engineers not gotten their hands dirty. Great credit also goes to the two lead engineers on the ENIAC project, J. Presper Eckert and John Mauchly, who were rather unfairly relegated to the shadows and sidetracked by history. Dyson rightly places as much emphasis on the nitty-gritty of the engineering hurdles behind the IAS computer as he does on its lofty mathematical underpinnings. He makes it clear that the ascendancy of a revolutionary technology requires both novel theoretical ideas and fine craftsmanship. Unfortunately in this case, the craftsmanship was ultimately trampled by the institute's mathematicians and humanists, which only added to its reputation as a refuge for ivory tower intellectuals who considered themselves above pedestrian concerns like engineering. At the end of the computing project the institute passed a resolution forbidding any kind of experimentation from ever taking place again; perhaps presaging his son's future interest in the topic, Freeman Dyson (who once worked on a nuclear spaceship and genuinely appreciates engineering details) was one of the few dissenting voices. But this was not before the IAS project spawned a variety of similar machines which partly underlie today's computing technology.

All these accounts are supplemented with gripping stories about weather prediction, the US thermonuclear program, evolutionary biology, and the emigration of European intellectuals like Kurt Gödel and von Neumann to the United States. The book does have its flaws, though. For one thing it focuses too heavily on von Neumann and the IAS. Dyson says relatively little about Turing himself, about pioneering computing efforts at Manchester and Cambridge (the first stored-program computer was in fact the Manchester "Baby" machine) and about the equally seminal development of information theory by Claude Shannon. James Gleick's "The Information" and Andrew Hodges's "Alan Turing: The Enigma" might be useful complements to Dyson's volume. In addition, Dyson often meanders into one too many digressions that break the flow of the narrative; for instance, do we really need to know so much about Kurt Gödel's difficulties in obtaining a visa? And do we need to get bogged down in minutiae such as the starting dates and salaries of every member of the project and the list of items on the cafeteria menu? Details like these might put casual readers off.

Notwithstanding these gripes, the book is beautifully written and exhaustively researched with copious quotes from the main characters. It's certainly the most detailed account of the IAS computer project that I have seen. If you want to know about the basic underpinnings of our digital universe, this is a great place to start even with its omissions. All the implications, pitfalls and possibilities of multiple scientific revolutions can be connected in one way or another to that little machine running quietly in a basement in Princeton.
136 people found this helpful.
on March 7, 2012
The focus of George Dyson's well-written, fascinating but essentially misleading book, 'Turing's Cathedral', is curiously not on celebrated mathematician, code-breaker and computer theorist Alan Turing but on his equally gifted and innovative contemporary John von Neumann. Von Neumann, whose extraordinarily varied scientific activities included inter alia significant contributions to game theory, thermodynamics and nuclear physics, is especially associated with the early development of the electronic digital computer (the 'EDC'), an interest apparently sparked by reading Turing's seminal 1936 paper 'On Computable Numbers', which attempted to systematize and express in mathematical terminology the principles underlying a purely mechanical process of computation. Implicit in this article, but at a very theoretical level, was a recognition of the relevance of stored-program processing (whereby a machine's instructions and data reside in the same memory), a concept emanating from the work of mid-Victorian computer pioneer Charles Babbage but which demanded a much later electronic environment for effective realization.
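
(Turing's 'purely mechanical process' can be shown in a few lines. The Python sketch below simulates a bare-bones Turing machine: a tape, a read/write head, and a finite rule table. The rule table is a made-up example for illustration, not anything drawn from the 1936 paper.)

    from collections import defaultdict

    def run_tm(tape, rules, state="start"):
        """Step a Turing machine until it reaches the 'halt' state."""
        cells = defaultdict(lambda: " ", enumerate(tape))  # blank cells read " "
        head = 0
        while state != "halt":
            state, write, move = rules[(state, cells[head])]  # look up the rule
            cells[head] = write                               # write the symbol
            head += 1 if move == "R" else -1                  # move the head
        return "".join(cells[i] for i in sorted(cells)).rstrip()

    # (state, symbol read) -> (next state, symbol to write, head movement)
    rules = {
        ("start", "0"): ("start", "1", "R"),   # flip 0 to 1, move right
        ("start", "1"): ("start", "0", "R"),   # flip 1 to 0, move right
        ("start", " "): ("halt",  " ", "R"),   # blank cell: stop
    }
    print(run_tm("0110", rules))  # -> "1001"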

What Mr Dyson insufficiently emphasizes is that, despite a widespread and ever-growing influence on the mathematical community, Turing's paper was largely ignored by contemporary electronic engineers and had negligible overall impact on the early development of the EDC. Additionally, he fails to point out adequately that von Neumann's foray into the new science of electronic computers involved a virtually total dependence on the prior work, input and ongoing support of his engineering colleagues. Invited in August 1944 to join the Moore School (University of Pennsylvania) team responsible for ENIAC, the world's first general-purpose computer, being built for the US Army, von Neumann was quickly brought up to speed courtesy of the machine's lead engineers, J. Presper Eckert and John Mauchly. As early as the fall of 1943, Eckert and Mauchly had become seriously frustrated by the severe processing limitations imposed by ENIAC's design and were giving serious consideration to major modifications, in particular the adoption of Eckert's own mercury delay line technology to boost the machine's minuscule memory capacity and enable a primitive stored-program capability. These proposals were subsequently vetoed by the School's authorities on the quite understandable grounds that they would seriously delay ENIAC's delivery date; instead it was decided to simultaneously begin research on a more advanced machine (the EDVAC) incorporating the latest developments. As a new member of the group, von Neumann speedily grasped the essentials of the new science and contributed valuable theoretical feedback, but an almost total lack of hands-on electronic expertise on his part prevented any serious contribution to the nuts and bolts of the project. Relations with Eckert and Mauchly rapidly deteriorated when an elegantly written, but very high-level, document of his entitled 'First Draft of a Report on the EDVAC' was circulated among the scientific community. Not only had this document not been previewed, let alone pre-approved, by Eckert and Mauchly, but it bore no acknowledgment whatsoever of their overwhelming responsibility for much of the content. By default, and in view too of his already very considerable international reputation, the content was attributed exclusively to von Neumann, an impression he made no attempt thereafter to correct; the term 'von Neumann architecture' was subsequently bestowed on the stored-program setup described in the document.

The public distribution of von Neumann's 'Draft' denied Eckert and Mauchly the opportunity to patent their technology. Worse still, despite academic precedents to the contrary, they were refused permission by the Moore School to proceed with EDVAC's development on a commercial basis. In spite of his own links to big business (he represented IBM as a consultant), von Neumann likewise opposed their efforts to do so. All this resulted in a major rift, von Neumann thereafter being shunned by Eckert and Mauchly and forced to rely on lesser mortals to help implement various stored-program projects, notably the IAS computer at Princeton. The following year (1946) Eckert and Mauchly left the School to focus on developing machines for the business market. Before doing so, they jointly delivered a series of state-of-the-art lectures on ENIAC and EDVAC to an invited audience at the School. Among the attendees was British electronics engineer Maurice Wilkes, a fellow academic of Turing's from Cambridge University, but with relatively little interest in the latter's ongoing activity (by this time Turing, a great visionary, had also turned his attention to designing stored-program computers). Blown away by Eckert and Mauchly's presentation, Wilkes returned to England to forge ahead with a new machine called EDSAC, which was completed in May 1949 and represented the first truly viable example of a stored-program computer (an experimental prototype christened 'Baby' had already been developed at Manchester University the year before). Back in the US, Eckert and Mauchly continued their efforts, but persistent problems with funding and also Eckert's own staunch refusal to compromise on quality delayed progress, their partnership finally culminating in the development of the UNIVAC I, the world's first overtly business-oriented computer, delivered initially to the Census Bureau in March 1951.

Mr Dyson is quite right of course (and he does this well) to trace the beginnings of the modern computer to the stored program concept, but his obsessive focus on von Neumann's role obscures the impact of Eckert and Mauchly's vastly more significant contribution to its development. The triumph of the EDC depended almost wholly on the efforts and expertise of utterly dedicated and outstanding electronics specialists like them, not on mathematicians, logicians and generalists like von Neumann or even Turing. Never one to deny credit where it was due, Wilkes (who later spearheaded advances in software, became the doyen of Britain's electronic community and ended his long and distinguished career as professor emeritus of computer science at Cambridge) unceasingly acknowledged his major debt to Eckert and Mauchly. Hopefully, Mr Dyson, a writer of considerable talent, might one day decide to tell in full their story and set the record straight.
384 people found this helpful.
TOP 500 REVIEWER | on March 9, 2012
Turing's Cathedral: The Origins of the Digital Universe by George Dyson

"Turing's Cathedral" is the uninspiring and rather dry book about the origins of the digital universe. With a title like, "Turing's Cathedral" I was expecting a riveting account about the heroic acts of Alan Turing the father of modern computer science and whose work was instrumental in breaking the wartime Enigma codes. Instead, I get a solid albeit "research-feeling" book about John von Neumann's project to construct Turing's vision of a Universal Machine. The book covers the "explosion" of the digital universe and those applications that propelled them in the aftermath of World War II. Historian of technology, George Dyson does a commendable job of research and provide some interesting stories involving the birth and development of the digital age and the great minds behind it. This 432-page book is composed of the following eighteen chapters: 1.1953, 2. Olden Farm, 3. Veblen's Circle, 4. Neumann Janos, 5. MANIAC, 6. Fuld 219, 7. 6J6, 8. V-40, 9. Cyclogenesis, 10. Monte Carlo, 11. Ulam's Demons, 12. Barricelli's Universe, 13. Turing's Cathedral, 14. Engineer's Dreams, 15. Theory of Self-Reproducing Automota, 16. Mach 9, 17. The Tale of the Big Computer, and 18. The Thirty-ninth Step.

Positives:
1. A well researched book. The author faces a daunting task of research but pulls it together.
2. The fascinating topic of the birth of the digital universe.
3. A who's who of science and engineering icons of what would eventually become computer science. The list of principal characters was very welcome.
4. For computer lovers who want to learn the history of the pioneers behind digital computing, this book is for you.
5. Some facts will "blow" you away, "In March 1953 there were 53 kilobytes of high-speed random-access memory on planet Earth".
6. Some goals are counterintuitive. "The new computer was assigned two problems: how to destroy life as we know it, and how to create life of unknown forms".
7. There are some interesting philosophical considerations.
8. As an engineer, I enjoy the engineering challenges involved with some of their projects.
9. Amazing how the Nazi threat gave America access to some of the greatest minds. The author does a good job of describing these stories.
10. The fascinating life of the main character of this book, John von Neumann.
11. So much history interspersed throughout this book.
12. The ENIAC..." a very personal computer". A large portion of this book is dedicated to the original computer concepts, challenges, parts, testing, etc...
13. The fundamental importance of Turing's paper of 1936. It's the inspiration behind the history of the digital universe.
14. Some amusing tidbits here and there, including Einstein's diet.
15. The influence of Gödel. How he set the stage for the digital revolution.
16. Blown away by Leibniz. In 1679 (yes, that is correct, 1679) he had already imagined a digital computer with binary numbers...
17. So many great stories of how these great minds attacked engineering challenges. Computer scientists will get plenty of chuckles with some of these stories involving the types of parts used in the genesis of computing. Vacuum tubes as an example.
18. There are many engineering principles devised early on that remain intact today; Bigelow alone provides plenty of axioms.
19. I enjoyed the stories involving how computers improved the art of forecasting the weather.
20. "Filter out the noise". A recurring theme and engineering practice that makes its presence felt in this book.
21. Computers and nuclear weapons.
22. The Monte Carlo method, a key new domain in mathematical physics, and its invaluable contribution to the digital age.
23. The fascinating story of the summer of 1943 at Los Alamos.
24. The Teller-Ulam invention.
25. How the digital universe and the hydrogen bomb were brought into existence simultaneously.
26. Barricelli and an interesting perspective on biological evolution.
27. The amazing life of Alan Mathison Turing and his heroic contributions.
28. A fascinating look at the philosophy of artificial intelligence and its future.
29. The collision between the digital universe and two existing stores of information: genetic codes and information stored in brains.
30. The basis for the power of computers.
31. The five distinct sets of problems running on the MANIAC by mid-1953. All in JUST 5 kilobytes.
32. A look at global digital expansion and where we are today.
33. The unique perspective of Hannes Alfvén. Cosmology.
34. The future of computer science.
35. Great quotes, "What if the price of machines that think is people who don't?"
36. The author does a great job of providing a "where are they now" narration of all the main characters of the book.
37. Links worked great.
38. Some great illustrations in the appendix of the book. It's always great to put a face on people involved in this story.

Negatives:
1. It wasn't an enjoyable read. Plain and simple this book was tedious to read. The author lacked panache.
2. The title is misleading. It is a metaphor derived from Google's headquarters in California: the author, who was given a glimpse inside that organization, sensed Turing's vision of a gathering of all available answers and possible equations mapped out in this awe-inspiring facility. My disappointment is that this book, despite being inspired by Alan Turing's vision, in fact has only one chapter dedicated to him. The main driver behind this book was really John von Neumann.
3. A timeline chart would have added value. With so many stories going back and forth it would help the reader ground their focus within the context of the time that it occurred.
4. Some of the stories really took the scenic route to get to the point.
5. The photos should have been included within the context of the book instead of a separate section of its own.
6. The book was probably a hundred pages too long.

In summary, I didn't enjoy reading this book. The topic was of interest to me, but between the misleading title and the very dry prose, the book became tedious and not intellectually satisfying. The book felt more like a research paper than a book intended for a general audience. For the record, I am an engineer and a lot of the topics covered in this book are near and dear to my heart, but the author was never able to connect with me. This book is well researched and includes some fascinating stories about some of the icons of science and the engineering involved in the digital origins, but I felt like I was reading code instead of a story. This book will have a limited audience; if you are an engineer, scientist or in the computer field this book may be of interest, but be forewarned: it is a monotonous and uninspiring read.

Recommendations: "Steve Jobs" by Walter Isaacson, "The Quantum Universe: (And Why Anything That Can Happen, Does)" by Brian Cox, "Physics of the Future: How Science Will Shape Human Destiny and Our Daily Lives by the Year 2100" by Michio Kaku, "Warnings: The True Story of How Science Tamed the Weather" by Mike Smith, "Spycraft: The Secret History of the CIA's Spytechs, from Communism to Al-Qaeda" by Robert Wallace and H. Keith Melton.
137 people found this helpful.
on May 23, 2012
This is a fascinating, occasionally irritating, extraordinarily well-researched examination of the emergence of digital computers and the part played by the Institute for Advanced Study in Princeton, NJ. It is highly recommended for anyone who wants to learn more about early computing machines and the people who made them possible.

In many ways this is as much a biography of John von Neumann as it is a history of the origins of modern computing and the information universe. Von Neumann was astonishingly talented and was centrally involved in many of the intellectual and technological revolutions of the 20th century. The early story of computing cannot be told without concentrating on him. Alan Turing was also remarkably talented and pivotal in the origins of computing. In this, Turing's centenary year, one has to look to Andrew Hodges's biography for a definitive description of his role in these matters.

Others have pointed to problems with the book. I feel these are minor:

- the title is somewhat misleading: the book is not focused on Alan Turing and it only tangentially concerns itself with the cathedral of information that IT has made available to us (see below),

- the digressions, whether into the disposition of George Washington's forces before the battle of Princeton, the design of the hydrogen bomb, or the parallels between the evolution of genes and the evolution of code (programs), can be either fascinating or tedious depending on your interests. I enjoyed them,

- it is assumed that the reader knows what registers, accumulators, and the other standard components of a central processor are. It is also assumed that the reader is familiar with relatively obscure computing techniques such as content-addressable memory (sketched briefly after this list),

- Dyson strains to the breaking point to find neat analogies to describe the emergence of "the digital universe." For example, "Turing's model was one-dimensional...von Neumann's implementation was two dimensional. The landscape is now three-dimensional". In fact, Turing's model was deliberately designed to be as simple as it possibly could be, MANIAC used a three-dimensional memory, and the digital universe is at least four-dimensional.
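
On the content-addressable memory point above: the technique answers "which address holds this value?" rather than the usual "what value is at this address?". A hypothetical Python analogue, using a dictionary keyed by content:

    ram = ["cat", "dog", "fox"]                            # address -> content
    cam = {value: addr for addr, value in enumerate(ram)}  # content -> address

    print(ram[2])      # conventional lookup by address -> "fox"
    print(cam["fox"])  # associative lookup by content  -> 2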

However, not all the criticisms of the book are valid. To pick just one example:

- Dyson is concerned with the role that the IAS played in the emergence of our modern information universe. He acknowledges that Eckert and Mauchly played a pivotal role, but he isn't trying to disentangle who should get how much credit for which aspect of creating the modern stored-program electronic computer. The impossibility of doing this is acknowledged in the Preface.

I find it interesting that both von Neumann and Turing were absorbed with the idea that future stages of computer development would be towards artificial intelligence, self-reproducing machines, and machine consciousness. They saw, and Turing for one feared, computers as potentially replacing humans as the next evolutionary step. Neither of them, and as far as one can tell no other pioneer except Thomas J. Watson, saw that the computing ability of machines is of lesser importance. Far more important, in fact crucial, has been the integration of computers, communications, and, above all, information. This is what constitutes the digital universe.

Dyson describes a visit to Google's HQ saying it felt like "entering a fourteenth-century cathedral". Is it the building or the work that goes on there that reminds him of a huge edifice dedicated to the soul? I suspect he deliberately allows this to remain ambiguous.

Nevertheless, there is some incongruity between this metaphor, which is clearly significant as it provides the book's title, and the very strong emphasis in the later chapters on the biological character of computing. Cathedrals are everything that biology is not: rigid and frigid. Indeed the similarity between biological life and computing is a recurring theme. It may be helpful to use biological analogies to give insight into the future of computing, but Dyson appears to believe that these are actual descriptions, not analogies. For example, "Google's one million servers constitute a collective, metazoan organism." The server network may resemble an organism but it is not one. Further, he devotes a chapter to Barricelli's early and relatively unsuccessful efforts to simulate evolution using MANIAC. This chapter ends with some elegant but absurd statements such as "We have already outsourced much of our cultural heritage to the Internet, and are outsourcing our genetic inheritance as well."

Dyson is to be commended for producing a book of remarkable scope and depth.
14 people found this helpful.
on November 18, 2012
I might easily have given this book four stars if Dyson could have stuck to history instead of indulging himself in inane speculations and endless commentaries. The connections he draws between completely unrelated aspects of technology and biology are so strained that whenever I read a particularly grievous one, I'm forced to put the book down and walk around the room until the waves of stupidity subside a bit. For example, at one point Dyson asks us to consider whether digital computers might be "optimizing our genetic code ... so we can better assist them." At another he explains that the reason we can't predict the evolution of the digital universe is that algorithms that predicted airplane movements in WW2 had to be normalized to the reference frame of the target... or something? Throughout the entire book there's a complete disconnect between the technical nature of the things he describes and the vague abstractions that he twists into obscenely trite metaphors.

Dyson seems to live in some sort of science-fiction wonderland where every computer program is a kind of non-organic organism. He calls code "symbiotic associations of self-reproducing numbers" that "evolved into collector societies, bringing memory allocations and other resources back to the collective nest." They are active, autonomous entities which "learned how to divide into packets, traverse the network, correct any errors suffered along the way, and reassemble themselves at the other end." By the end of the book I'm not even sure if Dyson means this as a metaphor - he appears to genuinely believe that it's merely a matter of perspective.

The truth is, if every human died tomorrow and the internet was left to run from now to infinity, not a single advance would be made in the state of computing. The viruses would quickly burn themselves away, the servers would grind monotonously at their maintenance routines, and the Google webcrawlers would stoically trudge through all the porn sites on Earth, an infinite number of times.

Dyson might respond that programs integrate humans as a symbiotic part of their evolution, but you could say the same thing about any aspects of culture, such as clothing, music, or furniture. In this light the IKEA franchise must be seen as a great self-replicating organism, conscripting humans in the propagation of its global hegemony of coffee tables.
20 people found this helpful.
on April 21, 2012
I bought this book as soon as it came out, after reading a good review in the "Guardian". The subject matter is certainly fascinating, but its treatment by the author is, in my opinion, awful. I was expecting a simple technical description of the basic concepts of early computing (like the famous "40-bit line of code", which he mentions without ever describing it). The only interesting technical discussion concerns the challenges faced by the engineers who built the machine in Princeton. Otherwise, we are treated to the biographies of all the main characters in the story (sometimes going back to their grandparents...) and we learn more about partying in Budapest in the '20s than about Turing's ideas.
Of course, the description of how military applications triggered the development of the first computers is very interesting, but it could have been much more streamlined.
Finally, I found the discussion of computer intelligence very wishy-washy, not based on clear facts and arguments, and, once more, disappointing.
30 people found this helpful.
on March 19, 2012
Turing's Cathedral: The Origins of the Digital Universe
by George Dyson

"Yeah but did you like the book?????" is the comment on one review of this book. A good question. And difficult in the case of this unique book.

"Turing's Cathedral" gives a real-life feel for what was going on between 1945 and 1957 particularly the "three technological revolutions [that] dawned in 1953: thermonuclear weapons, stored-program computers, and the elucidation of how life stores its own instructions as strings of DNA". It diverts continuously, often inquisitively, often academically, sometimes to the extent that one forgets what the topic of the book was in the first place. And then there is the "Cathedral" part of the title.

For a target audience, the book is a gem. As various reviewers have said, it is "Impossible to pigeonhole this book", "What he has written is certainly more than just history", it is "100 pages too long". Then again it could be 1,000 pages longer - if the reader wants to feel part of the history then this is it. Do not aim to finish the book, or a chapter; just become part of it.

The book gives snippets of information from all over the universe. My closest recollection is the feeling of chatting with other users as I submitted paper tapes to the English Electric Leo KDF9 computer (successor to SILLIAC) at the University of Sydney while trying to debug programs. One heard about the fascinating things other people were doing; appreciated what was being done; but really one could not research everything; could not stop to analyze each fleeting statement.

Dyson is an aggressive writer: "There are two kinds of creation myths: those where life arises out of the mud, and those where life falls from the sky. In this creation myth, computers arose from the mud, and code fell from the sky." This is the first sentence! No explanation. No particular context. He just continues about John von Neumann in 1945 at the Institute for Advanced Study in Princeton. In turn this allows a diversion on the history of Princeton. Further on there is "What if the price of machines that think is people who don't?", while the advent of Los Alamos sparks a geological history of the mesa.

And so it goes, thematically with diversions, only roughly chronological: the atomic bomb, weather forecasting ("It mattered little that the twenty-four hour prediction was twenty-four-hours in the [computing time] making."), the Monte Carlo method, the hydrogen bomb, human evolution, stellar evolution, self-reproducing automata. The author's snippets often ring as the truest on the record, mixed with daily trivia. Yet the topic is John von Neumann and the development of the first computers.

To answer "Yeah but did [will] you like the book?????": If you have read this far, are academically inclined, are interested in making sense of the early days of computing, and enjoy confusion, then the answer is "Yes!" It is a different, real-life concept of history writing. And the "Cathedral"? Alan Turing's 1950 quote, or Google's Californian headquarters, or the architecture of the stored-program computer, or the feeling of where the digital universe led (and is leading) us?

Malcolm Cameron
20 March 2012
17 people found this helpful.
TOP 1000 REVIEWER | on January 22, 2014
I was disappointed in this book. I expected it to be about the work Alan Turing did to develop the modern computer, or at least about the origins of the modern computer. While this book touches on these subjects it is actually mostly about John von Neumann’s work to develop the modern computer. Interspersed are chapters covering some of the initial problems that these early computers were able to tackle. Among them were weather forecasting, the evolution of biological systems, the evolution of stars, and most importantly the feasibility and design of the hydrogen bomb.

I did not dislike the book, so I give it three stars, but only recommend it with the reservation that a prospective reader consider my reasons, given below, for why I did not rate it higher. While this book might not be for a general audience, I could recommend it to someone who wants more information on the Institute for Advanced Study (IAS) and the people who worked there, and who wants information specifically about John von Neumann’s computer efforts.

In more detail, I found the book to have the following less-than-desirable features:
- The book is overstuffed – The author’s father worked at the IAS and he grew up there. As a result he was fascinated by everything to do with Princeton, the IAS and the people who worked there. For instance, the book discusses the Native American tribe who lived on the land now occupied by Princeton University and the IAS, William Penn and the founding of Pennsylvania, the design and construction of the IAS buildings, and even the woman who was the personal assistant to the director of the IAS. I was hoping for a book with more about computers and less of this background material. Unfortunately, I estimate that this background material takes up about half of the book!

- The book is disorganized – I feel that the numerous digressions, described above, distract from the story of computers. While at first I even enjoyed them, after a while I found that they caused me to lose the train of the narrative. Furthermore, chapters covering things like weather forecasting are thrown in between those on the building of the computers, further disrupting the narrative concerning computer design and construction. I feel that the book is more a collection of essays than a chronological story. These essays shift from building computers in the 1950s, to Europe in the 1930s, then back to the ’50s, then to Los Alamos in the 1940s and back again to the computers of the ’50s, and I feel that this tended to defocus the narrative.

- The writing style – I found the writing style to be somewhat cumbersome, especially when discussing the finer points of the design and construction of computers. In many cases the author quotes mathematicians or computer experts concerning their work, and while this produces text that is readily understandable to people conversant with these fields, I found it less than perfectly clear. In contrast, most of the best writers of popular science have a knack for describing complex work in a readily understandable form, a knack I all too often found lacking in this book. I would have been much happier if the book had been clearer on the details of the evolution of computer design, rather than on the design of the buildings in which the computers were designed and built.

- The book is misrepresented – The title of the book implies that it is about Alan Turing and computers, but Alan Turing is only a minor character. The subtitle “The Origins of the Digital Universe” implies that the book covers the origin of computers in general. As has been noted, while Turing and computers in general are discussed, they are not the focus of the book. A more accurate title might have been “von Neumann’s Computer”, but I guess that the publisher, who generally decides on the title, felt that this title would not sell as well as the one he chose (especially since the year the book was published was the 100th anniversary of Turing’s birth).

While I learned some things about John von Neumann, the computers that he worked on, and some important problems that these computers were put to work solving, the most important thing I learned was that I would like to read a full von Neumann biography and a book that covers all of the different efforts that have led to the “digital universe”. This book only whetted my appetite to learn more about these subjects.
6 people found this helpful.
on March 8, 2012
About the title, which seems to be misunderstood: The epigraph at the beginning of Chapter 13 is a quote from Alan Turing about artificial intelligence: "In attempting to construct such machines we should not be irreverently usurping His power of creating souls, any more than we are in the procreation of children: rather we are, in either case, instruments of His will providing mansions for the souls that He creates ..." The point being that what is all around us today is far more than the mansions Turing spoke of; it's a cathedral: a collaboration built by so many that it's hard to point to any one, a thing of huge scale built over time, and continuously adapted by multiple architects to their own purposes.

George Dyson follows one of the main ribs of this cathedral in this book centered on - but not exclusively about - John von Neumann's development of an electronic computer at the Institute for Advanced Study in Princeton during the aftermath of WW2. This is such a great book on so many levels. First, the author is simply a gifted writer and storyteller, bringing characters to life as complete and complex people, capturing their personal histories, motivations, and interactions with each other. He anchors the story in the high drama of the times, especially the desperation of brilliant refugees as they fled their European homelands and reinvented themselves in the universities and corporations of North America, and the looming specter of nuclear war. He elucidates the contributions of the frequently overlooked, in particular the women who were the early coders, and brings in the perspectives of wives, secretaries, workmen, and administrators. In scope, detail, and drama the book reminded me of Richard Rhodes' The Making of the Atomic Bomb, and indeed there is some overlap with the characters and the topic. The chapters are not strictly chronological but rather thematic, which works somehow, so that by the book's end, multiple themes are equally developed and the story has been tightly woven - almost magically - into a finished tapestry. The author used truly new sources, for example the decades-long personal correspondence between John von Neumann and his second wife Klari from Budapest. He traces the ghost of the vacuum-tube room-sized monstrosity through time directly into our cell phones and our search engines and perhaps our psyches, speculates where the digital universe may be going, and wraps up with a last chapter that tells us not only the fate of the MANIAC but the separate destinies of those who built it.
15 people found this helpful.
on October 18, 2012
Color me disappointed.

I expected a history of the construction of ENIAC, one of the first general-purpose computers and arguably the most historically and technically important, in that its design lives on in the heart of nearly all modern computers. I also expected something of a history of the Institute for Advanced Study at Princeton, seeing as that's where the machine was built and where the author, George Dyson, son of Institute physicist Freeman Dyson, spent at least part of his childhood as "one of a small band of eight- to ten-year-olds who spent our free time exploring the Institute Woods..."

It's not that I didn't get both histories, and more; all the information that I wanted to read is there, in this book. It would seem, however, that Mr. Dyson's primary goal with this book was not so much historical as it was prophetic. The structure of the book is thus not chronological, but conceptual, with each chapter presenting the history of a single idea, and how each idea was conceived or how it evolved in the new world of digital calculation.

Concepts such as shock waves, the Monte Carlo method, self-reproducing automata, biology, weather prediction and climate modeling - all of these were wrestled to the ground and cranked through ENIAC and its brethren, MANIAC and JOHNNIAC and SILLIAC and many others. Some of these ideas were entirely new; others were suddenly computable.

What Dyson tries to do with this book is to show us how these ideas came about, how they changed, how they grew and branched into still other ideas, and how the digital universe was born in the very exploration of these ideas. Whole new dimensions of space and time - measured in bits and nanoseconds - sprang into existence with ENIAC. The physics of these dimensions is defined in the code we execute there; the worlds that float in that universe are built of the data that we inject from outside.

It's a grand concept, as is his suggestion that humanity's successor is already here, sharing the planet with us: that, with the birth of the internet and the development of cloud computing and other abstractions of computation and data storage, we have created the species that will ultimately, if not replace, then supersede our own. The machines that allow us to interact with our world, our bank accounts and our friends; that predict our weather or guide our spacecraft - these machines also manipulate the DNA of creatures living in our own universe; in fact they must act as our intermediaries, because our minds simply cannot operate at the speeds or scales required to communicate with that microscopic world.

He may be right; his arguments are compelling. But it wasn't what I expected.

By following the ideas, rather than the calendar or even geography, every chapter seems to start in the 1920s or 1930s, in Hungary or England, finding its way to New Jersey or Maryland, before crossing the continent to Los Alamos, or the Atlantic back to Europe, in the 1950s. With each chapter, I learn a bit more about von Neumann or Ulam or Turing before being flung back to 1924 Budapest.

In the end, maybe the problem is mine, not Dyson's. Maybe I simply wanted a nice, straightforward history of science and computing that didn't strain my mind too badly; something that I could pick up and put down easily. Perhaps if I had treated it as a collection of essays with a common theme, I might have been happier.

As it is, I am just lost in time and space. And I'm disappointed.
10 people found this helpful.