Turing's Cathedral: The Origins of the Digital Universe Paperback – December 11, 2012
| Format | New from | Used from |
| --- | --- | --- |
| Audible Audiobook, Unabridged | $0.00 (free with Audible trial) | - |
| Hardcover, Deckle Edge | $21.95 | $2.62 |
| Audio CD, Audiobook, Unabridged | $39.99 | $11.24 |
A Wall Street Journal Best Business Book of 2012
A Kirkus Reviews Best Book of 2012
In this revealing account of how the digital universe exploded in the aftermath of World War II, George Dyson illuminates the nature of digital computers, the lives of those who brought them into existence, and how code took over the world.
In the 1940s and ‘50s, a small group of men and women—led by John von Neumann—gathered in Princeton, New Jersey, to begin building one of the first computers to realize Alan Turing’s vision of a Universal Machine. The codes unleashed within this embryonic, 5-kilobyte universe—less memory than is allocated to displaying a single icon on a computer screen today—broke the distinction between numbers that mean things and numbers that do things, and our universe would never be the same. Turing’s Cathedral is the story of how the most constructive and most destructive of twentieth-century inventions—the digital computer and the hydrogen bomb—emerged at the same time.
- Length: 464 pages
- Language: English
- Publisher: Vintage
- Publication date: December 11, 2012
- Dimensions: 5.2 x 1.0 x 8.0 inches
- ISBN-10: 1400075998
- ISBN-13: 978-1400075997
Editorial Reviews
Review
“The best book I’ve read on the origins of the computer. . . not only learned, but brilliantly and surprisingly idiosyncratic and strange.”
—The Boston Globe
“A groundbreaking history . . . the book brims with unexpected detail.”
—The New York Times Book Review
“A technical, philosophical and sometimes personal account . . . wide-ranging and lyrical.”
—The Economist
“The story of the [von Neumann] computer project and how it begat today’s digital universe has been told before, but no one has told it with such precision and narrative sweep.”
—The New York Review of Books
“A fascinating combination of the technical and human stories behind the computing breakthroughs of the 1940s and ‘50s. . . . An important work.”
—The Philadelphia Inquirer
“Vivid. . . . [A] detailed yet readable chronicle of the birth of modern computing. . . . Dyson’s book is one small step toward reminding us that behind all the touch screens, artificial intelligences and cerebellum implants lies not sorcery but a machine from the middle of New Jersey.”
—The Oregonian
“Well-told. . . . Dyson tells his story as a sort of intellectual caper film. He gathers his cast of characters . . . and tracks their journey to Princeton. When they converge, it’s great fun, despite postwar food rationing and housing shortages. . . . Dyson is rightly as concerned with the machine’s inventors as with the technology itself."
—The Wall Street Journal
“Charming. . . . Creation stories are always worth telling, especially when they center on the birth of world-changing powers. . . . Dyson creatively recounts the curious Faustian bargain that permitted mathematicians to experiment with building more powerful computers, which in turn helped others build more destructive bombs.”
—San Francisco Chronicle
“The story of the invention of computers has been told many times, from many different points of view, but seldom as authoritatively and with as much detail as George Dyson has done. . . . Turing’s Cathedral will enthrall computer enthusiasts. . . . Employing letters, memoirs, oral histories and personal interviews, Dyson organizes his book around the personalities of the men (and occasional woman) behind the computer, and does a splendid job in bringing them to life.”
—The Seattle Times
“A powerful story of the ethical dimension of scientific research, a story whose lessons apply as much today in an era of expanded military R&D as they did in the ENIAC and MANIAC era . . . Dyson closes the book with three absolutely, hair-on-neck-standing-up inspiring chapters on the present and future, a bracing reminder of the distance we have come on some of the paths envisioned by von Neumann, Turing, et al.”
—Cory Doctorow, Boing Boing
“No other book about the beginnings of the digital age . . . makes the connections this one does between the lessons of the computer’s origin and the possible paths of its future.”
—The Guardian
“If you want to be mentally prepared for the next revolution in computing, Dyson’s book is a must read. But it is also a must read if you just want a ripping yarn about the way real scientists (at least, some real scientists) work and think.”
—Literary Review
“More than just a great book about science. It’s a great book, period.”
—The Globe and Mail
About the Author
George Dyson is a science historian as well as a boat designer and builder. He is also the author of Baidarka, Project Orion, and Darwin Among the Machines.
Excerpt. © Reprinted by permission. All rights reserved.
Preface
POINT SOURCE SOLUTION
I am thinking about something much more important than bombs. I am thinking about computers.
—John von Neumann, 1946
There are two kinds of creation myths: those where life arises out of the mud, and those where life falls from the sky. In this creation myth, computers arose from the mud, and code fell from the sky.
In late 1945, at the Institute for Advanced Study in Princeton, New Jersey, Hungarian American mathematician John von Neumann gathered a small group of engineers to begin designing, building, and programming an electronic digital computer, with five kilobytes of storage, whose attention could be switched in 24 microseconds from one memory location to the next. The entire digital universe can be traced directly to this 32-by-32-by-40-bit nucleus: less memory than is allocated to displaying a single icon on a computer screen today.
Von Neumann’s project was the physical realization of Alan Turing’s Universal Machine, a theoretical construct invented in 1936. It was not the first computer. It was not even the second or third computer. It was, however, among the first computers to make full use of a high-speed random-access storage matrix, and became the machine whose coding was most widely replicated and whose logical architecture was most widely reproduced. The stored-program computer, as conceived by Alan Turing and delivered by John von Neumann, broke the distinction between numbers that mean things and numbers that do things. Our universe would never be the same.
Working outside the bounds of industry, breaking the rules of academia, and relying largely on the U.S. government for support, a dozen engineers in their twenties and thirties designed and built von Neumann’s computer for less than $1 million in under five years. “He was in the right place at the right time with the right connections with the right idea,” remembers Willis Ware, fourth to be hired to join the engineering team, “setting aside the hassle that will probably never be resolved as to whose ideas they really were.”
As World War II drew to a close, the scientists who had built the atomic bomb at Los Alamos wondered, “What’s next?” Some, including Richard Feynman, vowed never to have anything to do with nuclear weapons or military secrecy again. Others, including Edward Teller and John von Neumann, were eager to develop more advanced nuclear weapons, especially the “Super,” or hydrogen bomb. Just before dawn on the morning of July 16, 1945, the New Mexico desert was illuminated by an explosion “brighter than a thousand suns.” Eight and a half years later, an explosion one thousand times more powerful illuminated the skies over Bikini Atoll. The race to build the hydrogen bomb was accelerated by von Neumann’s desire to build a computer, and the push to build von Neumann’s computer was accelerated by the race to build a hydrogen bomb.
Computers were essential to the initiation of nuclear explosions, and to understanding what happens next. In “Point Source Solution,” a 1947 Los Alamos report on the shock waves produced by nuclear explosions, von Neumann explained that “for very violent explosions . . . it may be justified to treat the original, central, high pressure area as a point.” This approximated the physical reality of a nuclear explosion closely enough to enable some of the first useful predictions of weapons effects.
Numerical simulation of chain reactions within computers initiated a chain reaction among computers, with machines and codes proliferating as explosively as the phenomena they were designed to help us understand. It is no coincidence that the most destructive and the most constructive of human inventions appeared at exactly the same time. Only the collective intelligence of computers could save us from the destructive powers of the weapons they had allowed us to invent.
Turing’s model of universal computation was one-dimensional: a string of symbols encoded on a tape. Von Neumann’s implementation of Turing’s model was two-dimensional: the address matrix underlying all computers in use today. The landscape is now three-dimensional, yet the entire Internet can still be viewed as a common tape shared by a multitude of Turing’s Universal Machines.
Where does time fit in? Time in the digital universe and time in our universe are governed by entirely different clocks. In our universe, time is a continuum. In a digital universe, time (T) is a countable number of discrete, sequential steps. A digital universe is bounded at the beginning, when T = 0, and at the end, if T comes to a stop. Even in a perfectly deterministic universe, there is no consistent method to predict the ending in advance. To an observer in our universe, the digital universe appears to be speeding up. To an observer in the digital universe, our universe appears to be slowing down.
Universal codes and universal machines, introduced by Alan Turing in his “On Computable Numbers, with an Application to the Entscheidungsproblem” of 1936, have prospered to such an extent that Turing’s underlying interest in the “decision problem” is easily overlooked. In answering the Entscheidungsproblem, Turing proved that there is no systematic way to tell, by looking at a code, what that code will do. That’s what makes the digital universe so interesting, and that’s what brings us here.
It is impossible to predict where the digital universe is going, but it is possible to understand how it began. The origin of the first fully electronic random-access storage matrix, and the propagation of the codes that it engendered, is as close to a point source as any approximation can get.
Product details
- Publisher : Vintage; First Edition (December 11, 2012)
- Language : English
- Paperback : 464 pages
- ISBN-10 : 1400075998
- ISBN-13 : 978-1400075997
- Item Weight : 13.2 ounces
- Dimensions : 5.24 x 1 x 7.95 inches
- Best Sellers Rank: #329,632 in Books
- #88 in Computing Industry History
- #657 in Scientist Biographies
- #919 in History & Philosophy of Science (Books)
Customer reviews

Top reviews from the United States
Turing and von Neumann make their appearances here, of course, along with Mauchly, Eckert, Oppenheimer, Ulam, Freeman Dyson (the author's father), and other notables of the era. But Dyson also tells the story of a number of pioneers and contributors to the design, construction, and most of all the theory of computation who have been overlooked by history. Most remarkable, perhaps, is Nils Barricelli, who could justifiably be called the founder of computational biology. Working in the early 1950s with a computer having less computational power and memory than a modern-day sewing machine, he created a one-dimensional artificial universe in order to explore the relative power of mutation and symbiosis in the evolution of organisms. His work led to a number of original discoveries and conclusions that would only be rediscovered or proposed decades later, such as the notion that genes originated as independent organisms, like viruses, that combined to create more complex organisms.
There's an entire chapter on a vacuum tube, the lowly 6J6, a dual triode created during the war that combined several elements necessary for the creation of a large-scale computer: simplicity, ruggedness, and economy. It fulfilled one of von Neumann's guiding principles for the project: don't invent anything. That is, don't waste time inventing where solutions already exist. By the nature of its relative unreliability and wide production tolerances relative to project goals, it also helped stimulate a critical line of research: how to create reliable systems from unreliable components, something more important now than ever in this era of microprocessors and memory chips with millions and even billions of components on a chip.
The chapter on Alan Turing is particularly good, covering much of his work that has been neglected in biographies and presenting a more accurate description of his contributions to computational science. The great importance of his conceptual computer, the "Turing machine," is not, as is commonly stated in popular works, that it can perform the work of any other computer. It is that it demonstrated how any possible computing machine can be represented as a number, and vice versa. This allowed him to construct a proof that there exist uncomputable numbers, i.e., that there is no general method for determining a priori whether a given program will eventually halt. This was strongly related to Gödel's work on the completeness of formal systems and to Hilbert's program in the foundations of mathematics.
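The reviewer's central point, that any machine can be represented as a number and vice versa, can be sketched in a few lines of Python. This is a toy Gödel-style numbering of source text, not Turing's actual encoding of machine tables:

```python
# Toy illustration: a program is just text, and text is just a number.
# A minimal stand-in for Turing's machine-to-number encoding.

def program_to_number(src: str) -> int:
    """Encode a program's source text as a single integer."""
    return int.from_bytes(src.encode("utf-8"), "big")

def number_to_program(n: int) -> str:
    """Decode the integer back into the original source text."""
    length = (n.bit_length() + 7) // 8
    return n.to_bytes(length, "big").decode("utf-8")

src = "print(2 + 2)"
n = program_to_number(src)
assert number_to_program(n) == src  # round-trip: program -> number -> program
```

Once programs are numbers, one program can take another as input, which is exactly the self-reference that Turing's halting-problem proof exploits.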
What makes this a particularly exceptional book is the manner in which Dyson connects the stories of the individuals involved in the birth of electronic computing with the science itself. He does an exceptional job of explaining difficult topics like Gödel incompleteness, the problem of separating noise from data, and the notion of computability in a way that the intelligent reader without advanced math skills will understand. More importantly, he understands the material well enough to know which are the critical concepts and accomplishments of these pioneers of computing, and doesn't fall into the trap of repeating the errors of far too many popular science writers. The result is a thoroughly original, accurate, and tremendously enjoyable history. Strongly recommended to anyone curious about the origins of computers and, more importantly, the science of computing itself.
Dyson's book focuses on the pioneering development of computing during the decade after World War II and essentially centers on one man: John von Neumann. Von Neumann is one of the very few people in history to whom the label "genius" can authentically be applied. The sheer diversity of fields to which he made important contributions beggars belief; Wikipedia lists at least twenty, ranging from quantum mechanics to game theory to biology. Von Neumann's mind ranged across a staggeringly wide expanse of thought, from the purest of mathematics to the most applied nuclear weapons physics. The book recounts the pathbreaking efforts of von Neumann and his team to build a novel computer at the IAS in the late 1940s. Today, when we are immersed in a sea of computer-generated information, it is easy to take the essential idea of a computer for granted. That idea was not the transistor or the integrated circuit or even the programming language, but the groundbreaking notion that you could have a machine where both data AND the instructions for manipulating that data could be stored in the same place, encoded in a common binary language. That was von Neumann's great insight, which built upon Alan Turing's basic abstract idea of a computing machine. The resulting concept of a stored program is at the foundation of every single computer in the world. The IAS computer practically validated this concept and breathed life into our modern digital universe. By present standards its computing power was vanishingly small, but the technological future it unleashed has been limitless.
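The stored-program idea described above, instructions and data sitting side by side in one memory, can be illustrated with a toy interpreter. The two opcodes here are invented for the sketch; this is not the IAS machine's instruction set:

```python
# Toy stored-program machine: code and data share one flat memory,
# so instructions are just numbers like everything else.
# Hypothetical opcodes: 0 = HALT, 1 = ADD.

def run(memory):
    pc = 0  # program counter: walks through memory four cells at a time
    while True:
        op, a, b, c = memory[pc:pc + 4]
        if op == 0:                            # HALT: stop and return memory
            return memory
        if op == 1:                            # ADD: mem[c] = mem[a] + mem[b]
            memory[c] = memory[a] + memory[b]
        pc += 4

# Instructions and data interleaved in the same memory:
mem = [1, 8, 9, 9,    # ADD mem[8] + mem[9] -> mem[9]
       0, 0, 0, 0,    # HALT
       2, 3]          # data
result = run(mem)
print(result[9])  # 5
```

Because the program lives in ordinary memory cells, it could in principle be read or rewritten by other instructions, which is the property that made stored-program machines qualitatively different from fixed-function calculators.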
Dyson's book excels mainly in three ways. Firstly, it presents a lively history of the IAS, the brilliant minds who worked there and the culture of pure thought that often looked down on von Neumann's practical computational tinkering. Secondly, it discusses the provenance of von Neumann's ideas which partly arose from his need to perform complex calculations of the events occurring in a thermonuclear explosion. These top-secret calculations were quietly run at night on the IAS computer and in turn were used to tweak the computer's workings; as Dyson pithily puts it, "computers built bombs, and bombs built computers". Von Neumann also significantly contributed to the ENIAC computer project at the University of Pennsylvania. Thirdly, Dyson brings us evocative profiles of a variety of colorful and brilliant characters clustered around von Neumann who contributed to the intersection of computing with a constellation of key scientific fields that are now at the cutting edge.
There was the fascinating Stan Ulam, who came up with a novel method for calculating complex processes, the Monte Carlo technique, that is used in everything from economic analysis to biology. Ulam, one of the inventors of thermonuclear weapons, originally used the technique to calculate the multiplication of neutrons in a hydrogen bomb. Then there was Jule Charney, who set up some of the first weather pattern calculations, early forerunners of modern climate models. Charney was trying to implement von Neumann's grand dream of controlling the weather, but neither he nor von Neumann could anticipate chaos and the fundamental sensitivity of weather to tiny fluctuations. Dyson's book also pays due homage to an under-appreciated character, Nils Barricelli, who used the IAS computer to embark on a remarkable set of early experiments that sought to duplicate evolution and artificial life. In the process Barricelli discovered fascinating properties of code, including replication and parasitism, that mirrored some of the great discoveries taking place in molecular biology at the time. As Dyson tells us, there were clear parallels between biology and computing: both depended on sequences of code, although biology thrived on error-prone duplication (leading to variation) while computing actively sought to avoid it. Working on computing and thinking about biology, von Neumann anticipated the genesis of self-reproducing machines, which have fueled the imagination of both science fiction fans and leading researchers in nanotechnology.
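The Monte Carlo idea mentioned above, using repeated random sampling to approximate a quantity that is hard to compute directly, can be shown with the textbook example of estimating π. This is a generic illustration of the method, not Ulam's neutron calculation:

```python
import random

def estimate_pi(n_samples: int, seed: int = 0) -> float:
    """Monte Carlo estimate of pi: throw random points at the unit
    square and count the fraction landing inside the quarter circle."""
    rng = random.Random(seed)  # seeded for reproducibility
    inside = sum(
        1 for _ in range(n_samples)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    # Area of quarter circle / area of square = pi/4
    return 4 * inside / n_samples

print(estimate_pi(200_000))  # roughly 3.14; accuracy improves with more samples
```

The same sampling logic, with neutron physics in place of geometry, is what ran on the early machines: each random trial is cheap, and the average over many trials converges on the answer.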
Finally, Dyson introduces us to the remarkable engineers who were at the heart of the computing projects. Foremost among them was Julian Bigelow, a versatile man who could both understand code and fix a car. Bigelow's indispensable role in building the IAS computer brings up an important point; while von Neumann may have represented the very pinnacle of abstract thought, his computer wouldn't have gotten off the ground had Bigelow and his group of bright engineers not gotten their hands dirty. Great credit also goes to the two lead engineers on the ENIAC project, J. Presper Eckert and John Mauchly, who were rather unfairly relegated to the shadows and sidetracked by history. Dyson rightly places as much emphasis on discussing the nitty-gritty of the engineering hurdles behind the IAS computer as he does on its lofty mathematical underpinnings. He makes it clear that the ascendancy of a revolutionary technology requires both novel theoretical ideas as well as fine craftsmanship. Unfortunately in this case, the craftsmanship was ultimately trampled by the institute's mathematicians and humanists, which only added to its reputation as a refuge for ivory tower intellectuals who considered themselves above pedestrian concerns like engineering. At the end of the computing project the institute passed a resolution which forbade any kind of experimentation from ever taking place; perhaps keeping in line with his son's future interest in the topic, Freeman Dyson (who once worked on a nuclear spaceship and genuinely appreciates engineering details) was one of the few dissenting voices. But this was not before the IAS project spawned a variety of similar machines which partly underlie today's computing technology.
All these accounts are supplemented with gripping stories about weather prediction, the US thermonuclear program, evolutionary biology, and the emigration of European intellectuals like Kurt Gödel and von Neumann to the United States. The book does have its flaws, though. For one thing, it focuses too heavily on von Neumann and the IAS. Dyson says relatively little about Turing himself, about the pioneering computing efforts at Manchester and Cambridge (the first stored-program computer was in fact the Manchester "Baby" machine), and about the equally seminal development of information theory by Claude Shannon. James Gleick's "The Information" and Andrew Hodges's "Alan Turing: The Enigma" might be useful complements to Dyson's volume. In addition, Dyson often meanders into one too many digressions that break the flow of the narrative; for instance, do we really need to know so much about Kurt Gödel's difficulties in obtaining a visa? And do we need to get bogged down in minutiae such as the starting dates and salaries of every member of the project and the list of items on the cafeteria menu? Details like these might put casual readers off.
Notwithstanding these gripes, the book is beautifully written and exhaustively researched with copious quotes from the main characters. It's certainly the most detailed account of the IAS computer project that I have seen. If you want to know about the basic underpinnings of our digital universe, this is a great place to start even with its omissions. All the implications, pitfalls and possibilities of multiple scientific revolutions can be connected in one way or another to that little machine running quietly in a basement in Princeton.
Top reviews from other countries
The first of these is a certain lack of balance. Despite the title, Alan Turing is given only a minor role, and, despite some acknowledgement of British contributions to both the MANIAC project and other early computers, the author clearly takes the view that von Neumann and the IAS were the principal inventors of the modern stored-program computer. This is debatable. British computer developments were ahead of US developments at many stages during this period, including the completion of Colossus ahead of ENIAC, the completion of the Manchester Baby ahead of MANIAC and other early computers, and the introduction of the Ferranti Mark 1 as the first commercially available computer. Von Neumann's "First Draft of a Report on the EDVAC" (1945) was the first published account of the idea of a stored-program computer, and gave rise to the term "von Neumann architecture", which is still used today, but the idea had by then been current for a year or two and others, including Turing, were already experimenting with it. It can be argued that storage, or "memory", was the key innovation that allowed computing to develop and that, once storage was used for intermediate results during a computation, its use to hold programs was an invention waiting to happen. Therefore, the book should be read in conjunction with Andrew Hodges's "Alan Turing: The Enigma" and other books on early computers to arrive at a balanced view.
The second flaw is, unfortunately, more serious. Dyson's view of the "digital universe" is based on his perception of current offerings from companies such as Amazon, Facebook and Google, and on a dystopian interpretation of modern developments in which computers and networks reproduce themselves and become the controllers of mankind rather than its servants, a view more reminiscent of works of science fiction such as The Matrix than of serious history. Several of the later chapters contain uncritical discussions of this theme. Dyson argues that computers have influenced human behaviour, as, of course, has every other new technology, but he also says "Facebook defines who we are; Amazon defines what we want; Google defines what we think." Really? We are just waking up to the fact that these companies pay little or no tax in the UK but, given that their current services are easily fooled, perhaps we don't need to worry about them taking over our minds just yet.
Because of - or perhaps in spite of - this background I found it extremely difficult to review George Dyson's book. The claim on the back cover that the book 'can be read as literature whether or not you have any interest in computers and machine intelligence' is, in my view, grossly misleading and dangerously inaccurate.
For example, we learn on page 301 that (verbatim) "the codes spawned in 1951 have proliferated, but their nature has not changed. They are symbiotic associations of self-reproducing numbers (starting with a primitive alphabet of order codes) that were granted limited, elemental powers, the way a limited alphabet of nucleotide sequences code for an elemental set of amino acids - with polynucleotides, proteins, and everything else that follows developing from there."
This, I submit, is hardly something that can be read as literature. Although I have a reasonable scientific background I had similar difficulties with sections dealing with Monte Carlo statistical techniques, chaos theory in meteorology and with the theory of self-reproducing automata.
The research that Mr Dyson carried out in developing the various chapters is, of course, impressive but I would have found the book far more interesting and informative had he concentrated on developing the subject matter within a chronological timeline - and, even better, had he focused on explaining it rather than simply relying on extremely erudite statements. He also, and very obviously, found it difficult to decide whether to concentrate on:
1. tracing the development of the digital computer itself. If so, the material on the theory of Turing's Universal Machine should appear before page 243 whilst a summary of the prophetic work of Gottfried Leibniz at the end of the 17th century would be better located before pages 103 to 105. There is, admittedly, a large amount of information on the development of various digital components and storage techniques but, unfortunately, this is scattered throughout the book.
or on
2. examining the work of a number of eminent scientists and focusing on how, by applying the evolving digital technology to their research work, they influenced and contributed to the development of that technology. There is a large amount of interesting background information on the scientists themselves (and on the occasional clash of mercurial personalities) including such anecdotal gems as the hospital at Los Alamos charging one dollar a day for diapers. But...
The depth of material in 'Turing's Cathedral' is immense, which, had it been the sole criterion, would have justified a five-star rating. However, the lack of a coherent timeline and the author's difficulty in dealing with highly complex scientific issues reduce my rating to a more than generous three stars.
In my opinion the 1953 book Faster Than Thought: A Symposium on Digital Computing Machines gives a far better overview of developments prior to that date. That edition is, unfortunately, now out of print but 1955, 1957 and 1963 reprints are listed on Amazon. Out of interest the copy on my bookshelf contains, as a bookmark, a receipt dated 3rd December 1953 showing that it cost me £1:16s:3d...!
I skimmed sections that seemed dense in technical details of valves and command lines, but the stories of wives and women working on computer hardware and programmes, plus the vibrant "work hard, play hard" atmosphere in the various campus-type living arrangements were fascinating. Klari von Neumann's narrative was one of the most engaging for me. I also quite like stories of how institutions are shaped, so I wasn't put off by this strand.
A standout comment related to the power of computer processing keeping men honest, because we've all seen how powerful computer models can be created and used dishonestly.
The Manchester University Small Scale Experimental Machine or Baby was repeatedly referred to in the same breath as Colossus and thus was a bit confusing. For instance "the core of the computing group from Bletchley Park were continuing from where their work on Colossus had left off". I (unlike the author who counts Max Newman as the core) imagine that the core of the computing group were the ones who actually designed and built the machine; Williams, Kilburn and Tootill who had all been based at the Telecommunications Research Establishment in Malvern. It isn't the most straightforward of family trees, but these vague references don't help to give people their proper credits or to understand why things came about in the way they did.
Kindle-wise, quite a few of the photos at the end seemed to have become separated from their captions on the following page which is a bit annoying, but I don't remember any particularly awful lay out issues.