Buy new: $13.56 (29% off)
Delivery Wednesday, August 7
Ships from: Amazon.com. Sold by: Amazon.com.
Save with Used - Good: $9.50
Delivery Thursday, August 8
Ships from: Amazon. Sold by: MOTIF CAFE.
Turing's Cathedral: The Origins of the Digital Universe Paperback – December 11, 2012
A Wall Street Journal Best Business Book of 2012
A Kirkus Reviews Best Book of 2012
In this revealing account of how the digital universe exploded in the aftermath of World War II, George Dyson illuminates the nature of digital computers, the lives of those who brought them into existence, and how code took over the world.
In the 1940s and ’50s, a small group of men and women—led by John von Neumann—gathered in Princeton, New Jersey, to begin building one of the first computers to realize Alan Turing’s vision of a Universal Machine. The codes unleashed within this embryonic, 5-kilobyte universe—less memory than is allocated to displaying a single icon on a computer screen today—broke the distinction between numbers that mean things and numbers that do things, and our universe would never be the same. Turing’s Cathedral is the story of how the most constructive and most destructive of twentieth-century inventions—the digital computer and the hydrogen bomb—emerged at the same time.
- Print length: 464 pages
- Language: English
- Publisher: Vintage
- Publication date: December 11, 2012
- Dimensions: 5.2 x 1 x 8 inches
- ISBN-10: 1400075998
- ISBN-13: 978-1400075997
Customers who bought this item also bought

- Darwin among the Machines: The Evolution of Global Intelligence (Paperback), $12.02 + shipping. Only 8 left in stock (more on the way).
- Analogia: The Emergence of Technology Beyond Programmable Control (Hardcover), $13.28 + shipping. Get it as soon as Thursday, Aug 8. Only 2 left in stock - order soon.
Editorial Reviews
Review
“The best book I’ve read on the origins of the computer. . . not only learned, but brilliantly and surprisingly idiosyncratic and strange.”
—The Boston Globe
“A groundbreaking history . . . the book brims with unexpected detail.”
—The New York Times Book Review
“A technical, philosophical and sometimes personal account . . . wide-ranging and lyrical.”
—The Economist
“The story of the [von Neumann] computer project and how it begat today’s digital universe has been told before, but no one has told it with such precision and narrative sweep.”
—The New York Review of Books
“A fascinating combination of the technical and human stories behind the computing breakthroughs of the 1940s and ’50s. . . . An important work.”
—The Philadelphia Inquirer
“Vivid. . . . [A] detailed yet readable chronicle of the birth of modern computing. . . . Dyson’s book is one small step toward reminding us that behind all the touch screens, artificial intelligences and cerebellum implants lies not sorcery but a machine from the middle of New Jersey.”
—The Oregonian
“Well-told. . . . Dyson tells his story as a sort of intellectual caper film. He gathers his cast of characters . . . and tracks their journey to Princeton. When they converge, it’s great fun, despite postwar food rationing and housing shortages. . . . Dyson is rightly as concerned with the machine’s inventors as with the technology itself.”
—The Wall Street Journal
“Charming. . . . Creation stories are always worth telling, especially when they center on the birth of world-changing powers. . . . Dyson creatively recounts the curious Faustian bargain that permitted mathematicians to experiment with building more powerful computers, which in turn helped others build more destructive bombs.”
—San Francisco Chronicle
“The story of the invention of computers has been told many times, from many different points of view, but seldom as authoritatively and with as much detail as George Dyson has done. . . . Turing’s Cathedral will enthrall computer enthusiasts. . . . Employing letters, memoirs, oral histories and personal interviews, Dyson organizes his book around the personalities of the men (and occasional woman) behind the computer, and does a splendid job in bringing them to life.”
—The Seattle Times
“A powerful story of the ethical dimension of scientific research, a story whose lessons apply as much today in an era of expanded military R&D as they did in the ENIAC and MANIAC era . . . Dyson closes the book with three absolutely, hair-on-neck-standing-up inspiring chapters on the present and future, a bracing reminder of the distance we have come on some of the paths envisioned by von Neumann, Turing, et al.”
—Cory Doctorow, Boing Boing
“No other book about the beginnings of the digital age . . . makes the connections this one does between the lessons of the computer’s origin and the possible paths of its future.”
—The Guardian
“If you want to be mentally prepared for the next revolution in computing, Dyson’s book is a must read. But it is also a must read if you just want a ripping yarn about the way real scientists (at least, some real scientists) work and think.”
—Literary Review
“More than just a great book about science. It’s a great book, period.”
—The Globe and Mail
About the Author
George Dyson is a science historian as well as a boat designer and builder. He is also the author of Baidarka, Project Orion, and Darwin Among the Machines.
Excerpt. © Reprinted by permission. All rights reserved.
Preface
POINT SOURCE SOLUTION
I am thinking about something much more important than bombs. I am thinking about computers.
—John von Neumann, 1946
There are two kinds of creation myths: those where life arises out of the mud, and those where life falls from the sky. In this creation myth, computers arose from the mud, and code fell from the sky.
In late 1945, at the Institute for Advanced Study in Princeton, New Jersey, Hungarian American mathematician John von Neumann gathered a small group of engineers to begin designing, building, and programming an electronic digital computer, with five kilobytes of storage, whose attention could be switched in 24 microseconds from one memory location to the next. The entire digital universe can be traced directly to this 32-by-32-by-40-bit nucleus: less memory than is allocated to displaying a single icon on a computer screen today.
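The arithmetic behind that figure is easy to check. A minimal sketch, using the 32-by-32-by-40-bit dimensions given above (which correspond to the IAS machine's 40 Williams tubes, each storing a 32 x 32 grid of charged spots):

```python
# The 32-by-32-by-40-bit storage matrix described above:
# 1,024 addressable words of 40 bits each.
words = 32 * 32              # 1,024 words
bits = words * 40            # 40,960 bits in total
kilobytes = bits / 8 / 1024  # 8 bits per byte, 1,024 bytes per kilobyte

print(f"{words} words x 40 bits = {bits:,} bits = {kilobytes:.0f} KB")
# -> 1024 words x 40 bits = 40,960 bits = 5 KB
```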
Von Neumann’s project was the physical realization of Alan Turing’s Universal Machine, a theoretical construct invented in 1936. It was not the first computer. It was not even the second or third computer. It was, however, among the first computers to make full use of a high-speed random-access storage matrix, and became the machine whose coding was most widely replicated and whose logical architecture was most widely reproduced. The stored-program computer, as conceived by Alan Turing and delivered by John von Neumann, broke the distinction between numbers that mean things and numbers that do things. Our universe would never be the same.
Working outside the bounds of industry, breaking the rules of academia, and relying largely on the U.S. government for support, a dozen engineers in their twenties and thirties designed and built von Neumann’s computer for less than $1 million in under five years. “He was in the right place at the right time with the right connections with the right idea,” remembers Willis Ware, fourth to be hired to join the engineering team, “setting aside the hassle that will probably never be resolved as to whose ideas they really were.”
As World War II drew to a close, the scientists who had built the atomic bomb at Los Alamos wondered, “What’s next?” Some, including Richard Feynman, vowed never to have anything to do with nuclear weapons or military secrecy again. Others, including Edward Teller and John von Neumann, were eager to develop more advanced nuclear weapons, especially the “Super,” or hydrogen bomb. Just before dawn on the morning of July 16, 1945, the New Mexico desert was illuminated by an explosion “brighter than a thousand suns.” Eight and a half years later, an explosion one thousand times more powerful illuminated the skies over Bikini Atoll. The race to build the hydrogen bomb was accelerated by von Neumann’s desire to build a computer, and the push to build von Neumann’s computer was accelerated by the race to build a hydrogen bomb.
Computers were essential to the initiation of nuclear explosions, and to understanding what happens next. In “Point Source Solution,” a 1947 Los Alamos report on the shock waves produced by nuclear explosions, von Neumann explained that “for very violent explosions . . . it may be justified to treat the original, central, high pressure area as a point.” This approximated the physical reality of a nuclear explosion closely enough to enable some of the first useful predictions of weapons effects.
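To give a flavor of what the point-source approximation buys: it leads to the classic Taylor-von Neumann-Sedov scaling law, in which the blast radius grows as R(t) ~ (E t^2 / rho)^(1/5). A rough sketch of the inverse calculation, using the well-known published Trinity figures rather than numbers from this book, and dropping the order-one dimensionless constant:

```python
# Taylor-von Neumann-Sedov point-source blast wave: R(t) ~ (E * t**2 / rho)**0.2
# Inverted to estimate the yield E from an observed fireball radius.
rho = 1.2            # sea-level air density, kg/m^3
radius = 140.0       # Trinity fireball radius in meters at t = 25 ms (published photos)
t = 0.025            # time after detonation, s

energy = rho * radius**5 / t**2     # joules, back-of-the-envelope
kilotons = energy / 4.184e12        # 1 kt TNT = 4.184e12 J

print(f"E ~ {energy:.2e} J ~ {kilotons:.0f} kt")   # -> ~25 kt, near Trinity's ~21 kt
```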
Numerical simulation of chain reactions within computers initiated a chain reaction among computers, with machines and codes proliferating as explosively as the phenomena they were designed to help us understand. It is no coincidence that the most destructive and the most constructive of human inventions appeared at exactly the same time. Only the collective intelligence of computers could save us from the destructive powers of the weapons they had allowed us to invent.
Turing’s model of universal computation was one-dimensional: a string of symbols encoded on a tape. Von Neumann’s implementation of Turing’s model was two-dimensional: the address matrix underlying all computers in use today. The landscape is now three-dimensional, yet the entire Internet can still be viewed as a common tape shared by a multitude of Turing’s Universal Machines.
Where does time fit in? Time in the digital universe and time in our universe are governed by entirely different clocks. In our universe, time is a continuum. In a digital universe, time (T) is a countable number of discrete, sequential steps. A digital universe is bounded at the beginning, when T = 0, and at the end, if T comes to a stop. Even in a perfectly deterministic universe, there is no consistent method to predict the ending in advance. To an observer in our universe, the digital universe appears to be speeding up. To an observer in the digital universe, our universe appears to be slowing down.
Universal codes and universal machines, introduced by Alan Turing in his “On Computable Numbers, with an Application to the Entscheidungsproblem” of 1936, have prospered to such an extent that Turing’s underlying interest in the “decision problem” is easily overlooked. In answering the Entscheidungsproblem, Turing proved that there is no systematic way to tell, by looking at a code, what that code will do. That’s what makes the digital universe so interesting, and that’s what brings us here.
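That result, the undecidability of the halting problem, can be sketched in a few lines of modern code: if a universal halts-checker existed, one could build a program that contradicts its own verdict. A minimal sketch of the diagonal argument; the function `halts` is hypothetical, which is exactly the point:

```python
def halts(program, argument) -> bool:
    """Hypothetical oracle: True iff program(argument) eventually halts.
    Turing proved no such function can exist."""
    raise NotImplementedError  # cannot be implemented, as shown below

def contrary(program):
    # Do the opposite of whatever the oracle predicts about
    # this program being run on its own source.
    if halts(program, program):
        while True:   # predicted to halt, so loop forever
            pass
    return            # predicted to loop, so halt at once

# contrary(contrary) halts if and only if it does not halt: a contradiction,
# so no general halts() can exist. There is no systematic way to tell,
# by looking at a code, what that code will do.
```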
It is impossible to predict where the digital universe is going, but it is possible to understand how it began. The origin of the first fully electronic random-access storage matrix, and the propagation of the codes that it engendered, is as close to a point source as any approximation can get.
Product details
- Publisher: Vintage; First Edition (December 11, 2012)
- Language: English
- Paperback: 464 pages
- ISBN-10: 1400075998
- ISBN-13: 978-1400075997
- Item Weight: 15.2 ounces
- Dimensions: 5.2 x 1 x 8 inches
- Best Sellers Rank: #180,746 in Books
- #44 in Computing Industry History
- #432 in Scientist Biographies
- #571 in History & Philosophy of Science (Books)
Customer reviews
Customers say
Customers find the book full of information and interesting history, and many call it mesmerizing, though some find the title misleading. Opinions differ on the writing quality: some find it wonderful and eloquent, while others call it incoherent and wishy-washy.
AI-generated from the text of customer reviews
Customers find the book rich in information, with excellent insight into the people and events surrounding the beginnings of electronic technology and deep thoughts on artificial intelligence and the future of humanity. They also say the author has done an amazing job cataloging events that have changed our world.
"...these gripes, the book is beautifully written and exhaustively researched with copious quotes from the main characters...." Read more
"...and storyteller, bringing characters to life as complete and complex people, capturing their personal histories, motivations, and interactions with..." Read more
"...This book is well researched and includes some fascinating stories about some of the icons of science and the engineering involved with the digital..." Read more
"...Worse, there is way too much information; the book doesn't really need to be 400 pages long and should have been better-edited...." Read more
Customers find the history fascinating, capturing personal histories and motivations. They say the chapter on Alan Turing is particularly good, covering much of his work. Readers also mention that the story is full of remarkably odd people and unexpected twists and turns.
"...The chapter on Alan Turing is particularly good, covering as it does much of his work that has been neglected in biographies and presenting a much..." Read more
"...Dyson's book excels mainly in three ways. Firstly, it presents a lively history of the IAS, the brilliant minds who worked there and the culture of..." Read more
"...characters to life as complete and complex people, capturing their personal histories, motivations, and interactions with each other...." Read more
"...This book is well researched and includes some fascinating stories about some of the icons of science and the engineering involved with the digital..." Read more
Customers have mixed opinions about the writing quality. Some find it wonderful, eloquent, and understandable, while others find it tedious and cumbersome, and say the discussion of computer intelligence is wishy-washy and not grounded in clear facts.
"...He does an exceptional job of explaining difficult topics like Godel incompleteness, the problems of separating noise from data, and the notion of..." Read more
"...Notwithstanding these gripes, the book is beautifully written and exhaustively researched with copious quotes from the main characters...." Read more
"...The misleading title, the overabundance of info and poor editing: usually these are not enough to stop me in a case like this...." Read more
"...First, the author is simply a gifted writer and storyteller, bringing characters to life as complete and complex people, capturing their personal..." Read more
Customers are split on how enjoyable the book is. Some find it insightful and entertaining, while others say it's boring and excessively long.
"...Overall enjoyable and informative...." Read more
"...Turing's Cathedral" is the uninspiring and rather dry book about the origins of the digital universe...." Read more
"...biographical, and the technical, the reader gets a well rounded, and fun, picture of how the first sparks of the computer age were lit...." Read more
"This is an entertaining account of how the digital computer was developed in the US mixing bits of history, biography, and science...." Read more
Customers find the title misleading, since Alan Turing is only a minor character in the book. They also mention that the chapters are thematic rather than chronological, so topics are presented multiple times.
"...The chapters are not strictly chronological but rather thematic, which works somehow, so that by the book's end, multiple themes are equally..." Read more
"...The author lacked panache.2. The title is misleading. This title is a metaphor regarding Google's headquarters in California...." Read more
"...The misleading title, the overabundance of info and poor editing: usually these are not enough to stop me in a case like this...." Read more
"...The book is misrepresented – The title of the book implies that it is about Alan Turing and computers, but Alan Turing is only a minor character...." Read more
Top reviews from the United States
Turing and von Neumann make their appearances here, of course, along with Mauchly, Eckert, Oppenheimer, Ulam, Freeman Dyson (the author's father), and other notables of the era. But Dyson also tells the story of a number of pioneers and contributors to the design, construction, and most of all the theory of computation who have been overlooked by history. Most remarkable, perhaps, is Nils Barricelli, who could justifiably be called the founder of computational biology. Working in the early 1950s with a computer having less computational power and memory than a modern-day sewing machine, he created a one-dimensional artificial universe in order to explore the relative power of mutation and symbiosis in the evolution of organisms. His work led to a number of original discoveries and conclusions that would only be rediscovered or proposed decades later, such as the notion that genes originated as independent organisms, like viruses, that combined to create more complex organisms.
There's an entire chapter on a vacuum tube, the lowly 6J6, a dual triode created during the war that combined several elements necessary for the creation of a large-scale computer: simplicity, ruggedness, and economy. It fulfilled one of von Neumann's guiding principles for ENIAC: don't invent anything. That is, don't waste time inventing where solutions already exist. By the nature of its relative unreliability and wide production tolerances relative to project goals, it also helped stimulate a critical line of research: how to create reliable systems from unreliable components, something more important now than ever in this era of microprocessors and memory chips with millions and even billions of components on a chip.
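Von Neumann's eventual answer to that research question, set out in his 1956 paper "Probabilistic Logics and the Synthesis of Reliable Organisms from Unreliable Components", was redundancy with majority voting. A toy illustration of the principle, in Python, with the error rate and copy count invented for the demo:

```python
import random

def flaky_gate(x: bool, error_rate: float = 0.05) -> bool:
    """An unreliable component: passes its input through, but is wrong 5% of the time."""
    return (not x) if random.random() < error_rate else x

def majority_of(x: bool, copies: int = 9) -> bool:
    """Run several unreliable copies in parallel and take a majority vote."""
    votes = sum(flaky_gate(x) for _ in range(copies))
    return votes > copies // 2

trials = 100_000
single_errors = sum(not flaky_gate(True) for _ in range(trials)) / trials
voted_errors = sum(not majority_of(True) for _ in range(trials)) / trials
print(f"single component: {single_errors:.2%} wrong; majority of 9: {voted_errors:.4%} wrong")
# Redundancy turns a ~5% component error rate into a ~0.003% system error rate.
```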
The chapter on Alan Turing is particularly good, covering as it does much of his work that has been neglected in biographies and presenting a much more accurate description of his work and his contributions to computational science. The great importance of his conceptual computer, the "Turing Machine," is not, as is commonly stated in popular works, that it can perform the work of any other computer. It is that it demonstrated how any possible computing machine can be represented as a number, and vice versa. This allowed him to construct a proof that there exist uncomputable strings, i.e., programs for which it cannot be determined a priori whether they will eventually halt. This built directly on Gödel's work on the incompleteness of formal systems, and settled Hilbert's Entscheidungsproblem in the negative.
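The machines-as-numbers point is easy to make concrete: any finite rule table can be serialized into a single integer and recovered from it, so every machine has a number and every suitable number names a machine. A toy encoding along those lines (the scheme below is invented for illustration; Turing's actual "description numbers" work on the same principle):

```python
# A two-state machine as a rule table: (state, symbol) -> (state, symbol, move)
machine = {("q0", "0"): ("q1", "1", "R"),
           ("q1", "1"): ("q0", "0", "L")}

def encode(rules: dict) -> int:
    """Serialize the rule table to text, then read the bytes as one integer."""
    text = ";".join(",".join(key + action) for key, action in sorted(rules.items()))
    return int.from_bytes(text.encode(), "big")

def decode(number: int) -> dict:
    """Invert encode(): bytes back to text, text back to a rule table."""
    text = number.to_bytes((number.bit_length() + 7) // 8, "big").decode()
    fields = (rule.split(",") for rule in text.split(";"))
    return {(s, r): (s2, w, m) for s, r, s2, w, m in fields}

n = encode(machine)
assert decode(n) == machine   # machine -> number -> the same machine
print(n)                      # one (very large) integer that names the machine
```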
What makes this a particularly exceptional book is the manner in which Dyson connects the stories of the individuals involved in the birth of electronic computing with the science itself. He does an exceptional job of explaining difficult topics like Gödel incompleteness, the problems of separating noise from data, and the notion of computability in a way that the intelligent reader who may not have advanced math skills will understand. More importantly, he understands the material well enough to know which concepts and accomplishments of these pioneers of computing are the critical ones, and doesn't fall into the trap of repeating the errors of far too many popular science writers. The result is a thoroughly original, accurate, and tremendously enjoyable history. Strongly recommended to anyone curious about the origins of computers and, more importantly, the science of computing itself.
Dyson's book focuses on the pioneering development of computing during the decade after World War II and essentially centers on one man: John von Neumann. Von Neumann is one of the very few people in history to whom the label "genius" can authentically be applied. The sheer diversity of fields to which he made important contributions beggars belief; Wikipedia lists at least twenty, ranging from quantum mechanics to game theory to biology. Von Neumann's mind ranged across a staggeringly wide expanse of thought, from the purest of mathematics to the most applied nuclear weapons physics. The book recounts the pathbreaking efforts of von Neumann and his team to build a novel computer at the IAS in the late 1940s. Today, when we are immersed in a sea of computer-generated information, it is easy to take the essential idea of a computer for granted. That idea was not the transistor or the integrated circuit or even the programming language, but the groundbreaking notion that you could have a machine where both data AND the instructions for manipulating that data could be stored in the same place, encoded in a common binary language. That was von Neumann's great insight, which built upon Alan Turing's basic abstract idea of a computing machine. The resulting concept of a stored program is at the foundation of every single computer in the world. The IAS computer practically validated this concept and breathed life into our modern digital universe. By present standards its computing power was vanishingly small, but the technological future it unleashed has been limitless.
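A few lines of code make that insight tangible: in a stored-program machine, instructions are just numbers living in the same memory as the data they manipulate. A minimal sketch of such a machine, with a four-opcode instruction set invented purely for this demo:

```python
# One memory array holds both the program and its data.
# Invented instruction format: pairs of [opcode, operand].
#   0 = halt, 1 = load mem[n] into acc, 2 = add mem[n] to acc, 3 = store acc to mem[n]
memory = [1, 8,    # addresses 0-1: load  mem[8]
          2, 9,    # addresses 2-3: add   mem[9]
          3, 10,   # addresses 4-5: store acc into mem[10]
          0, 0,    # addresses 6-7: halt
          2, 3,    # addresses 8-9: the data being added
          0]       # address 10: the result lands here

acc, pc = 0, 0
while True:
    op, arg = memory[pc], memory[pc + 1]   # instructions fetched from the same memory
    pc += 2
    if op == 0:
        break
    elif op == 1:
        acc = memory[arg]
    elif op == 2:
        acc += memory[arg]
    elif op == 3:
        memory[arg] = acc

print(memory[10])   # -> 5: code and data share one address space
```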
Dyson's book excels mainly in three ways. Firstly, it presents a lively history of the IAS, the brilliant minds who worked there and the culture of pure thought that often looked down on von Neumann's practical computational tinkering. Secondly, it discusses the provenance of von Neumann's ideas which partly arose from his need to perform complex calculations of the events occurring in a thermonuclear explosion. These top-secret calculations were quietly run at night on the IAS computer and in turn were used to tweak the computer's workings; as Dyson pithily puts it, "computers built bombs, and bombs built computers". Von Neumann also significantly contributed to the ENIAC computer project at the University of Pennsylvania. Thirdly, Dyson brings us evocative profiles of a variety of colorful and brilliant characters clustered around von Neumann who contributed to the intersection of computing with a constellation of key scientific fields that are now at the cutting edge.
There was the fascinating Stan Ulam, who came up with a novel method for calculating complex processes, the Monte Carlo technique, which is used in everything from economic analysis to biology. Ulam, who was one of the inventors of thermonuclear weapons, originally used the technique to calculate the multiplication of neutrons in a hydrogen bomb. Then there was Jule Charney, who set up some of the first weather pattern calculations, early forerunners of modern climate models. Charney was trying to implement von Neumann's grand dream of controlling the weather, but neither he nor von Neumann could anticipate chaos and the fundamental sensitivity of weather to tiny fluctuations. Dyson's book also pays due homage to an under-appreciated character, Nils Barricelli, who used the IAS computer to embark on a remarkable set of early experiments that sought to duplicate evolution and artificial life. In the process Barricelli discovered fascinating properties of code, including replication and parasitism, that mirrored some of the great discoveries taking place in molecular biology at the time. As Dyson tells us, there were clear parallels between biology and computing: both depended on sequences of code, although biology thrived on error-prone duplication (leading to variation) while computing actively sought to avoid it. Working on computing and thinking about biology, von Neumann anticipated the genesis of self-reproducing machines, which have fueled the imagination of both science fiction fans and leading researchers in nanotechnology.
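The Monte Carlo technique the reviewer mentions is simple enough to show in miniature. The classic classroom version estimates pi by random sampling; Ulam's neutron calculations were far more elaborate, but rested on the same trick of replacing an intractable calculation with repeated random trials:

```python
import random

def estimate_pi(samples: int = 1_000_000) -> float:
    """Monte Carlo estimate of pi: throw random points at the unit square
    and count the fraction landing inside the inscribed quarter circle."""
    inside = sum(random.random() ** 2 + random.random() ** 2 <= 1.0
                 for _ in range(samples))
    return 4 * inside / samples

print(estimate_pi())   # -> about 3.14, and the estimate sharpens as samples grow
```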
Finally, Dyson introduces us to the remarkable engineers who were at the heart of the computing projects. Foremost among them was Julian Bigelow, a versatile man who could both understand code and fix a car. Bigelow's indispensable role in building the IAS computer brings up an important point; while von Neumann may have represented the very pinnacle of abstract thought, his computer wouldn't have gotten off the ground had Bigelow and his group of bright engineers not gotten their hands dirty. Great credit also goes to the two lead engineers on the ENIAC project, J. Presper Eckert and John Mauchly, who were rather unfairly relegated to the shadows and sidetracked by history. Dyson rightly places as much emphasis on discussing the nitty-gritty of the engineering hurdles behind the IAS computer as he does on its lofty mathematical underpinnings. He makes it clear that the ascendancy of a revolutionary technology requires both novel theoretical ideas as well as fine craftsmanship. Unfortunately in this case, the craftsmanship was ultimately trampled by the institute's mathematicians and humanists, which only added to its reputation as a refuge for ivory tower intellectuals who considered themselves above pedestrian concerns like engineering. At the end of the computing project the institute passed a resolution which forbade any kind of experimentation from ever taking place; perhaps keeping in line with his son's future interest in the topic, Freeman Dyson (who once worked on a nuclear spaceship and genuinely appreciates engineering details) was one of the few dissenting voices. But this was not before the IAS project spawned a variety of similar machines which partly underlie today's computing technology.
All these accounts are supplemented with gripping stories about weather prediction, the US thermonuclear program, evolutionary biology, and the emigration of European intellectuals like Kurt Gödel and von Neumann to the United States. The book does have its flaws, though. For one thing, it focuses too heavily on von Neumann and the IAS. Dyson says relatively little about Turing himself, about the pioneering computing efforts at Manchester and Cambridge (the first stored-program computer was in fact the Manchester "Baby" machine), and about the equally seminal development of information theory by Claude Shannon. James Gleick's "The Information" and Andrew Hodges's "Alan Turing: The Enigma" might be useful complements to Dyson's volume. In addition, Dyson often meanders into one too many digressions that break the flow of the narrative; for instance, do we really need to know so much about Kurt Gödel's difficulties in obtaining a visa? And do we need to get bogged down in minutiae such as the starting dates and salaries of every member of the project and the list of items on the cafeteria menu? Details like these might put casual readers off.
Notwithstanding these gripes, the book is beautifully written and exhaustively researched with copious quotes from the main characters. It's certainly the most detailed account of the IAS computer project that I have seen. If you want to know about the basic underpinnings of our digital universe, this is a great place to start even with its omissions. All the implications, pitfalls and possibilities of multiple scientific revolutions can be connected in one way or another to that little machine running quietly in a basement in Princeton.
Top reviews from other countries
Bought this book after buying The MANIAC by Benjamin Labatut.