Computers rely on such things as semiconductors, memory chips, and electricity. But they also rely on a hard-won body of scientific knowledge that has enabled the now-ubiquitous devices to perform complex calculations, multitask, and even play a game of solitaire.
Martin Davis, a fluent interpreter of mathematics and philosophy, locates the source of this knowledge in the work of the remarkable German thinker G. W. Leibniz, who, among other accomplishments, was a distinguished jurist, mining engineer, and diplomat but found time to invent a contraption called the "Leibniz wheel," a sort of calculator that could carry out the four basic operations of arithmetic. Leibniz subsequently conceived a method of reasoning by calculation called the calculus ratiocinator, an innovation his successor George Boole extended by, in Davis's words, "turning logic into algebra." (Boole emerges as a deeply sympathetic character in Davis's pages, rather than as the dry-as-dust figure of other histories. He explained, Davis reports, that he had turned to mathematics because he had so little money as a student to buy books, and mathematics books provided more value for the money because they took so long to work through.) Davis traces the development of this logic, essential to the advent of "thinking machines," through the workshops and studies of such thinkers as Georg Cantor, Kurt Gödel, and Alan Turing, each of whom puzzled out just a little bit more of the workings of the world--and who, in the bargain, made the present possible. --Gregory McNamee
From Publishers Weekly
This thoroughly enjoyable mix of biographical portraits and theoretical mathematics reveals how a sequence of logicians posed the conceptual questions and contributed the crucial insights resulting in the development of computers long before the technology was available to build even the simplest machines. An intriguing portrait of the great 17th-century mathematician G.W. Leibniz, a pivotal figure in the history of the search for human knowledge, launches this account by New York University professor emeritus Davis (Computability and Unsolvability). Steeped in Aristotelian ideas of perfection but trained in modern engineering, Leibniz conceived the idea of a universal system for determining truth. His contributions to this system are as diverse as the ingenious Leibniz Wheel (an early calculating machine) and the notation used today for calculus. His ideas, in particular his recognition of the deep connection between systems of notation and actual physical devices for performing computation, inspired mathematicians and logicians, including George Boole, Gottlob Frege, Georg Cantor, David Hilbert and Kurt Gödel, until Alan Turing used them to develop the powerful mathematical tools that underlie modern computers as well as some of the earliest computer prototypes. After Leibniz, people thought about the problem of building computational systems; after Turing, people got busy building the machines. Davis has told the fascinating story in between. Full of well-honed anecdotes and telling detail, the book reads like a masterful lecture. Presenting key mathematical ideas in moderate depth, it also offers a solid introduction to the field of computer science that will captivate motivated readers. Agent, Alex Hoyt.
Copyright 2000 Reed Business Information, Inc.