Boltzmann's Atom: The Great Debate That Launched a Revolution in Physics Hardcover – January 18, 2001
- Print length: 272 pages
- Language: English
- Publisher: Free Press
- Publication date: January 18, 2001
- Dimensions: 9 x 1 x 6 inches
- ISBN-10: 0684851865
- ISBN-13: 978-0684851860
Editorial Reviews
Amazon.com Review
Opposed by the then-influential physicist and philosopher Ernst Mach, who urged that scientists stick to classical thermodynamics, Boltzmann was hard-pressed to convince his colleagues that the behavior of atoms could be explained by laws thought to apply only to the gaming table. Mach objected, and with some cause, that "the fact that the theory worked was not enough to prove that the assumptions on which the theory rested were true." It would take the next generation of scientists, among them Albert Einstein, to provide more solid proof for Boltzmann's hunches. And, while Mach's contributions to physics have largely been superseded, Boltzmann's endure in quantum mechanics and the Maxwell-Boltzmann distribution for the velocities of atoms in a gas. In this lively account, David Lindley tells the story of Boltzmann's many failures, and of his eventual success. --Gregory McNamee
Excerpt. © Reprinted by permission. All rights reserved.
Chapter 1: A Letter from Bombay

Lessons in Obscurity
On December 11, 1845, a lengthy manuscript arrived in the London offices of the Royal Society, the highest scientific association in Great Britain. The author of this work hoped his essay might be published in the Society's august Philosophical Transactions, and the manuscript, by standard practice, was duly sent to a couple of experts for evaluation of its worth. "Nothing but nonsense" was the verdict of one of these eminent reviewers. The other allowed that the paper demonstrated "much skill and many remarkable accordances with the general facts," but concluded nevertheless that the ideas were "entirely hypothetical" and, in the end, "very difficult to admit."
On these recommendations, the manuscript was never published. Worse still, the author, one John James Waterston, never found out what had happened. Waterston was living at the time in Bombay, teaching navigation and gunnery to naval cadets employed by the East India Company. Born and educated in Edinburgh, he spent his life working as a civil engineer and teacher, retiring from his position in India in 1857 to return to Scotland, where he lived modestly on his savings and continued to dabble in science: astronomy, chemistry, and physics. He was known during his lifetime, if at all, as one of the numerous amateurs of Victorian science, working in isolation, contributing from time to time ideas that were more or less sound but of no great consequence.
His rejected manuscript of 1845 embodied Waterston's one truly innovative and profound piece of work, but it was ahead of its time. Only by a few years, admittedly, but that was enough to ensure its unhappy reception by the experts of the Royal Society. Waterston proposed that any gas consisted of numerous tiny particles -- he called them molecules -- bouncing around and colliding with each other. He showed that the energy of motion in these particles corresponded to the temperature of the gas, and that the incessant impacts of the particles on the walls of the container gave rise to the effect commonly known as pressure. There was more: Waterston calculated the "elasticity" of gases (their ability to flow, roughly speaking) from his model, and he made the subtle observation that in a mixture of different gases all the tiny particles would, on average, have the same energy, so that heavier molecules would move more slowly than lighter ones. He was not right in every detail, but his general arguments and suppositions have survived the test of time. Waterston's fundamental idea, that a gas is made of tiny, colliding particles whose microscopic behavior produces the measurable properties of the gas as a whole, was exactly right.
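Waterston's equipartition observation can be illustrated with a short numerical sketch. Nothing below is from his 1845 manuscript; the constants are modern values, used only to show that if molecules of different gases share the same average kinetic energy at a given temperature, the heavier ones must indeed move more slowly.

```python
import math

# Modern constants, used only to illustrate Waterston's argument.
K_B = 1.380649e-23      # Boltzmann constant, J/K
N_A = 6.02214076e23     # Avogadro's number, 1/mol

def v_rms(molar_mass_kg, temperature_k):
    """Root-mean-square speed from equal average kinetic energy:
    (1/2) m <v^2> = (3/2) k T  =>  v_rms = sqrt(3 k T / m)."""
    m = molar_mass_kg / N_A          # mass of one molecule, kg
    return math.sqrt(3 * K_B * temperature_k / m)

T = 300.0                            # room temperature, kelvin
v_h2 = v_rms(0.002, T)               # hydrogen, 2 g/mol
v_o2 = v_rms(0.032, T)               # oxygen, 32 g/mol

# Same average energy per molecule, so speeds scale as 1/sqrt(mass):
print(round(v_h2 / v_o2, 2))         # sqrt(32/2) = 4.0
```

Hydrogen molecules, sixteen times lighter than oxygen molecules, move four times as fast at the same temperature: exactly the "subtle observation" about mixtures that Waterston made qualitatively.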
Waterston's calculations were somewhat rough and ready, and his proofs were not quite solid. It may have been these deficiencies that led to the rejection of his paper -- that, and the fact that his name was unknown. It was certainly not revolutionary, in the middle of the 19th century, to suggest that gases consisted of tiny particles. The terms atom and molecule were known in scientific circles, although they designated objects whose true nature was unclear. Even the idea that the motion and collision of these particles had something to do with temperature and pressure was not altogether new. The Royal Society, admirably consistent, had in fact rejected a very similar proposal some 25 years earlier. The author of this earlier attempt was John Herapath, another unsung amateur of Victorian science and engineering. His work was by no means as sophisticated as Waterston's, but he had the right general idea: heat equals the motion of atoms or molecules. He wrote up his ideas in 1820 and sent them to the Royal Society. The chemist Humphry Davy, then president of the society, declined to publish the paper. Though he was not unsympathetic to atomic thinking, Davy found Herapath's calculations unconvincing, and in truth, Herapath was confused about the mechanics of atomic collisions and came up with an incorrect formula for the temperature of a gas. Still, Herapath succeeded in getting accounts of his work published in other scientific journals, where they were roundly ignored by the scientific community of that age.
Waterston knew of Herapath's work, and of his erroneous formula for temperature, but neither of these two men, it appears, was aware that the atomic picture of a gas was close to a century old by the time they came to it. In 1738 Daniel Bernoulli, one of an extended Swiss clan of Bernoullis that made notable contributions to both mathematics and physics, succeeded in deriving theoretically a relationship between the pressure exerted by a gas and the energy of vibration of the supposed atoms within it. His theory attracted little attention, and was soon forgotten.
Bernoulli's was the first modern atomic or molecular model of a gas. He explained pressure in terms of atomic motion, but not temperature, largely because the nature of heat itself was quite mysterious in Bernoulli's day. Even so, neither he nor Herapath nor Waterston can take any credit for the idea of atoms themselves. They were the inheritors of a centuries-old tradition in natural philosophy according to which everything in the universe is composed fundamentally of minute, indivisible objects. The word atom is of Greek origin, meaning "uncuttable," and it is from ancient Greece that the idea itself descends.
Knowledge of the atomic hypothesis from ancient times is owed largely to the survival of a long poem called De Rerum Natura (On the Nature of Things) by the Roman writer Lucretius. The names of both this poem and its author had faded into oblivion in the centuries after the fall of Rome, but a church official traveling around the monasteries of France and Germany in the 15th century happened across a copy (not an original) and brought it back to the Vatican in 1417. Manuscripts dating back to the 9th or 10th century were subsequently rediscovered and found to be substantially the same as the Vatican copy. From these versions descend all modern editions of De Rerum Natura. Its author, Titus Lucretius Carus, lived from about 95 to 55 B.C. The six books of his great opus lay out a philosophical reflection on life as well as an exposition of a scientific hypothesis. It is fiercely atheistic. It enjoyed a good deal of renown in its time, but was later attacked by the Emperor Augustus in his attempt to restore some of the faded glory of the declining Roman world by reviving the ancient pre-Christian religion.
Lucretius derived his atheism from his adherence to what can be called, with the benefit of two millennia of hindsight, an atomic theory of the natural world. For example:
clothes hung above a surf-swept shore
grow damp; spread in the sun they dry again.
Yet it is not apparent to us how
the moisture clings to the cloth, or flees the heat.
Water, then, is dispersed in particles,
atoms too small to be observable.
In other words, a wet garment has atoms (we would now say molecules) of water clinging to its fabric; heat drives the atoms off, and thus dries the material. An atomic theory of clothes drying seems to be some way from disproving the existence of deities, but Lucretius goes on to observe that the atoms have no volition, and instead move willy-nilly:
For surely the atoms did not hold council, assigning
order to each, flexing their keen minds with
questions of place and motion and who goes where.
But shuffled and jumbled in many ways, in the course
of endless time they are buffeted, driven along,
chancing upon all motions, combinations.
At last they fall into such an arrangement
as would create this universe...
Examined closely, Lucretius says, the range and variety of all the familiar phenomena of the world about us arise from invisible atoms zipping aimlessly this way and that. No need for gods to direct events, or inspire actions and consequences. On the other hand, Lucretius's vision seems to leave little room for human decision or free will either. If the universe takes its course because atoms are following their random paths, then neither gods nor human beings have any control over their destinies; what will happen, will happen, and there is nothing anyone can do to change it.
This is a bleak form of atheism, implying what is nowadays called determinism, meaning that what happens in the future is wholly determined by what has happened in the past. To Lucretius and his followers this view was nevertheless a liberation. In their day the gods were fickle, cruel, and capricious, more inclined to pranks and practical jokes than to love or compassion. The citizens of Rome decidedly did not wish for a god to enter their lives. To believe, as Lucretius insisted, that there were no gods, and that the world proceeded for good or ill quite indifferent to human desires, was by contrast to achieve a measure of repose through calm acceptance. Even death was not to be feared: when the atoms of one's soul and body were forever dispersed, there could be no sensation, no pain. Compared to being taunted or tortured for all eternity by frivolous, merciless gods, that was indeed a blessing.
In his philosophy, based on atomism, Lucretius found a reason to give up the struggle against blind fate, and to live instead with equanimity in the world as it was. He lived in the time of Julius Caesar, when the Roman republic was failing. Tyrants, wayward generals, and corrupt politicians would thereafter take over. Peace was to be found in withdrawing as far as possible from the vicissitudes of life. Whether Lucretius was able to live according to his own recommendation is doubtful. He suffered periods of insanity or mental disturbance, and killed himself when he was about 40 years old. In a story handed down by St. Jerome, Lucretius was so much and so often wrapped in thought that his wife grew resentful, and to restore marital relations secretly gave him a love potion. Unfortunately, the potion was stronger than necessary, drove him mad, and thus impelled him to suicide. Tennyson wrote a poem about the poet and described the reasons for his wife's unhappiness:
Yet often when the woman heard his foot
return from pacings in the field, and ran
to greet him with a kiss, the master took
small notice, or austerely, for -- his mind
half-buried in some weightier argument,
or fancy-borne perhaps on the rise
and long roll of the hexameter -- he past
to turn and ponder those three hundred scrolls
left by the Teacher, whom he held divine.
This Teacher, the man by the contemplation of whose scrolls Lucretius earned his wife's displeasure, was the philosopher Epicurus, whose name survives in the notion of putting pleasure foremost among one's goals in life. A contemporary critic sniped that the ideal Epicurean way of life consisted of "eating, drinking, copulation, evacuation, and snoring," but there was more to it than that. Epicurus aimed for what might better be called contentedness, which meant freedom from pain and satiation of one's desires rather than any sort of unbridled hedonistic pleasure seeking.
To Epicurus, the greatest fear in life was the fear of death, or rather the fear of an unendurable afterlife that nevertheless had to be endured. As Lucretius reports, Epicurus employed the notion of atoms to argue that death was the final release from suffering, to be regretted, perhaps, but not feared. Lucretius differed from his teacher in one significant way: he went from atomism to atheism, but Epicurus still believed in the gods, and found the determinism of the atomic philosophy not to his taste. For that reason he introduced what seems now a rather odd idea:
When the atoms are carried straight down through the void
by their own weight, at an utterly random time
and a random point in space they swerve a little,
only enough to call it a tilt in motion.
Lucretius goes on to indicate that these "swerves" in the motion of atoms are what cause the atoms to cluster together or collide or otherwise interact in ways that can produce natural phenomena. The main point, however, was apparently to get around strict determinism by allowing atoms to alter their trajectories spontaneously, without any immediate cause. Perhaps this restores free will, or the ability of the gods to meddle, but it strikes the modern reader as an "unscientific" addition to the theory.
It was, indeed, Epicurus's own ill-considered addition. He did not dream up the notion of atoms, but got them from a still earlier source, in the writings of the Greek philosopher Democritus, and his teacher, Leucippus.
Of Leucippus little is known except that he flourished and taught in the years following 440 B.C. in what is now Turkey. His pupil, Democritus, lived from about that time until 371 B.C., mostly in northern Greece, and whether the beginnings of atomism should properly be credited to him or to Leucippus is impossible to say, since the latter's teaching is preserved only in the former's writings. Nevertheless, between the two of them, they put together what we can easily -- perhaps too easily -- see as the first intimation of a recognizably modern theory of atoms. They proposed that there exists a void, and in this void atoms move about, always in motion. Atom and void are all there is. The atoms come in a variety of distinct types and are indivisible; they band together in different ways to create the tangible and visible ingredients of the world.
To Democritus it was evident that there could be no up or down in an infinite void, and he therefore proposed that atoms move endlessly in all directions, changing course only when they ran into each other. But this implies determinism: once the atoms are off and running, their courses are fixed. There is still room for a deity at the beginning -- a prime mover, an uncaused cause, or some other extraphysical influence that sets the atoms up and pushes them off in certain directions -- but once that's done, determinism takes over. Does this mean there is no free will or volition? That the future is completely determined by the past? That question has haunted atomic theory, indeed physics in general, since the time of Democritus, and haunts us still today.
What distinguished Leucippus and Democritus from most of their contemporaries, and from almost all of the thinkers who followed them over the next two millennia, was that they were mainly interested in trying to understand how the world worked. Other philosophers began to focus their attention not so much on the universe as on the position of human beings in the universe, the extent to which human beings could know or understand the world around them, and how humans ought to behave. Thus arose the numerous brands of philosophy that have concerned themselves with the nature of knowledge and thought, and with the ethics and morality of human behavior. Religious philosophers took for granted that the universe has a purpose, and that humans have a purpose within it, which they may aspire to or fall away from. Leucippus and Democritus were, by contrast, scientists, aiming to understand as dispassionately as possible what is out there. Since their time, science and philosophy have become separate and frequently combative disciplines.
Atomic theory, with its implicit atheism and determinism, lost the favor of philosophical thinkers for a long period. But it crops up from time to time, for example in the writings of Isaac Newton:
It seems probable to me that God in the beginning form'd matter in solid, massy, hard, impenetrable, movable particles, of such sizes and figures and with such other properties, and in such proportion to space, as most conduced to the end for which he form'd them.
Whether from personal belief or caution, Newton is careful to cede to God the responsibility of creating atoms in the first place. But how, if at all, is this statement an advance on anything that Democritus (through Epicurus and then Lucretius) had said two thousand years earlier? Newton lists the attributes that atoms must or might have, but then concludes, quite circularly, that the properties and behavior are such "as most conduce" to the effects they need to generate. What atoms do, in other words, is whatever they need do in order to produce the phenomena of the natural world. Neither Democritus nor Newton is able to say how, in any specific sense, atoms behave so as to generate physical effects. In the absence of any such elaboration, atomism was bound to remain an appealing but speculative picture rather than a truly scientific theory.
By contrast there were, from the earliest times, plausibly scientific criticisms of the atomic philosophy. One objection that arose in Democritus's time was later taken up with enthusiasm by Aristotle: how could atoms move constantly, without let-up, for all time? In Aristotelian mechanics, inferred from direct observation, moving objects came to a halt unless something intervened to keep them moving. You had to keep kicking a rock to keep it rolling. What, therefore, kept atoms moving?
Once Newton came along with his laws of motion, this argument lost much of its force. Newton directly contradicted Aristotle: objects keep moving, in straight lines, until something stops them. The kicked rock rumbles to a halt because the impacts it suffers sap its energy.
The other knock against atomism was that the atoms moved around in empty space, a void, and many philosophers had satisfied themselves that a void was impossible. Their reasoning, briefly, was that for anything to exist, it must have a name that referred to something rather than nothing, and since nothing by definition could not have such a name, it could not therefore exist. This argument, we would now say, is the result of a philosophical confusion between the name of a thing and the thing itself, but it took philosophers a long time to sort that one out. If indeed they have, even now.
Democritus answered these objections in essence by refusing to answer them. He simply asserted that atoms exist and that they move incessantly in the void. He didn't attempt to provide any proof of these statements, but regarded them instead as assumptions from which he and the other atomists sought to explain what they saw in the world about them.
This attitude is strikingly modern and scientific. As Democritus saw it, you have to start somewhere. You make an assumption and explore the consequences. This is exactly what scientists continue to do today, and the fact that a certain assumption leads to all kinds of highly successful predictions and explanations does not, strictly speaking, prove that the original assumption is correct. To jump abruptly to the present day, many theoretical physicists now believe that the elementary particles of the universe are creatures called superstrings -- literally, lines or loops that wiggle around in multidimensional space, creating, by wiggling in different ways, electrons and quarks and photons. (More recently still, these superstrings have been subsumed into more complicated multidimensional structures called branes.) Enthusiasts for superstring theory and its variants maintain that they have hit on a fundamentally simple explanation for everything in the physical world, although working out the observable consequences of that explanation is admittedly a complicated and perhaps inconclusive business. Critics point out that whether superstring theory can be tested satisfactorily depends crucially on whether working out the details can be done, even in principle. Neither side expects that anyone will ever see a superstring in its native form.
The modern debate over superstrings is philosophically not very different from the ancient debate over atoms. To Democritus it was self-evidently a step forward to be able to explain the wildly varying and seemingly unpredictable phenomena of the natural world in terms of unchanging and eternal atoms. But even this idea had detractors. Heraclitus -- famous for his observation that "you can't step into the same river twice, for fresh waters are always flowing in upon you" -- believed that change, not permanency, was the essential nature of the world.
How much credit should we give Democritus and the few atomists of his era for correctly anticipating what we now know to be true? The universe is either constant in its fundamental nature, or ever-changing; matter is either continuous and infinitely divisible, or else made of a finite number of indivisible parts. There seem to be no other possibilities. On both questions, Democritus happened to choose the right side.
Then again, the early atomists were far from right about everything. They believed that the soul was composed of especially subtle atoms. Lucretius had a theory that sweet and bitter tastes arise when the tongue encounters smooth or jagged atoms. With hindsight, we tend to dismiss these errors as the products of overenthusiasm and seize on the points where the atomists got it more or less right. As Bertrand Russell put it, "By good luck, the atomists hit on a hypothesis for which, more than two thousand years later, some evidence was found, but their belief, in their day, was nonetheless destitute of any solid foundation."
What's most important about Democritus is his insistence that explanation, if it is to have lasting value, must itself rest on permanent foundations -- a requirement that seems today almost a definition of what science must aim for. The Heraclitean idea that all is change and flux, on the other hand, seems to lead nowhere. Democritus, in his style of thinking, was more like a modern scientist than any other ancient philosopher. He argued that we ought to understand the universe first and worry about our place in it afterward, not adjust our view of the universe for the sake of our own peace of mind. He believed that the complexity of the world at large could, in principle, be explained by means of a simple underlying hypothesis. He believed it was not foolish to imagine the world was made of tiny components, even if those components were too tiny ever to be seen. These self-same principles, and the controversy they engendered, rose up once again almost two thousand years after Democritus, when the modern version of atomic theory began its ascent.
In that interim, atomic theory languished, never quite forgotten but not much amplified either. A taint of atheism hung over it, and natural philosophy in the post-Roman, prescientific world was powerfully religious, or at least mystical. Philosophers of the middle ages set as their most important undertaking the task of proving that God existed. The alchemists, meanwhile, tried vainly to find secret recipes that would transform, in mysterious ways, one substance into another, and in particular, base metals into gold. The towering but enigmatic Isaac Newton was in many ways both the first modern scientist and also, in Keynes's phrase, the last alchemist. When he wasn't propounding mechanical laws of motion or inventing the differential and integral calculus, Newton pored over the Bible and other ancient texts, trying out bizarre numerological schemes in the pursuit of arcane knowledge.
Nevertheless, modern science gradually emerged. The alchemists -- mystics and sorcerers -- changed almost imperceptibly, as pupil outgrew teacher, into chemists. Both were looking to unlock the nature of the physical world and the transformations within it, but where alchemists stumbled blindly, hoping to come across secret recipes, chemists slowly adopted a more purposeful strategy, hoping to control chemical transformations by first understanding the rules that governed them.
Atomic theory began to rise again. The rules that chemists learned imposed some restrictions that would have gravely disappointed their alchemical predecessors. Metals such as iron, copper, and gold were elemental quantities, they found, that could under no circumstances be forcibly converted one into another. Fire, on the other hand, which to alchemists had always been the supernatural agent of transformation, turned out to be just such a transformation in its own right: a chemical reaction.
Chemists grasped the idea of elements and of chemical reactions as combinations of elements changing partners according to strict rules, as in a country dance. Water, for example, was a compound of two parts of hydrogen to one of oxygen. From there it was not so big a leap to think of "atoms" of these gases combining, two of hydrogen with one of oxygen, to create an "atom" of water. (The modern distinction between atoms and molecules, which consist of several atoms bonded together, did not become fully clear until chemists had sorted out what were elements and what were compounds of those elements. In the meantime, scientists used the terms atom and molecule somewhat interchangeably.)
Still, the chemists didn't care very much (because they didn't need to) what the atoms looked like, how they behaved, how they congregated or dispersed. Whether they were tiny, hard things flying about in empty space or fat, squishy things packed closely together like oranges in a carton didn't matter much. And it wasn't at all clear whether the atoms of hydrogen and oxygen were genuine, indivisible entities, or whether the two-plus-one formula for combining them into water was simply a handy accounting method. As had been the case for Democritus and Lucretius, atoms seemed like a nice idea, at least to those disposed to think that way, but still there didn't yet seem to be anything necessary or compelling about them.
What seems surprising in retrospect, perhaps, is that it took so long for physicists to combine Newton's laws of motion, so well established as the foundation of physics in the 17th and 18th centuries, with the resurgent atomic hypothesis -- to think, in other words, of atoms as little objects moving, colliding, and bouncing off each other in accordance with standard Newtonian mechanics. This is what Daniel Bernoulli first tried, in 1738, with his argument deriving pressure from a consideration of atomic motion. But even after that, in 1763, Roger Boscovich wrote an exposition called Theoria Philosophiae Naturalis in which he offered an atomic theory that relied on essentially stationary atoms. Boscovich, a peripatetic philosopher-priest of Serbo-Croatian origins, argued that at very short range, atoms attracted each other: that was why a piece of cloth soaked up water. At somewhat longer range, however, atoms pushed each other away: that was why a gas exerted pressure.
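Bernoulli's pressure argument can be sketched in a few lines. This is a modern rendering with modern constants, not his 1738 derivation: a molecule bouncing between opposite walls of a box of side L delivers momentum 2mv per impact, once every 2L/v seconds, and averaging over N molecules moving in three dimensions gives P = N m <v²> / (3V) -- which, for speeds consistent with the temperature, reproduces the ideal gas law.

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K (an anachronism for 1738)

def kinetic_pressure(n_molecules, mass_kg, v_rms, volume_m3):
    """Bernoulli-style pressure from molecular impacts:
    P = N * m * <v^2> / (3 * V)."""
    return n_molecules * mass_kg * v_rms**2 / (3 * volume_m3)

# One mole of a nitrogen-like gas at 300 K in a 1 m^3 box:
N = 6.02214076e23
T = 300.0
m = 0.028 / N                        # ~28 g/mol, per molecule
v = math.sqrt(3 * K_B * T / m)       # speed consistent with temperature

p_kinetic = kinetic_pressure(N, m, v, 1.0)
p_ideal = N * K_B * T / 1.0          # ideal gas law, P = NkT/V

print(abs(p_kinetic - p_ideal) < 1e-6 * p_ideal)  # the two agree
```

The agreement is no accident: substituting <v²> = 3kT/m into Bernoulli's formula collapses it algebraically into P = NkT/V. That link between atomic motion and measurable pressure is precisely what Bernoulli established, a century and a half before the connection to temperature was fully understood.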
Boscovich's account, though it has some modern elements, also illustrates why atomic theory was not taken seriously by many scientists for such a long time. Rather than imagining atoms as having certain properties, and seeking to draw conclusions about their behavior, he instead gave the atoms whatever properties he needed in order to explain the phenomena he addressed. This put into practical terms Newton's suggestion that atoms must "conduce themselves" so as to produce the behavior we see. It is easy to criticize this thinking as wholly speculative and unscientific. First you imagine that atoms exist, and then you imagine that they have whatever properties they need in order to account for the phenomena you want to explain.
These philosophical considerations aside, the other great barrier against the acceptance of atomism, especially as it applied to gases, was ignorance of the true nature of heat. At the beginning of the 19th century, opinion was divided. Some scientists thought that heat was a mechanical property of some sort, related to energy and other Newtonian concepts, but others, perhaps a majority, subscribed to the notion that heat was a kind of vaporous fluid or tenuous substance that went by the name caloric. This caloric was supposed to be an entity in its own right, not something composed or built from other components, and it could somehow soak into or pervade material objects, bestowing on them the property we recognize as heat. When a warm object lost heat to a colder object in contact with it, that was because caloric dribbled out of one and seeped into the other.
An argument against the caloric theory came from the Massachusetts-born scientist and inventor Benjamin Thompson, who spied for Britain in the years preceding the Revolutionary War, fled to London in 1775, returned briefly to America while the war was still going on, and after the newly independent United States had won, returned to Britain as a refugee. The appreciation shown to him there fell short of his expectations, and through political connections he obtained an appointment to the royal court of Bavaria, where he served mainly as a military adviser but succeeded in making himself indispensable in a variety of ways. He laid out the English Gardens in Munich, concocted a recipe for soup (along with specific chewing and swallowing instructions) that was meant to keep soldiers well nourished, and designed a portable coffeemaker. For these and other services he was made, in 1792, Count Rumford of the Holy Roman Empire -- a name familiar to many American home renovators today in connection with the Rumford fireplace, an efficient hearth he designed in order to keep smokiness to a minimum.
Besides all this, Rumford also showed a genuine aptitude for scientific insight, and he made a number of useful observations concerning the nature of heat and energy. In his capacity as a military engineer in Bavaria he oversaw the boring out of cannons, and noticed that a dull bit would grind endlessly into a chunk of metal, achieving little except the generation of heat. He concluded that the amount of heat obtainable was essentially limitless, as long as the drill bit kept boring away. That was hard to understand if heat represented caloric being drawn out of the drilled metal; surely the original supply of caloric would run out after a while. Rumford saw instead that heat generation had something to do with the physical work of grinding the bit on the metal.
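The logic of Rumford's observation -- that the heat comes from the work of grinding, not from a finite store of caloric -- can be put in modern, quantitative terms. The figures below are invented for illustration and are not Rumford's measurements: if mechanical work converts entirely to heat, the temperature rise follows from Q = W together with Q = m·c·ΔT, and it grows without limit as long as the boring continues.

```python
# Illustrative numbers only (assumed, not Rumford's data):
power_w = 750.0          # rate of work delivered by the borer, watts
time_s = 2.5 * 3600      # two and a half hours of grinding, seconds
mass_kg = 50.0           # mass of brass being heated, kg
c_brass = 380.0          # specific heat of brass, J/(kg*K)

work_j = power_w * time_s                # total mechanical work, Q = W
delta_t = work_j / (mass_kg * c_brass)   # temperature rise, dT = Q/(m*c)

print(round(delta_t, 1))                 # kelvin; enough to boil water
```

On a caloric picture the metal's finite stock of heat-fluid should eventually run dry; on this picture the heat keeps coming as long as work is done, which is exactly what Rumford saw at the cannon works.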
The caloric theory of heat lingered on into the first decades of the 19th century, despite observations such as Rumford's and despite the fact that no one could really say what sort of a substance caloric was supposed to be. In that respect, however, atoms -- invisible particles with unknown properties -- had no firmer standing. But physicists were at least familiar with gases and fluids in a general way, and if caloric was a peculiar kind of fluid, that was because heat was a peculiar kind of quantity. Atoms, on the other hand, were a complete unknown, and to explain something familiar yet enigmatic, such as heat, in terms of tiny, hard masses must have struck scientists of the early 19th century as too great a leap of imagination for them to follow.
Accustomed as we are nowadays to the idea of explaining all manner of observable or detectable phenomena in terms of remote, invisible entities -- quarks and photons, electromagnetic fields, curved space, and the like -- scientists of two hundred years ago were still essentially rooted in what they could see and measure directly. Heat could be detected at the fingertips; it was an undoubted physical phenomenon. The pressure of a gas could likewise be felt in the tautness of an inflated balloon or the powerful stroke of a piston in a steam engine. What did it mean to explain such direct and unarguable perceptions in terms of the undetectable actions of invisible objects? Certainly, one could imagine tiny atoms bashing against a piston head, and mustering enough collective force to push it out, but what was the advantage in imagining such a thing? As an explanation, this seemed to be going the wrong way, portraying something immediate and tangible in terms of "atoms" that were forever concealed from the human eye. Although Bernoulli, then Herapath, then Waterston had made more precise and specific the atomic picture devised by Democritus and recounted by Lucretius, they had not yet improved greatly on the nature of the argument: a critic could still argue, with considerable reason, that this was a nice picture, but hardly a scientific theory. It might explain one thing in terms of another, but not yet in a sufficiently broad way to make physics overall any simpler.
As late as 1845, when John Waterston submitted his ill-fated manuscript to the Royal Society, the explanation of heat as atoms in motion -- it was known as the kinetic theory of heat -- was to some extent a hypothesis in search of a problem. If you were inclined to believe in atoms in the first place, kinetic theory seemed like a pleasing extension of a broad and fundamental picture of nature. But if you were disinclined to atomism, the kinetic explanation didn't seem to say anything you didn't already know.
And yet, in just 12 years, kinetic theory went from outlandish idea to respectable proposal. It was not so much that the theory was suddenly improved, or found able to explain wholly new matters, but rather that a handful of influential people began to take it seriously. In 1857, the German physicist Rudolf Clausius, already well known for his work on the relationship of heat and mechanical energy, published an influential work titled The Kind of Motion We Call Heat. Clausius was 35 years old at the time, and his reputation was solid but not yet remarkable. He had been working for some years in opposition to the caloric theory of heat, trying to prove instead that heat was, as Rumford's observation had indicated, intimately linked to mechanical work and energy. He said, in 1857, what Bernoulli and Herapath had hinted at, and Waterston had propounded in some detail. If a volume of gas consists of tiny atoms in relentless motion, then both the pressure it exerts and its temperature are related in a simple way to the square of the average velocity of the atoms. Temperature, in fact, is nothing other than the average kinetic energy of these presumed atoms.
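[In modern notation, which does not appear in the excerpt itself, Clausius's claim can be restated as follows. The symbols below (N, m, V, k) are today's conventions, with k Boltzmann's constant; this is a reader's gloss, not the book's own formulation:]

```latex
% Modern restatement of Clausius's 1857 claim (a gloss, not from the text):
% N atoms of mass m in a volume V, with mean squared speed <v^2>,
% exert a pressure proportional to that mean square,
p \;=\; \frac{1}{3}\,\frac{N m}{V}\,\langle v^{2}\rangle ,
% and temperature measures the average kinetic energy per atom:
\frac{3}{2}\,k T \;=\; \left\langle \tfrac{1}{2}\, m v^{2} \right\rangle .
```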
It is hard, in retrospect, to see why Clausius's work was taken so much more seriously in 1857 than Waterston's had been a dozen years earlier, except that Clausius was an established professor of physics at the polytechnic institute in Zurich, while Waterston was an instructor at a naval college in Bombay. In both the German-speaking and English-speaking worlds, several notable physicists had become convinced that heat was ultimately mechanical in nature, and in fact controversies erupted from time to time over whether Clausius or his British rivals had propounded certain ideas first. It was true that between 1845 and 1857 quantitative laws had been enunciated (by Clausius, among others), making the connection between heat and energy more precise, and it may be that the passage of 12 years was just enough to carry the kinetic explanation of heat across the threshold of credibility, from speculative suggestion to scientific proposal.
Clausius, at any rate, is the man whose work brought the atomic theory of heat into the scientific world. With his imprimatur, kinetic theory was taken more seriously and attracted new adherents. Younger scientists saw it as an enticing idea and worked to improve it. In 1860, James Clerk Maxwell published in England an elaboration of Clausius's theory, taking into account not simply the average speed of the atoms, but their distribution of speeds as well -- that is, how many, at any given time, are moving at speeds greater or smaller than the average. He derived, on somewhat abstract and not entirely persuasive grounds, a mathematical form for this distribution, in effect a graph of the typical speeds of atoms in a volume of gas at any given temperature.
Maxwell was at that time only 28 years old, but signs of his brilliance were already apparent. The next step came from a still younger man, and one whose name was then known to hardly anyone. In 1868, the 24-year-old Ludwig Boltzmann, newly graduated from the University of Vienna, published a more convincing physical explanation for the formula Maxwell had derived. By analyzing what would happen to a volume of gas rising in Earth's gravitational field -- a case for which the change in pressure with altitude was already well understood -- Boltzmann showed that Maxwell's formula correctly predicted how the number of atoms or molecules with a particular energy would correspondingly change. This was a remarkable stroke of insight for a young man who had only recently finished his studies.
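[Again in modern notation, and again as a reader's gloss rather than anything printed in the excerpt, the formula Maxwell derived and the barometric relation Boltzmann used to re-justify it can be sketched as:]

```latex
% Maxwell's 1860 speed distribution for a gas of atoms of mass m at
% temperature T (modern form; a gloss, not from the text):
f(v)\,dv \;=\; 4\pi \left(\frac{m}{2\pi k T}\right)^{3/2}
v^{2}\, e^{-m v^{2}/2kT}\, dv .
% Boltzmann's 1868 check: in a gravitational field the number density
% of molecules falls off with height h as the familiar barometric law,
n(h) \;=\; n(0)\, e^{-m g h/kT},
% i.e. the population at potential energy E = mgh carries the same
% exponential factor e^{-E/kT} that appears in Maxwell's formula.
```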
Boltzmann's argument gave a direct and easily grasped physical justification for Maxwell's formula, and showed moreover that there was real physics in the new kinetic theory. Maxwell himself was impressed, and wrote to one of the senior Viennese physicists expressing his admiration for Boltzmann's work. The formula that these two young physicists proposed became known as the Maxwell-Boltzmann distribution for the velocities of atoms in a gas, and it remains the cornerstone of the atomic depiction of gases. Clausius, the somewhat older man, set kinetic theory on the path to respectability, and Maxwell made fundamental contributions when he was not occupied with other theoretical endeavors. But it was Boltzmann who made the full development of kinetic theory his life's cause and who took its missteps as much as its successes on his shoulders. During the second half of the 19th century, the peaks and depths of Boltzmann's difficult life mirrored exactly the stumbling ascendancy and frequent reversals of kinetic theory itself.
In due course, Daniel Bernoulli's forgotten century-old proposal was rediscovered, and his insight was recognized by Boltzmann and everyone else. Herapath, after his early forays into kinetic theory, had published some letters in the London Times attacking Davy, but he then focused his energies in other directions. He went briefly and unsuccessfully into teaching, became interested in the blossoming railway industry, and in the end became a well-known writer and commentator on that business. He remained an amateur scientist and published a few small works in Railway Magazine and Annals of Science, of which he was conveniently the editor. Maxwell, much later, acknowledged that Herapath had had the right general idea but observed that his calculations of atomic collisions were incorrect.
Waterston's story has no happy ending. His rejected paper of 1845 contained the essence of what Clausius shortly afterward became famous for, and contained hints too of many other ideas that were not fully developed until some time later. A brief abstract of his paper appeared in the Royal Society's Proceedings in 1846, and another short notice was published in 1851, but these accounts were so brief, and gave so little indication of the conclusions Waterston had reached, that they went unremarked. His original manuscript was not returned to him but languished for decades in the files of the Royal Society. After he had returned from India to Scotland he published scientific papers here and there without ever becoming much known to his contemporaries. In 1878, he resigned from the Royal Astronomical Society after two of his papers were rejected for publication. Thereafter, according to a nephew, any mention in Waterston's presence of the scientific societies "brought out considerable abuse without any definite reasons assigned." He died in 1883, at the age of 72.
Waterston's achievement finally came to light in 1891, when the English physicist Lord Rayleigh, then secretary of the Royal Society, discovered the lost manuscript in the course of tracking down some old citations. By that time the kinetic theory amounted to a sophisticated and well-known body of knowledge, and Rayleigh immediately perceived the true merit of Waterston's ideas. He arranged for its belated publication as the first item in the first issue of the Philosophical Transactions for 1892, along with a brief commentary on its tortured history.
Acknowledging that Waterston had submitted his work at a time when scientists thought very differently than they were accustomed to doing just a few decades later, Rayleigh nevertheless admitted that he was surprised the Royal Society's expert reviewers were so dismissive of the paper. "The omission to publish it at the time was a misfortune, which probably retarded the subject by ten or fifteen years," he wrote. Rayleigh suggested that Waterston might have done better to mention that he was working to elaborate ideas previously suggested by Daniel Bernoulli, whose reputation was unarguable; that might have made a reviewer hesitate. But Bernoulli's work had itself been forgotten, and it is the strength of Waterston's claim to unjust treatment that he indeed came up with his reasoning entirely by himself. On that score Rayleigh had another observation: "Perhaps...a young author who believes himself capable of great things would usually do well to secure the favourable recognition of the scientific world by work whose scope is limited, and whose value is easily judged, before embarking on greater flights." The reliable route to scientific fame, in other words, requires brilliance judiciously combined with careerism. Just as well, perhaps, that the already bitter Waterston didn't live to see this endorsement of his unrewarded endeavors.
Copyright © 2001 by David Lindley
Product details
- Publisher : Free Press; First Edition (January 18, 2001)
- Language : English
- Hardcover : 272 pages
- ISBN-10 : 0684851865
- ISBN-13 : 978-0684851860
- Item Weight : 1.1 pounds
- Dimensions : 9 x 1 x 6 inches
Customer reviews
Top reviews from the United States
David Lindley is a fine writer, and his descriptions of Boltzmann's life and work are clear and easy to follow. My only problem with the book is with the digressions. Lindley also wrote the superb biography of Lord Kelvin published three years after this book. In the Kelvin biography he also digresses and talks for several pages about people that Kelvin worked with. I found those discussions interesting, relevant, and well-written. But I did not feel that way about some of the digressions in this book, where the discussions of other historical figures are at times less directly relevant. For example, he spends much of the second chapter talking about the atomic theories of Lucretius and Democritus. The idea is to give a thumbnail sketch of the history of atomic theory, but to this reader it was much more than I wanted to know in a book about the life of Boltzmann. His dozen-page discussion of the background, life, and work of the American Josiah Gibbs is much more directly relevant to Boltzmann's work but seemed to include more than was necessary. Other digressions, like the ones on Helmholtz and Maxwell, I thought were better tied to the overall context. But this is really a minor problem. Perhaps it is because his book on Kelvin is so consistently strong that my disappointment rises when I compare Lindley's work here against his own later book. In any case this work stands by itself as a worthwhile book on both Boltzmann and the state of physics in the late 1800s.
Ludwig Boltzmann is not a household word, even among those with some background in science. But his place in the history of physics is critical in the development of the modern scientific worldview. Lindley’s book gives Boltzmann his due and fleshes out the life of a brilliant but often tortured person. I recommend the book.
One downside is the lack of more in-depth science. Only one equation is written (S = k ln W). It would be nice to see more of the physics being developed...possibly an idea for a new textbook...
All in all, very fun. I would love to read more history of physics books that are written similarly.