The Believing Brain: From Ghosts and Gods to Politics and Conspiracies---How We Construct Beliefs and Reinforce Them as Truths Hardcover – May 24, 2011
Michael Shermer (Author)
| Format | Price | New from | Used from |
| Audible Audiobook, Unabridged | $0.00 (free with your Audible trial) | | |
| Paperback, International Edition | | $10.61 | $7.70 |
Bestselling author Michael Shermer's comprehensive and provocative theory on how beliefs are born, formed, reinforced, challenged, changed, and extinguished.
In this work synthesizing thirty years of research, psychologist, historian of science, and the world's best-known skeptic Michael Shermer upends the traditional thinking about how humans form beliefs about the world. Simply put, beliefs come first and explanations for beliefs follow. The brain, Shermer argues, is a belief engine. From sensory data flowing in through the senses, the brain naturally begins to look for and find patterns, and then infuses those patterns with meaning. Our brains connect the dots of our world into meaningful patterns that explain why things happen, and these patterns become beliefs. Once beliefs are formed the brain begins to look for and find confirmatory evidence in support of those beliefs, which accelerates the process of reinforcing them, and round and round the process goes in a positive-feedback loop of belief confirmation. Shermer outlines the numerous cognitive tools our brains engage to reinforce our beliefs as truths.
Interlaced with his theory of belief, Shermer provides countless real-world examples of how this process operates, from politics, economics, and religion to conspiracy theories, the supernatural, and the paranormal. Ultimately, he demonstrates why science is the best tool ever devised to determine whether or not a belief matches reality.
- Print length: 400 pages
- Language: English
- Publisher: Times Books
- Publication date: May 24, 2011
- Dimensions: 6.48 x 1.44 x 9.36 inches
- ISBN-10: 0805091254
- ISBN-13: 978-0805091250
Editorial Reviews
Review
“Michael Shermer has long been one of our most committed champions of scientific thinking in the face of popular delusion. In The Believing Brain, he has written a wonderfully lucid, accessible, and wide-ranging account of the boundary between justified and unjustified belief. We have all fallen more deeply in his debt.” ―Sam Harris, author of the New York Times bestsellers The Moral Landscape, Letter to a Christian Nation, and The End of Faith.
“The physicist Richard Feynman once said that the easiest person to fool is yourself, and as a result he argued that as a scientist one has to be especially careful to try and find out not only what is right about one's theories, but what might also be wrong with them. If we all followed this maxim of skepticism in everyday life, the world would probably be a better place. But we don't. In this book Michael Shermer lucidly describes why and how we are hard wired to 'want to believe'. With a narrative that gently flows from the personal to the profound, Shermer shares what he has learned after spending a lifetime pondering the relationship between beliefs and reality, and how to be prepared to tell the difference between the two.” ―Lawrence M. Krauss, Foundation Professor and Director of the Origins Project at Arizona State University and author of The Physics of Star Trek, Quantum Man and A Universe from Nothing
“Michael Shermer has long been one of the world's deepest thinkers when it comes to explaining where our beliefs come from, and he brings it all together in this important, engaging, and ambitious book. Shermer knows all the science, he tells great stories, he is funny, and he is fearless, delving into hot-button topics like 9-11 Truthers, life after death, capitalism, Barack Obama, Sarah Palin, and the existence of God. This is an entertaining and thoughtful exploration of the beliefs that shape our lives.” ―Paul Bloom, author of How Pleasure Works
“The Believing Brain is a tour de force integrating neuroscience and the social sciences to explain how irrational beliefs are formed and reinforced, while leaving us confident our ideas are valid. This is a must read for everyone who wonders why religious and political beliefs are so rigid and polarized--or why the other side is always wrong, but somehow doesn't see it.” ―Dr. Leonard Mlodinow, physicist and author of The Drunkard's Walk and The Grand Design (with Stephen Hawking)
“We might think that we learn how the world works, because we take the time to observe and understand it. Shermer says that's just not so. We just believe things, and then make our world fit our perceptions. Believe me; you don't have to take my word for it. Just try clearing some space in your own Believing Brain.” ―Bill Nye, the Science Guy ©, Executive Director of The Planetary Society
“The Believing Brain is a fascinating account of the origins of all manner of beliefs, replete with cutting edge evidence from the best scientific research, packed with nuggets of truths and then for good measure, studded with real world examples to deliver to the reader, a very personable, engaging and ultimately, convincing set of explanations for why we believe.” ―Professor Bruce Hood, Chair of Developmental Psychology, Bristol University and author of Supersense: Why We Believe in the Unbelievable
About the Author
Michael Shermer is the author of The Believing Brain, Why People Believe Weird Things, The Science of Good and Evil, The Mind Of The Market, Why Darwin Matters, Science Friction, How We Believe and other books on the evolution of human beliefs and behavior. He is the founding publisher of Skeptic magazine, the editor of Skeptic.com, a monthly columnist for Scientific American, and an adjunct professor at Claremont Graduate University. He lives in Southern California.
Product details
- Publisher : Times Books; Later Printing edition (May 24, 2011)
- Language : English
- Hardcover : 400 pages
- ISBN-10 : 0805091254
- ISBN-13 : 978-0805091250
- Item Weight : 1.42 pounds
- Dimensions : 6.48 x 1.44 x 9.36 inches
- Best Sellers Rank: #564,328 in Books (See Top 100 in Books)
- #972 in Medical Cognitive Psychology
- #1,829 in Cognitive Psychology (Books)
- #3,293 in Medical General Psychology
About the author

Dr. Michael Shermer is the Founding Publisher of Skeptic magazine, the host of the Science Salon Podcast, and a Presidential Fellow at Chapman University where he teaches Skepticism 101. For 18 years he was a monthly columnist for Scientific American. He is the author of New York Times bestsellers Why People Believe Weird Things and The Believing Brain, Why Darwin Matters, The Science of Good and Evil, The Moral Arc, and Heavens on Earth. His new book is Giving the Devil His Due: Reflections of a Scientific Humanist.
Customer reviews
Top reviews
Top reviews from the United States
Our species developed this pattern-detection software as a means of survival: recognizing patterns in the jungle alerted early humans to things like predators, prey, and weather changes. Recognizing the herding patterns of buffalo or the seasonal patterns of rain was paramount to surviving in a dangerous world. “Because we must make associations in order to survive and reproduce, natural selection favored all association-making strategies,” Shermer writes, “even those that resulted in false positives.” This is important, because while failing to recognize a rustle in the leaves as a poisonous snake can get us bitten and killed, recognizing a rustle in the leaves as a potential danger when there is no actual danger causes no harm. Our pattern-recognition software is always running, regardless of whether the patterns detected are helpful or erroneous.
In his book, Shermer defines two new vocabulary words. The first is patternicity, which he defines as “the tendency to find meaningful patterns in both meaningful and meaningless data.” Our rational brains have become exceedingly good at finding patterns everywhere, even when no real pattern actually exists, and our modern world gives ample opportunity for this pattern-detection system to operate. The second word that Shermer brings to his readers is agenticity, which he defines as “the tendency to infuse patterns with meaning, intention, and agency.” Again, this comes from our long evolution as a species. It is important when the sky changes color—incoming storm clouds mean something. The problem we have today is that we place too much importance on our false-positive beliefs. When the leaves are rustling and we believe there is a snake, we might take the longer path home, when in reality there is no snake and we are only making our own lives more difficult. “With this evolutionary perspective we can now understand that people believe weird things because of our evolved need to believe non weird things.”
Sadly, our human brains have only one neural network in charge of our beliefs, rather than two separate areas for belief and disbelief. This matters, because if you believe that ‘torture is wrong’ and also that ‘2 plus 2 makes 4,’ these beliefs occur in the same place in the brain. This can cross our wires between ethics and science, because as far as the brain is concerned, both of these beliefs are alike. Couple this phenomenon with the philosopher Baruch Spinoza’s conjecture—that beliefs come quickly and naturally while skepticism is slow and unnatural—and we have a recipe for trouble. Our most strongly held beliefs are oftentimes jumps to conclusions. “The scientific principle that a claim is untrue unless proven otherwise runs counter to our natural tendency to accept as true that which we comprehend quickly.” Science is the best tool we have devised for determining the validity of claims of belief, but it is one that we do not use widely enough. The only way we know to approach real truth is the scientific method: testing a hypothesis and examining the results. It doesn’t matter how strongly held your belief is: if it disagrees with experiment and replicable outcomes, then it is lacking. Still, if you can come to a complete scientific agreement about a phenomenon (the sun being the center of the solar system, for example), the same part of your brain that believes this to be true might also believe that we should jail people for wearing open-toed shoes in public. One human can believe both of these ideas, because they are reinforced by the same neural network.
This brings us to the ultimate thesis of the book: human beings form beliefs first and rationally back them up second. We are the opposite of Spock from Star Trek, a character famous for his cold, hard logic. We react to situations emotionally. We use our pattern-detection software to understand the world and to make guesses about how the future will play out, and then we rationalize these thoughts post hoc. Obviously, this is not a great system for optimizing our potential. It also explains why so many people believe such wildly different (and often completely ridiculous) things.
This further complicates society and our social interactions with one another because “our perceptions about reality are dependent on the beliefs that we hold about it.” Reality exists independent of human minds, but our understanding of it depends upon the beliefs we hold at any given time. If the wind is rustling the leaves a lot, we might believe the area is full of snakes, which could cause us to move our camp to another territory. The reality may be that our first spot was safer than our new one. If we believe that gun ownership is the reason why so many mass shootings occur, that will lead us to take different actions to curb the violence than if we believe that mental illness is the culprit. Each individual’s subjective (emotional and psychological) reasons for believing what causes mass shootings are influenced by their family, friends, and culture, and our pattern-finding brains reinforce these beliefs and give them meaning. This is helpful when our patternicity and agenticity align with empirical reality (when there really is a snake in the grass), and harmful when they do not. It is also what makes democracy so difficult—everybody has their own solution to the problems we face, and we all believe ourselves to be right and everyone else to be wrong. Coming to a collective agreement can be exceedingly difficult.
So, how do we change our beliefs? The first step is understanding how we come to them and that they are often faulty from the start. We have to recognize the role that our rationality plays in sustaining them, especially in the face of contradictory evidence. Oftentimes this is not enough, with true change only coming from deeper social and cultural shifts in the underlying zeitgeist. These changes are a product of “larger and harder-to-define political, economic, religious, and societal changes.” Nonetheless, a little more skepticism is a healthy thing—especially skepticism of ourselves and our own beliefs. It is important to occasionally hold your beliefs up in the mirror and ask yourself: are these actually, empirically, true?
By Cody Allen on April 11, 2022
Shermer spends quite a bit of time talking about the underlying mechanisms of the brain, which is very helpful for people who haven't kept up with neurophysiology, but I can't help feeling that unless the reader accepts the central premise, these sections don't add to the persuasiveness of his argument. Far more revealing - yet somewhat hastily covered - was the discovery that believing requires very little neural effort while critically evaluating information requires significantly more. In this tiny nugget lies perhaps the critical reason why so many people believe in gods and goblins and ghouls and ghosts and all the rest of the magic-mind realm: it's much easier to believe than to think something through. That's why people who grow up in India believe in Shiva and Kali and Krishna and Ram, why people who grow up in Tennessee believe in Jesus, and why people who grow up in Arabia believe in Muhammad: it's simply the easiest mental option given the environment.
Also, Shermer doesn't really spend time showing how our brains are wired up for certain types of thought but not others. This lack of 360-degree thinking (because there were no environmental pressures to cause such thinking to be positively selected for over the eons) means we perpetually make simple logical blunders. For example, the statement "All birds have wings; crows have wings; therefore crows are birds" is generally accepted at first glance by most people. Yet it's a complete mistake (to see why, just substitute the word "bees" or "bats" in place of "crows" in the sentence). We make this type of mistake continually when we interact with the world around us. Coupled with what Shermer calls "agenticity" (the inference of some active agent behind a phenomenon, which results from the fact that we're social animals and need to be attuned to what the other members of the group are feeling and thinking), we then make elementary mistakes such as thinking that thunder is the outward sign of a deity's displeasure, or that rain is somehow signalling sadness. The powerful nature of such errors is still with us: just think of a horror movie in which thunder signals danger, or a romance movie in which rain invariably accompanies the post-breakup solitary walk.
Like many well-meaning people, Shermer tries to accommodate religion, arguing that it's benign if it's something that helps people make it through life. Unfortunately religion (which is, formally, the organization of superstitions into a codex) may help some people "make it through life" but it's generally at the expense of others. Shermer signally fails to understand that religions are necessarily regressive because their core premise is that their holy book/rock scratchings/oral tales are supposedly "from the mouth of god(s)" and therefore infallible. Consequently the notions of discovery and progress are antithetical to religion. This is why, for example, the Catholic church systematically suppressed scientific advances by burning Giordano Bruno and threatening Galileo with instruments of torture. It's why contemporary Islam has more in common with stone-age societies than with any modern civilization. Oil-rich kingdoms may buy the products of advanced civilizations but they are incapable of creating anything for themselves because they are trapped in a pre-scientific mentality that is utterly unconducive to any kind of real-world accomplishments. Seen more clearly, religion is not merely a harmless psychological crutch: it's a profoundly divisive, restrictive, and discouragingly infantile cognitive error. It is an impediment to true civilization and to civilized behavior. Dawkins is far more accurate in his summation of religion than Shermer, who perhaps has insufficient exposure to the real impact of the religious impulse across history and around the globe today. Shermer also makes the mistake of thinking that perhaps religion has adaptive benefits in terms of group cohesion, but there are two major flaws with this notion. The first is that group selection is impossible; the second is that for every "happy congregation" there's a Jonestown, a Waco, and of course the ever-popular Taliban.
The other things missing from Shermer's informative and entertaining book are (i) a discussion of how the human brain isn't wired up to perform consistency checking (e.g. if I believe A and I believe B, are they mutually consistent or mutually contradictory?) and (ii) how all superstitions and religions fail utterly to tell us anything meaningful about the real world. To elaborate: while all religions have their creation myths and their mummy gods and daddy gods, not a single one has ever revealed any underlying truth about the universe. No religion ever imparted knowledge of DNA or cosmology or chemistry or physics or anything else. They are all impoverished products of banal minds, repetitive in the extreme and entirely predictable. Once you've seen a representative sample of a few, you've seen them all. Only the clothes, rituals, and degree of violence differ slightly. If there really were some kind of magical creature imparting wisdom to our species (which is quite funny to think about - imagine a person trying to teach a worm morality or quantum physics), then why, over all the millennia and over all the many different kinds of religion, has there never been a single example of a valid revelation? Oh, I forgot: "god moves in mysterious ways."
One final small criticism is that Shermer is very USA-centric. His examples of political and moral "positions" only make sense in a very limited American context. He automatically assumes that what exists in the USA is generally found elsewhere, which is wildly untrue. His next book might benefit from time spent in equatorial Africa, the Scandinavian countries, the Amazon, and somewhere like Vietnam or Laos. And his penultimate chapter (on the origins of the universe) misses the key point, which is simply this: religious people say "how did the universe get here if it wasn't made by god?" and then sit back as if this were some sort of actual logical argument. In reality, of course, it's totally empty, because if you say, "OK, god made the universe," then the very next question is "so where did god come from?" To which religious people say something like "god always existed." Although this position demonstrates an infantile inability to reason at even the most basic level, the conclusion is evident: invoking "god" solves no problem whatsoever. It's merely a pointless regression, like the old question of what the turtle that supports the Earth is itself standing on. In his desire not to offend religious people, Shermer simply omits the conclusion altogether, preferring to meander off into multiverse theory, which doesn't actually accomplish much more than another regression anyway.
Anyhow, aside from these minor criticisms, for anyone looking to understand why the vast majority of people believe in things for which there has never been the slightest shred of evidence, Shermer's book is an excellent primer and a useful complement to the works of Richard Dawkins, who takes it as a given that only an idiot would be a believer. Shermer shows, valuably, why intelligence and belief are two different things (back to the inability to do consistency checking) and therefore how American scientists, for example, can believe in magical creatures for which there is zero evidence yet at the same time continue to contribute to solving genuine real-world problems.
Top reviews from other countries
…and the way in which mentalisation goes about constructing strong frameworks of belief. Belief is back in vogue - somebody has recently written a book telling us that beliefs can be changed in ten minutes - although he too falls into many of the traps outlined above. It is easy, when you see yourself as an expert on something, to fall into the trap of systemising in an over-determining way and producing over-reaching conclusions. Belief is an untotalisable multiplicity that does not sit well in a box constructed by an author who seeks refuge in the safety of finite reasoning.
This book holds that the brain is a belief engine. From sensory data flowing in through the senses, the brain naturally begins to look for and find patterns and then infuses those patterns with meaning, intention and agency. Once beliefs are formed, the brain begins to look for and find confirmatory evidence in support of those beliefs, which adds an emotional boost of further confidence in the beliefs. How is it that people come to believe something that apparently defies reason? The answer is that beliefs come first; reasons for belief follow, in confirmation of what Shermer calls belief-dependent realism.
The vast scholarship that Michael Shermer brings to bear on the subject is impressive.
He describes the neurological process. For example, of the chemical transmitter substances sloshing around in your brain, dopamine may be the most directly related to the neural correlates of belief. Dopamine is the reward system of the brain. It is critical in associative learning. Any behaviour that is reinforced tends to be repeated.
Religion figures large. 84% of the world's population belongs to one of its 10,000 distinct religions. America is the most religious tribe of the species: in the US, 82% of people believe in God, and more people believe in angels and demons than believe in the theory of evolution. He looks at the overwhelming evidence that belief in God is hardwired into our brains, and at the questions of what God is, whether God actually exists, and Einstein's God.
But we are all susceptible. Beliefs in conspiracies, moral judgements and political convictions are universal. The natural tendency of anyone with a political belief to search for and find evidence to support their case applies to us all. People divide themselves into liberals or conservatives (Democrats or Republicans) and then read, watch and listen to confirmatory evidence.
Shermer’s solution is skepticism – a scientific approach to the evaluation of claims. Where philosophy and theology depend upon logic, reason and thought experiments, science employs empirical evidence and observational experiments. It is the only hope we have of avoiding the trap of belief-dependent realism.
So my visits to the pub every month are justified!
I also felt the cosmology section could have been briefer; interesting and insightful as it was, it was a major digression from the neuroscientific and psychological basis of this book.
I found it very comprehensive, having some basis in neuroscience, but I think information is presented in a way that is understandable and entertaining. This is pretty in-depth for an overview but if you have an interest in the mechanics of the brain and the evolution of human behaviours around religion and politics you will find this enlightening.
I was also interested in what Mr Shermer believed as I'd always assumed (from reading his articles in Scientific American) that he and I were of the same mind (or should I say "brain", since he seems to think that "mind" is an illusion, though it's not clear who the illusion is supposed to be acting on), which turned out not to be the case. I was quite taken aback when he dismissed what is (for me) the key question of reality, "Why is there something rather than nothing?", as "nonsensical", going on to say "Asking why there is something rather than nothing presumes 'nothing' is the natural state of things out of which 'something' needs an explanation. Maybe 'something' is the natural state of things and 'nothing' would be the mystery to be solved". I refer the author to the null hypothesis he eloquently argues for in his epilogue.
He also claims elsewhere that "mind" is just "brain", as though the question "when I feel pain, what is this 'I' that is feeling pain" is explained by such an assertion.
I'd quite like to see Mr Shermer write a book on the topic "what skeptics believe", as I suspect many people who call themselves skeptics, and are not conspiracy-theorist nutters, actually have fundamental differences in the axioms on which they base their world views (assuming Mr Shermer himself is a typical skeptic).