



Showing 1-10 of 224 reviews (4 star, Verified Purchases). See all 1,107 reviews.
on May 26, 2017
The author, Nate Silver, does a great job of weaving the more technical statistical concepts lightly into the early chapters, so as not to lose readers early on. However, I thought this would lead to a more detailed technical discussion later on, which the author said it would, but it never really transpired. Instead he kept to analogies and to keeping the science of prediction in context. There's really nothing wrong with that if that's what you're looking for ... it's just not exactly what I wanted or expected.

Nonetheless it's a great book, and Silver bears the hallmark of someone who is intellectually curious and genuinely interested in making his analytical tools better, rather than attaching his ego to the outcome. As part of that, he's refreshingly candid in his opinions of others. The book is well researched and covers a lot of ground, including sports, weather, financial meltdowns, chess, and more. The best section for me was the one on chess, where he displays both his storytelling skills (the retelling of chess master Kasparov's loss to IBM was both compelling and insightful) and the more in-depth technical discussion that chess lends itself to. The book seemed to run out of steam toward the end, with some chapters going on longer than I thought necessary, particularly poker and efficient markets.

He shares some of my core beliefs: that statistics and data are not enough, and that if you really want to understand something and make good forecasts you need to understand its underlying structure; and that the proper relationship between man and machine is symbiotic, rather than one taking over the other. Those, along with the importance of thinking probabilistically, are the core takeaways.
3 people found this helpful.
on January 12, 2017
Nate Silver is best known for using polling data to call political elections. He missed on the Trump win, but was pretty good up until then.

The Signal and the Noise is a well written, well researched and well reasoned book about forecasting and the various mistakes that prognosticators make. He addresses failures such as the inability of economists and others to foresee the bursting of the housing bubble and the chaos it created in 2008. Other themes include easier-to-predict subjects such as the future performance of major league baseball players and the success (or not) of poker players. In these latter two he has real-world experience, as he developed software to predict baseball player performance and made a living as a professional poker player.

Other forecasting areas that he writes about include weather (a modern success); earthquakes (not so much due to difficulties in differentiating the signal from the noise); the spread of infectious diseases (difficult to model due to human behaviour); and climate change (right on warming but uncertain about effects).

One of the overall themes involves Bayes' Theorem. This requires an a priori estimate of the chances of an event, which is then refined by subsequent observation and experimentation.

There were sections I liked more than others, but this may correlate more with my affinity for the subjects than with Silver's reporting. I particularly liked the section on climate change research. It was thoughtful and open-minded. As he does throughout the book, he looks at the facts and the stats and interviews the people involved in the research.
One person found this helpful.
on June 30, 2013
If you are hoping for some simple guidelines that will help you distinguish actionable information (signal) from random information (noise), as I was, I expect that you'll be disappointed. I came away from the book with the belief that the author was honest. He's giving you his best analysis and his best approaches to making predictions. Distinguishing signal from noise requires, in the author's words, "both scientific knowledge and self-knowledge...." One has to know one's own biases before one can honestly collect, review and analyze any body of information to make a prediction. He argues in favor of Bayes' Theorem as the preferred statistical approach to make predictions. Do not worry if, like me, you are not a statistician or find formulas challenging. The author has a light touch in this area and explains that he prefers Bayes' Theorem because it explicitly requires one to express one's initial belief (bias perhaps) in forming a prediction.

Mr. Silver does not view predictions as "one and done." The process of prediction may, and frequently does, require the modification of predictions as more, or new, information is gathered. He expresses this as the process of making predictions "less and less wrong." In some areas that he considers, such as earthquake prediction, the progress toward being less and less wrong has ranged from highly limited to virtually non-existent. In other areas, success has been more measurable. The author got his start analyzing baseball statistics and had success with some of his statistical findings. The book Moneyball by Michael Lewis describes some of these successes. He next applied his statistical skills to political predictions and enjoyed success there as well. But in both of these areas there was a considerable body of statistical information and a history of third-party predictions that could be analyzed. Even in these areas the author honestly recognizes that the ability to predict has limitations.

Mr. Silver considers chess and looks at the success the supercomputer Deep Blue had against Garry Kasparov. He found one surprising observation about the first win that Deep Blue had against Mr. Kasparov. He considers some of the limitations of the computer's analytical capacity.

One section describes Mr. Silver's experiences in playing poker and the predictive approaches necessary to succeed in this endeavor. I wasn't sure if this section was all that helpful, but it did further his thesis that one must recognize one's own limitations (and have a firm grasp on the rules of the game) to have any chance of a useful prediction.

Good prediction may require imagination and require one to set aside current world views to look at matters anew. This may be especially helpful in trying to predict matters such as terrorists' activities.

I recommend the book if for no other reason than it will challenge you to consider your own biases and preconceived notions. I appreciated what I feel is his honest approach to the subject. I came away with a belief that I have to look hard at the information I have gathered, the sources of that information, and the analysis to which I have subjected it before I try to predict an outcome.
on November 27, 2012
This is not a textbook but rather a narrative with some simple math and statistics illustrations.

Nate gives a clear description of bad vs. good analysis. It is clear that political pundits who know how to use good analysis have an advantage.

This is essentially how Nate predicted the 2012 national election with such high accuracy. He has been tracking opinion polls and determining which are the most predictive. He uses a broad range of polls and weights them based on their history of accuracy in past elections.
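As a rough sketch of that kind of accuracy weighting (the poll numbers and error figures below are invented for illustration; this is not Silver's actual FiveThirtyEight model):

# Toy illustration of weighting polls by past accuracy.
# All figures are invented; this is not Silver's actual methodology.

polls = [
    # (reported candidate share, pollster's average historical error in points)
    (0.52, 2.0),
    (0.49, 3.5),
    (0.51, 1.5),
]

# Pollsters with smaller historical error get proportionally more weight.
weights = [1.0 / err for _, err in polls]
estimate = sum(share * w for (share, _), w in zip(polls, weights)) / sum(weights)
print(f"Accuracy-weighted estimate: {estimate:.3f}")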

The book is intended to help the lay person understand that there is wide variation in the performance of analysts. Part of the key differentiation is between analysts who rely on discredited frequentist statistical methods, which are still being taught at our universities, and those who embrace Bayesian statistical methods, which are being used by actuaries, gamblers and anyone who has an informed opinion about potential future events.

The frequentists are discredited because their methods do not predict events well, and also because much research that draws incorrect conclusions is based on these methods. The Bayesian approach is becoming the preferred approach for predictive model building because it uses all informed opinions about the future and tests them. Models built in this manner improve over time, as opinions that prove right take on more weight in predictions and opinions that prove incorrect are diminished and removed from the analysis.
5 people found this helpful.
on January 4, 2015
This is a long but generally interesting, not mathematically rigorous, look at the human side of making predictions. The book has four sections: the first three chapters give an in-depth look at the application of predictions (in finance, baseball and politics); then there are four chapters on hard prediction problems (weather, earthquakes, economics, and infectious diseases); followed by an introduction to Bayesian thinking (how to adjust your beliefs); and finally a three-chapter discussion of how Bayesian thinking can be applied to hard problems.

The entire book is brilliantly written and it really shines for the diversity of its examples. However, it is very long and it drags badly in the chapters on Bayesian thinking. In particular, the author gets bogged down in very extended sections on chess, which does include some brilliant discussion of man vs. machine matches, and on poker, which could have been reduced to a couple of pages of insights on the economic impact of bad players.

The one major problem with the book is that it treats Bayesian thinking as a panacea. Nobody would argue with the general idea of trying to quantify your beliefs and then updating them when new data comes in. (This is the gist of Bayesian statistics.) However, the author does not address just how hard it is to quantify or specify one's starting beliefs. Coming up with a guesstimate of your prior beliefs can be extraordinarily difficult, and as the author does point out (but only in one example), tweaking your prior belief (probability) can *radically* change your conclusions. The fact that this is not stressed is a major disservice to the reader.
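To make that concrete, here is a minimal sketch (numbers invented purely for illustration) of how much the posterior can move when only the prior is tweaked:

# Same evidence, three different priors: the conclusion swings widely.
# Numbers are invented purely to illustrate the sensitivity to the prior.

def posterior(x, y, z):
    # x = prior, y = P(evidence | hypothesis true), z = P(evidence | hypothesis false)
    return x * y / (x * y + (1 - x) * z)

y, z = 0.9, 0.1
for prior in (0.01, 0.10, 0.50):
    print(f"prior {prior:.2f} -> posterior {posterior(prior, y, z):.2f}")
# prior 0.01 -> posterior 0.08
# prior 0.10 -> posterior 0.50
# prior 0.50 -> posterior 0.90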

That major complaint aside, this book does a brilliant job of explaining complex ideas in layman's terms and of highlighting the importance of dismissing the idea of certainty and instead embracing, or at least paying attention to, the imprecision of our estimates and how wrong we, both laymen and experts, can be.
One person found this helpful.
on February 28, 2016
Like a lot of reviewers here, I came into this book expecting the author to provide a system of prediction: a way to differentiate signal from noise. I didn't get that, but having now read the book I feel that that's okay. One thing Silver does very well is explain, in exhaustive detail, just how difficult the art of prediction is and how many people get into trouble by trying to have a "system" of prediction. If anything, Silver has done us all a favor by showcasing just how important it is not to rely on a system when trying to get predictions right. Of his examples of forecasters with grand unifying theories of "how the world works", my favorite is the political commentators of the McLaughlin Group, who by and large are no better than chance with their predictions.

If there is a criticism I have it's that there is a fair amount of fluff. Anecdotes are nice, and occasionally they help to illustrate the point being made, but long stretches of this book are what I can only describe as "human interest" stories about some of the forecasters being profiled. For the most part these are not a problem, particularly when they provide context to how these people are so successful at predicting the future when so many others are not, but they do wear thin after a while.
on January 30, 2015
"In Pearl Harbor, what they prepared for were things that really didn't happen, "Donald Rumsfeld said.”They prepared for sabotage because they had so many Japanese descendants living in Hawaii. And so they stuck all the airplanes close together, so they could be protected. So of course the bombers came and they were enormously vulnerable, and they were destroyed."

In advance of Pearl Harbor, as Rumsfeld mentioned, we had a theory that sabotage-attack from within-was the most likely means by which our planes and ships would be attacked.

"Any signals were interpreted in this context, logically or not, and we prepared for subterfuge. We stacked our planes wingtip to wingtip, and our ships stern to bow, on the theory that it would be easier to monitor one big target than several smaller ones.

Meanwhile we theorized that, if Japan seemed to be mobilizing for an attack, it would be against Russia or perhaps against the Asian territorial possessions of the United Kingdom, Russia and the UK being countries that were already involved in the war. Why would the Japanese want to provoke the sleeping giant of the United States? We did not see that Japan believed our involvement in the war was inevitable, and that they wanted to strike us when we were least prepared and could cause the most damage to our Navy. The Imperial Japanese government of the time was not willing to abandon its hopes for territorial expansion. We had not seen the conflict through the enemy's eyes.

To Wohlstetter, a signal is a piece of evidence that tells us something useful about our enemy's intentions; this book thinks of a signal as an indication of the underlying truth behind a statistical or predictive problem. Wohlstetter's definition of noise is subtly different too.

Whereas I tend to use noise to mean random patterns that might easily be mistaken for signals, Wohlstetter uses it to mean the sound produced by competing signals. In the field of intelligence analysis, the absence of signals can signify something important (the absence of radio transmissions from Japan's carrier fleet signaled their move toward Hawaii) and the presence of too many signals can make it exceptionally challenging to discern meaning. They may drown one another out in an ear-splitting cacophony.

(pp. 415-416)
on October 11, 2013
If you are over 50 and male this could be the most important post you read this year.

I have just finished reading The Signal and the Noise: The Art and Science of Prediction by Nate Silver. The book discusses how a diverse set of forecasts, ranging from politics to baseball to the weather, are prepared, the errors that are often made, and how in many cases ‘expert predictions’ should be treated with many grains of salt.

However, a real strength of the book is the description of Bayesian reasoning in Chapter 8, which is a technique every manager should learn. A lot of management effort is spent altering forecasts as new information is received, and Bayes allows us to make better predictions. This book made me really learn about Bayes' Theorem.

Simply put, one takes a prior probability, compares it against a new piece of evidence, and obtains a posterior probability. Say x equals the prior probability, y equals the probability of the new evidence if the hypothesis is true, and z equals the probability of the evidence if the hypothesis is false. The posterior probability is xy over xy plus (1-x)z. The secret is calculating both the true and false positives.

Say for example there has been an accident in a city involving a taxi cab.
* 85% of the cabs in the city are white, and 15% are silver.
* A man identified the cab involved in a hit and run as silver.
* The court tested the witness' reliability, and the witness was able to identify the correct color 80% of the time and failed 20% of the time.

What is the probability the taxi cab was silver? Here's how we figure it out using Bayes theorem.

If the cab was silver, a 15% chance, and correctly identified, an 80% chance, the combined probability is .15 * .8 = .12, a 12% chance. These are true positives.
If the cab was white, an 85% chance, and incorrectly identified, a 20% chance, the combined probability is .85 * .2 = .17, a 17% chance. These are false positives.
Since the cab had to be either white or silver, the total probability of it being identified as silver, whether right or wrong, is .12 + .17 = .29. In other words, this witness could be expected to identify the cab as silver 29% of the time whether he was right or wrong.

The chances he was right are .12 out of .29, or 41%.
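For anyone who wants to check the arithmetic, here is a short sketch of the same calculation (the bayes_posterior helper is just for this illustration):

# Reproducing the taxi-cab calculation with Bayes' theorem.

def bayes_posterior(x, y, z):
    # x = prior, y = P(evidence | hypothesis true), z = P(evidence | hypothesis false)
    return x * y / (x * y + (1 - x) * z)

prior_silver = 0.15    # 15% of the cabs are silver
p_id_if_silver = 0.80  # witness identifies the color correctly 80% of the time
p_id_if_white = 0.20   # witness misidentifies a white cab as silver 20% of the time

p = bayes_posterior(prior_silver, p_id_if_silver, p_id_if_white)
print(f"P(cab was silver | identified as silver) = {p:.2f}")  # ~0.41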

Now recently I took a PSA test and my reading was above the supposed danger level. What is the probability I have prostate cancer?

The chances of having prostate cancer at various ages are as follows:
For a man in his 40s - 1 in 1000
For a man in his 50s - 12 in 1000
For a man in his 60s - 45 in 1000
For a man in his 70s - 80 in 1000

I am 69, so roughly splitting the difference between the 60s and 70s figures, my chances of prostate cancer would be about 63 in 1000. However, I have now had a positive PSA test result.

Now, according to a medical website, for every 100 men over age 50 with no symptoms who have the PSA test:

10 men out of 100 tested will have a higher than normal level of PSA. These men must then go through other tests and examinations. At the end of these tests:
• Three of the ten men with a higher than normal PSA level will be found to have prostate cancer
• Seven of the ten men with a higher than normal PSA level will be found not to have prostate cancer at the time of screening

90 men out of 100 tested will have a normal PSA level. Of these 90 men:
• 88 of the men with a normal PSA level will not have prostate cancer.
• One or two of the men with a normal PSA level will actually have prostate cancer, undetected by the test.

The probability that the PSA test gives a true positive for me is 0.063 x 88/90, or 0.0616 (xy in the formula; note that two people out of 90 are missed).

The probability that the PSA test gives a false positive for me is 0.937 x 0.07, or 0.0656 ((1-x)z in the formula).

The sum of the true and false positives is 0.1272, and so according to Bayes the probability that I have prostate cancer is 48%, which is much higher than I originally thought and means that I will go forward with a biopsy. I never would have come to this conclusion without reading Silver's book.
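The same formula applied to the figures above (a sketch of this arithmetic only, using the numbers quoted here rather than general medical data, and certainly not medical advice):

# The PSA calculation above, using the same Bayes formula.
# Inputs are the figures quoted in this review, not general medical data.

x = 0.063        # assumed prior: 63 in 1000
y = 88 / 90      # estimated P(high PSA | cancer), i.e. 2 of 90 cases missed
z = 0.07         # estimated P(high PSA | no cancer): 7 false positives per 100 tested

true_pos = x * y          # ~0.0616
false_pos = (1 - x) * z   # ~0.0656
print(f"P(cancer | high PSA) = {true_pos / (true_pos + false_pos):.2f}")  # ~0.48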

I did find the final chapter, on climate science, to be weak. I confess I am a sceptic, but Silver tries to justify Climategate and in my opinion fails badly. As reviewer Robert has noted, the Climategate scandal had nothing to do with global temperatures, the greenhouse effect, or basic climate science. Climategate is all about how some prominent scientists fudged data, erased embarrassing results, and sought to control the peer review process in leading scientific journals to suffocate dissenting opinions.

Indeed, those involved in Climategate sound very similar to the political pundits that Silver so effectively lacerates in the first chapter of his book.
One person found this helpful.
on September 17, 2014
A consistently successful bettor, to use the terminology of the book, has to possess subjective probability estimates (his Bayesian priors) that capture more signal than the market consensus. Thus one of the factors in beating any market hinges upon the efficiency of the market in question. As Silver details, the online poker market from 2003 to October 2006 was not efficient; while nowhere near the average incompetence of the poker market, pre-Moneyball baseball and political forecasting were also less than efficient. Mr. Silver notes he was fortunate to make his reputation in markets "where the water level was set pretty low."

An example of a market where the water level is high would be the stock market. There are large armies of highly trained analysts working for institutions with 50-million-dollar budgets and large computers poring over data. Yet there remains the problem of how to explain bubbles: Silver covers a number of possible explanations, none of which is fully satisfactory.

I will spend the rest of this review going over Silver's view on climate change and its troubled politics. There is solid scientific research demonstrating that throughout the earth's history temperatures have fluctuated significantly. Scientists possess a solid theory which explains these fluctuations: changes in the amounts of greenhouse gases (carbon dioxide, water vapor, methane) in the atmosphere. With a rise in greenhouse gases, the earth's surface becomes warmer.

Where controversy arises is in trying to put specific numbers to the temperature rise associated with a given increase in carbon dioxide. Such a prediction requires estimates of complex interactions among many variables. Silver cites a 2008 survey of climate scientists which found that 84% agreed human-caused climate change is occurring now, but only 19% felt confident about the accuracy of climate computer models to forecast, for example, sea level rise in 50 years.

Is such skepticism about climate models justified? Over 20 years have passed since climate predictions were made based upon early computer models attempting to integrate all the complex interactions, which is enough time to begin to look into how well those predictions have stood up. Silver focuses on the IPCC predictions of global temperature issued in 1990 which, when adjusted for the actual (lower) carbon dioxide emissions, did a decent job of predicting; revisions made in 1995 did even better. These predictions impressed Silver; moreover, he stresses, this is how science works: assumptions and beliefs lead to predictions which are then tested and refined. In science progress is possible, since "dubious forecasts are likely to be exposed and the truth prevail." In regard to politics Silver is far less sanguine.

Our political system has become so dysfunctional because, according to Silver, in politics "truth enjoys no privileged status"; it's anybody's guess whether in the long run truth or crazy will prevail. One possible way to see this (which is not covered in Silver's book) is via Condorcet's Jury Theorem. Given a three-person group where each member possesses a 67% probability of being right, the probability that a majority vote will produce the right answer is 74%. As the size of the group increases, so does its probability of being correct. This averaging effect has been called by a number of names, most recently the wisdom of crowds; it can be seen behind the ideas of the invisible hand and the efficient market. Under this logic, what would cause market failure? If the answers of the individuals in the group drop below a 50% probability of being right, then the power of crowds goes into reverse.
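A quick sketch of that jury arithmetic (the group sizes and accuracies are only illustrative), including the reversal once individual accuracy drops below 50%:

from math import comb

def majority_correct(p, n):
    # Probability that a majority of n independent voters, each correct with
    # probability p, reaches the right answer (n odd).
    need = n // 2 + 1
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(need, n + 1))

print(f"{majority_correct(0.67, 3):.3f}")    # 0.745 -- the ~74% for a three-person group
print(f"{majority_correct(0.67, 101):.3f}")  # a larger group is nearly certain
print(f"{majority_correct(0.45, 101):.3f}")  # below 50% accuracy, the crowd does worse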

Silver relates Philip Tetlock's findings, which demonstrate that the typical media pundit and the "hedgehog" variety of political scientist, both of whom feed the public's craving for certainty, are close to the dangerous 50% threshold. What could cause a political market to settle on answers worse than random? When many people suffer from the same bias, the average answer will not be reliable. Ideology is a perfect mechanism for causing a mass of people to become stupider than they are.

Science may demonstrate what the truth of doubling carbon emissions will be, but it cannot choose what must be done; that is for politics to resolve. The far left in America wants to use climate science as a bludgeon to break up market capitalism. The far right, fearful of the increase in government activity necessary to lower carbon emissions, finds it convenient to deny the science. Silver is afraid this lowest-common-denominator climate argument will "continue for decades."

Silver's advice to climate scientists is basically not to descend into the ideological pit that is our political market but to remain above the fray with clean hands. This is reminiscent of the early Emerson, who gave up on leading mankind because he found that society did not want to renounce its opinions for the truth. Although, it must be said, the passage of the Fugitive Slave Act roused Emerson to more active participation in the dysfunctional political market of the 1850s.
on December 19, 2012
I decided to read this book because of all the good publicity Nate Silver has gotten over the last 5 years with his handicapping of national Presidential and Senate election polls. Recently he has had some interesting interviews -- notably with Joe Scarborough, where he bet Joe about the accuracy of his predictions, and a really interesting, quite serious interview with Conan O'Brien about this book. I was a little disappointed with the book. Not because of the way it's written, but because he chose to concentrate on many other disciplines where predictions are made rather than getting into how he uses polls to determine his electoral predictions. Fortunately he spoke at length about how he does that in those television interviews.

This book is interesting, though, because he shows how the data used in making predictions has been handled by the wide variety of forecasters who make predictions for many different reasons. In his chapter about baseball scouting and the use of data, he shows how scouts will always have an advantage because of the personal observations they can add to the data. So where "Moneyball" (in my case the movie) was portrayed as a battle between statisticians and scouts, Silver notes that over the past decade both camps have become instrumental in doing better scouting. If you've ever sat in the front row of a major league baseball game and seen how really young the players are, the book tells you why, with his observation that the average baseball player reaches his peak at the ripe old age of 27! The weather and earthquake chapters were also interesting, showing how forecasters have tried to soften their predictions from what the data suggested so that people would be more accepting of them.

Many of the positive reviews for this book are from people who are serious practitioners of making predictions and assessing risk. I'm not even sure what an applied business researcher does. But over the last 20-30 years it has seemed increasingly beneficial to make predictions and decisions using data rather than purely qualitative criteria. "The Signal and the Noise" suggests that the best predictions might result from using data but with qualitative and distinctively human judgments as well.

As far as the enjoyment factor goes, I agree with another reviewer who commented that the book reminded him of "Freakonomics" but with less fun. I think Silver is probably a fun person, but that doesn't come across as much in this, his first book. As he continues to blow people away with the accuracy of his predictions, I'm sure other, more enjoyable books will come. But if you can work through all his chapters, you'll realize that he has a very effective methodology of looking at data in common-sense ways to make very accurate predictions.
2 people found this helpful.
