
Expert Political Judgment: How Good Is It? How Can We Know? Hardcover – July 5, 2005

ISBN-13: 978-0691123028 ISBN-10: 0691123020 Edition: 4th Printing

Hardcover: 10 new from $60.90, 15 used from $40.99. Also available for Kindle.





Product Details

  • Hardcover: 352 pages
  • Publisher: Princeton University Press; 4th Printing edition (July 5, 2005)
  • Language: English
  • ISBN-10: 0691123020
  • ISBN-13: 978-0691123028
  • Product Dimensions: 9.2 x 6.3 x 1.2 inches
  • Shipping Weight: 1.4 pounds
  • Average Customer Review: 4.5 out of 5 stars (31 customer reviews)
  • Amazon Best Sellers Rank: #1,009,219 in Books

Editorial Reviews

Review

Winner of the 2006 Woodrow Wilson Foundation Award, American Political Science Association
Winner of the 2006 Grawemeyer Award for Ideas Improving World Order
Winner of the 2006 Robert E. Lane Award, Political Psychology Section of the American Political Science Association

"It is the somewhat gratifying lesson of Philip Tetlocks new book . . . that people who make prediction their business--people who appear as experts on television, get quoted in newspaper articles, advise governments and businesses, and participate in punditry roundtables--are no better than the rest of us. When theyre wrong, theyre rarely held accountable, and they rarely admit it, either. . . . It would be nice if there were fewer partisans on television disguised as "analysts" and "experts". . . . But the best lesson of Tetlocks book may be the one that he seems most reluctant to draw: Think for yourself."--Louis Menand, The New Yorker

"The definitive work on this question. . . . Tetlock systematically collected a vast number of individual forecasts about political and economic events, made by recognised experts over a period of more than 20 years. He showed that these forecasts were not very much better than making predictions by chance, and also that experts performed only slightly better than the average person who was casually informed about the subject in hand."--Gavyn Davies, Financial Times

"Before anyone turns an ear to the panels of pundits, they might do well to obtain a copy of Phillip Tetlock's new book Expert Political Judgment: How Good Is It? How Can We Know? The Berkeley psychiatrist has apparently made a 20-year study of predictions by the sorts who appear as experts on TV and get quoted in newspapers and found that they are no better than the rest of us at prognostication."--Jim Coyle, Toronto Star

"Tetlock uses science and policy to brilliantly explore what constitutes good judgment in predicting future events and to examine why experts are often wrong in their forecasts."--Choice

"[This] book . . . Marshals powerful evidence to make [its] case. Expert Political Judgment . . . Summarizes the results of a truly amazing research project. . . . The question that screams out from the data is why the world keeps believing that "experts" exist at all."--Geoffrey Colvin, Fortune

"Philip Tetlock has just produced a study which suggests we should view expertise in political forecasting--by academics or intelligence analysts, independent pundits, journalists or institutional specialists--with the same skepticism that the well-informed now apply to stockmarket forecasting. . . . It is the scientific spirit with which he tackled his project that is the most notable thing about his book, but the findings of his inquiry are important and, for both reasons, everyone seriously concerned with forecasting, political risk, strategic analysis and public policy debate would do well to read the book."--Paul Monk, Australian Financial Review

"Phillip E. Tetlock does a remarkable job . . . applying the high-end statistical and methodological tools of social science to the alchemistic world of the political prognosticator. The result is a fascinating blend of science and storytelling, in the the best sense of both words."--William D. Crano, PsysCRITIQUES

"Mr. Tetlock's analysis is about political judgment but equally relevant to economic and commercial assessments."--John Kay, Financial Times

"Why do most political experts prove to be wrong most of time? For an answer, you might want to browse through a very fascinating study by Philip Tetlock . . . who in Expert Political Judgment contends that there is no direct correlation between the intelligence and knowledge of the political expert and the quality of his or her forecasts. If you want to know whether this or that pundit is making a correct prediction, dont ask yourself what he or she is thinking--but how he or she is thinking."--Leon Hadar, Business Times

From the Inside Flap

"This book is a landmark in both content and style of argument. It is a major advance in our understanding of expert judgment in the vitally important and almost impossible task of political and strategic forecasting. Tetlock also offers a unique example of even-handed social science. This may be the first book I have seen in which the arguments and objections of opponents are presented with as much care as the author's own position."--Daniel Kahneman, Princeton University, recipient of the 2002 Nobel Prize in economic sciences

"This book is a major contribution to our thinking about political judgment. Philip Tetlock formulates coding rules by which to categorize the observations of individuals, and arrives at several interesting hypotheses. He lays out the many strategies that experts use to avoid learning from surprising real-world events."--Deborah W. Larson, University of California, Los Angeles

"This is a marvelous book--fascinating and important. It provides a stimulating and often profound discussion, not only of what sort of people tend to be better predictors than others, but of what we mean by good judgment and the nature of objectivity. It examines the tensions between holding to beliefs that have served us well and responding rapidly to new information. Unusual in its breadth and reach, the subtlety and sophistication of its analysis, and the fair-mindedness of the alternative perspectives it provides, it is a must-read for all those interested in how political judgments are formed."--Robert Jervis, Columbia University

"This book is just what one would expect from America's most influential political psychologist: Intelligent, important, and closely argued. Both science and policy are brilliantly illuminated by Tetlock's fascinating arguments."--Daniel Gilbert, Harvard University



Customer Reviews

4.5 out of 5 stars (31 customer reviews)
  • 5 star: 22
  • 4 star: 6
  • 3 star: 0
  • 2 star: 1
  • 1 star: 2
"Brilliant research written up in a clear fashion."--David McKay
"Tetlock helps the non-experts to know more about what the experts know, how they know it, and how much good it does them in making predictions."--Dr. Frank Stech
"Because it makes it impossible to avoid these conclusions, I gave this book five stars; this is very important stuff."--Anne Mills

Most Helpful Customer Reviews

45 of 48 people found the following review helpful By Dr. Frank Stech on January 5, 2007
Format: Paperback Verified Purchase
Tetlock shows conclusively two key points: First, the best experts in making political estimates and forecasts are no more accurate than fairly simple mathematical models of their estimative processes. This is yet another confirmation of what Robyn Dawes termed "the robust beauty of simple linear models." The inability of human experts to outperform models based on their own expertise has been demonstrated in over one hundred fields of expertise across fifty years of research; it is one of the most robust findings in social science. Political experts are no exception.
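
To see the shape of the Dawes-style contest described here, a minimal sketch in Python may help. Everything in it is simulated for illustration--the cues, the noise levels, and the "expert" are invented, and none of it comes from Tetlock's study--but it shows why consistently applying a few valid cues tends to beat an expert who uses those same cues inconsistently.

```python
# Minimal sketch (simulated, not Tetlock's data): an equal-weight linear model
# built from an expert's own cues versus the expert's noisier holistic judgments.
import numpy as np

rng = np.random.default_rng(0)

# 200 hypothetical forecasting questions, each summarized by three cues the
# expert claims to rely on (e.g., economic trend, regime stability, history).
cues = rng.normal(size=(200, 3))
signal = cues @ np.array([0.5, 0.3, 0.2])          # how the cues actually matter
outcomes = (signal + rng.normal(size=200) > 0).astype(float)

# The simulated expert sees the same cues but adds idiosyncratic noise.
expert_prob = 1 / (1 + np.exp(-(signal + rng.normal(scale=1.5, size=200))))

# Dawes-style "improper" model: standardize each cue and weight them equally.
z = (cues - cues.mean(axis=0)) / cues.std(axis=0)
model_prob = 1 / (1 + np.exp(-z.mean(axis=1)))

def brier(p, y):
    """Mean squared gap between forecast probabilities and 0/1 outcomes (lower is better)."""
    return float(np.mean((p - y) ** 2))

print("Simulated expert Brier score:   ", round(brier(expert_prob, outcomes), 3))
print("Equal-weight model Brier score: ", round(brier(model_prob, outcomes), 3))
```

With judgment noise that large relative to the signal, the equal-weight model should post the lower (better) Brier score on virtually any seed--Dawes's "robust beauty" result in miniature: consistent use of valid cues beats inconsistent expert weighting of those same cues.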

Second, Tetlock demonstrates that experts who know something about a number of related topics (foxes) predict better than experts who know a great deal about one thing (hedgehogs). Generalist knowledge adds to accuracy.

Tetlock's survey of this research is clear, crisp, and compelling. His work has direct application to world affairs. For example, he is presenting his findings to a conference of Intelligence Community leaders next week (January 2007) at the invitation of the Director of National Intelligence.

"Expert Political Judgment" is recommended to anyone who depends on political experts, which is pretty much all of us. Tetlock helps the non-experts to know more about what the experts know, how they know it, and how much good it does them in making predictions.
29 of 32 people found the following review helpful By Peter McCluskey on September 22, 2006
Format: Paperback
This book is a rather dry description of good research into the forecasting abilities of people who are regarded as political experts. It is unusually fair and unbiased.

His most important finding about what distinguishes the worst from the not-so-bad is that those on the hedgehog end of Isaiah Berlin's spectrum (who derive predictions from a single grand vision) are wrong more often than those near the fox end (who use many different ideas). He convinced me that the finding is approximately right, but it leaves me with questions.

Does the correlation persist at the fox end of the spectrum, or do the most fox-like subjects show some diminished accuracy?

How do we reconcile his evidence that humans who think in more complex ways do better than simplistic humans with his evidence that simple autoregressive models beat all the humans? That seems to suggest there's something imperfect in using the hedgehog-fox spectrum. Maybe a better spectrum would measure how much data influences their worldviews?
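
A minimal simulated sketch may help frame this puzzle. Nothing below comes from the book: the series, the 0.8 persistence coefficient, and the "trend-chasing expert" are all invented. The point is only that against a modestly persistent series, a forecaster whose adjustments add variance without adding information will lose to a dull autoregressive rule.

```python
# Minimal sketch (simulated, not Tetlock's data): a crude AR(1) baseline versus
# a hypothetical trend-chasing forecaster on a persistent noisy series.
import numpy as np

rng = np.random.default_rng(1)

# Simulate an AR(1) series: x[t] = 0.8 * x[t-1] + noise.
n = 300
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.8 * x[t - 1] + rng.normal()

past, future = x[:-1], x[1:]

# Baseline: damp the last observation (the autoregressive rule).
# In practice the 0.8 would be estimated from the historical data.
ar_forecast = 0.8 * past

# Hypothetical "expert": extrapolates the most recent change, amplifying noise.
last_change = np.diff(x, prepend=x[0])[:-1]
expert_forecast = past + 1.5 * last_change

def rmse(pred, actual):
    """Root mean squared forecast error (lower is better)."""
    return float(np.sqrt(np.mean((pred - actual) ** 2)))

print("AR(1) baseline RMSE:      ", round(rmse(ar_forecast, future), 2))
print("Trend-chasing expert RMSE:", round(rmse(expert_forecast, future), 2))
```

The baseline wins not because it is clever but because it refuses to amplify noise--which is one way to read the tension the reviewer raises between complex human thinking and simple statistical models.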

Another interesting finding is that optimists tend to be more accurate than pessimists. I'd like to know how broad a set of domains this applies to. It certainly doesn't apply to predicting software shipment dates. Does it apply mainly to domains where experts depend on media attention?

To what extent can different ways of selecting experts change the results? Tetlock probably chose subjects that resemble those whom most people regard as experts, but there must be ways of selecting experts which produce better forecasts. It seems unlikely they can match prediction markets, but there are situations where we probably can't avoid relying on experts.
15 of 15 people found the following review helpful By T. Coyne on March 10, 2006
Format: Hardcover
As both a consultant and an investment manager I have spent a lot of years studying decision theory. One limitation in a lot of the work I encountered was its heavy reliance on lab studies using students. You were never quite sure if the conclusions applied in the "real world." This outstanding book puts that concern to rest. It is by far the richest body of evidence I have encountered on decision making in real world situations. Anybody interested in decision making and decision theory will profit from reading it.
Format: Hardcover
This is an important book, for it gives us insight into how to evaluate the thousands of experts who are continually bombarding us with their predictions. Tetlock chooses the difficult and murky area of political judgment and centers his analysis on it, though his basic conclusions relate to forecasting in other areas, such as business and finance. Roughly, he takes as the basis of his analysis Isaiah Berlin's distinction between the hedgehog, who knows one big thing, and the fox, who knows many little things. As he sees it, the hedgehogs, who base themselves on big theories, are invariably wrong, while the foxes, who tend to be more open to the actual play of reality, have a far better record.

As he understands it, the hedgehogs go overboard in making boom-or-bust predictions. He provides empirical data from his studies to show that they are most often wrong, and even more wrong, by the way, when they are predicting disaster. Those who qualify their predictions with the use of 'perhaps' and 'however' and 'nonetheless' and 'possibly' have a far better chance of getting it right.

The irony of this, however, is that it is precisely the hedgehogs who are rewarded and who receive the greatest media attention. They are never punished for being wrong, for few seem to follow up and check the accuracy of the prediction. The more accurate, qualified assessments, on the other hand, are given scantier space and attention. After all, when we are uncertain about the future, who wants to hear a prediction that itself admits to being uncertain?

This is a very instructive book, although I wish at times its systems of classification were a bit less awkward.

On the whole, though, this is a highly recommended and important work, which can be of real help to most of us in understanding how to separate the 'wheat' from the 'bull'.
