- Series: Chapman & Hall/CRC Texts in Statistical Science (Book 106)
- Hardcover: 675 pages
- Publisher: Chapman and Hall/CRC; 3rd edition (November 1, 2013)
- Language: English
- ISBN-10: 1439840954
- ISBN-13: 978-1439840955
- Product Dimensions: 7 x 1.4 x 10 inches
- Shipping Weight: 2.9 pounds
- Average Customer Review: 40 customer reviews
- Amazon Best Sellers Rank: #17,088 in Books
Bayesian Data Analysis (Chapman & Hall/CRC Texts in Statistical Science) 3rd Edition
"The second edition was reviewed in JASA by Maiti (2004) … we now stand 10 years later with an even more impressive textbook that truly stands for what Bayesian data analysis should be. … this being a third edition begets the question of what is new when compared with the second edition? Quite a lot … this is truly the reference book for a graduate course on Bayesian statistics and not only Bayesian data analysis."
―Christian P. Robert, Journal of the American Statistical Association, September 2014, Vol. 109
Praise for the Second Edition
… it is simply the best all-around modern book focused on data analysis currently available. … There is enough important additional material here that those with the first edition should seriously consider updating to the new version. … when students or colleagues ask me which book they need to start with in order to take them as far as possible down the road toward analyzing their own data, Gelman et al. has been my answer since 1995. The second edition makes this an even more robust choice.
―Lawrence Joseph, Montreal General Hospital and McGill University, Statistics in Medicine, Vol. 23, 2004
I am thoroughly excited to have this book in hand to supplement course material and to offer research collaborators and clients at our consulting lab more sophisticated methods to solve their research problems.
―John Grego, University of South Carolina, USA
… easily the most comprehensive, scholarly, and thoughtful book on the subject, and I think will do much to promote the use of Bayesian methods
―David Blackwell, University of California, Berkeley, USA
Top customer reviews
The stance of this book is very practical, and it is great to get a glimpse of how these grand-master statisticians approach data analysis. The first few chapters on the underlying philosophy of Bayesian statistics are also brilliantly written, a must-read for any statistician.
I was somewhat disappointed with the changes in the third edition, though. The addition of Gaussian processes and other advanced topics is heavily advertised, but I found these new chapters relatively poorly written compared with those in the previous edition; the notation is not consistent with the earlier chapters, and the clarity of the writing has suffered. In retrospect, buying the new edition while already owning the second was a mistake.
First I want to comment on the Bayesian vs. frequentist debate, and why one might want to use Bayesian methods. Anyone who objects to the Bayesian paradigm on the basis of subjectivity has to realize that all statistical models are subjective. The decision to use a linear model, logistic regression, or a normal distribution for your data, to list a few examples, is a subjective decision. It's no more subjective than putting a prior on your parameters. A prior doesn't have to be very informative; it can simply encode a reasonable range of values for the parameters, such as a person's height being between 0 and 10 feet, or the number of siblings being less than 100, rather than letting the data completely determine the parameters. When properly incorporated, prior knowledge helps produce more precise parameter estimates.
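To make the point concrete, here is a minimal sketch (mine, not from the book) of how a weakly informative prior on a mean, say average height, combines with a few observations under a conjugate normal model; all the numbers are invented for the example.

```python
import numpy as np

def posterior_normal_mean(y, sigma, mu0, tau0):
    """Posterior of an unknown mean with known noise sd `sigma`,
    given a N(mu0, tau0^2) prior and observations y."""
    n = len(y)
    prec = 1.0 / tau0**2 + n / sigma**2            # posterior precision
    mu_n = (mu0 / tau0**2 + n * np.mean(y) / sigma**2) / prec
    return mu_n, np.sqrt(1.0 / prec)

# Weakly informative prior: heights are around 5.5 ft, give or take 1 ft.
y = np.array([6.1, 6.3, 6.0])                      # a few tall subjects
mu_n, sd_n = posterior_normal_mean(y, sigma=0.3, mu0=5.5, tau0=1.0)
# The posterior mean lies between the prior mean and the sample mean,
# and the posterior sd is smaller than the prior sd.
```

The prior barely constrains the answer, yet it rules out absurd values and sharpens the estimate as data accumulate.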
However, Bayesian analysis is more than just incorporating prior knowledge into your models. It provides full probability distributions on the parameters, instead of asymptotic interval estimates. It provides an automatic way of doing regularization, without the need for cross-validation. This allows one to estimate more parameters than classical frequentist models can handle, and even to deal with cases where p >= n. Another advantage is relaxing the independence and identical-distribution assumption: hierarchical Bayesian models automatically build dependence between observations, similar to latent variables in classical statistics.
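A quick sketch of the regularization point (my toy example, not the book's): under a Gaussian prior N(0, I/lam) on the coefficients of a linear model with unit noise variance, the posterior mode is exactly the ridge estimate, which remains well defined even when p > n and ordinary least squares breaks down.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 20, 50                                   # more parameters than observations
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.0, 0.5]                # a few real effects, the rest zero
y = X @ beta_true + 0.1 * rng.standard_normal(n)

lam = 1.0                                       # prior precision = regularization strength
# Posterior mode (MAP) under beta ~ N(0, I/lam): the ridge solution.
# X'X is singular here (p > n), but adding lam*I makes the solve well posed.
beta_map = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
```

No cross-validation appears anywhere; the prior itself supplies the penalty, and a fully Bayesian treatment would go further and put a distribution on `lam` too.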
So in my opinion classical statistics already incorporates Bayesian ideas through the subjective selection of parametric models, the practice of regularization such as ridge regression and the lasso, and dependence through latent-variable models, although it does so in a somewhat ad hoc manner. Bayesian statistics formalizes these notions within probability theory and, together with simulation, allows them to be extended easily in various non-trivial directions.
Now about this book. It covers all these advantages of Bayesian methods and more, although it sometimes requires considerable effort from the reader to uncover and pull out the relevant concepts. It's definitely not meant to be an introduction to statistics. The reader is assumed to be well versed in classical statistics, with a good grasp of topics such as hypothesis testing and interval estimation, sufficient statistics and the exponential family, MLE and its asymptotic properties, the EM algorithm, and generalized linear models, to name a few. I also think that Bayesian methods require deeper intuition in probability theory and involve more computation and approximation techniques to build even simple models. Given the background needed, the reader will likely have had considerable prior exposure to Bayesian techniques, and I think this is the target audience the authors had in mind when writing this book.
The book is definitely tough on a first reading, especially if this is your first book devoted entirely to the subject. But reading it is well worth the effort. It covers many details and subtleties of the Bayesian approach that are not well emphasized in books on general statistics and machine learning.
The book is applied in nature, written the way every applied book should be. There is enough discussion of theory to understand, apply, and extend the described methods. Each chapter ends with a short section discussing the relevant references if you need to follow the theory in more detail. The authors make great use of non-trivial examples that show the implementation details and possible complications of the discussed models. In addition, there is an appendix covering computation with the R and Stan software.
The first five chapters present a solid, if somewhat terse, introduction to general Bayesian methods, including asymptotics and the connection to MLE, culminating in hierarchical Bayesian models in chapter 5. Two chapters follow on the important topic of model testing and selection. Chapter 8 covers data collection, and while it's a fascinating read and a novel idea if you've never seen it before, I think it can be skipped on a first reading without much affecting the understanding of later chapters.
Chapters 10-13 deal with simulation and analytic approximations, two central tools of Bayesian analysis, because for most practical models direct analytic expressions are intractable. The authors provide a good overview of rejection sampling and the Gibbs and Metropolis-Hastings algorithms. The explanations are sufficient for basic implementations. Chapter 13 introduces approximations around posterior modes. There is a very intuitive explanation of the EM algorithm along with its mathematical derivation. This is followed by variational inference and expectation propagation, approximations based on the Kullback-Leibler divergence.
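To give a flavor of how little code a basic sampler needs (a generic textbook sketch, not the book's implementation), here is a random-walk Metropolis sampler targeting a standard normal:

```python
import numpy as np

def metropolis(log_target, x0, n_iter, step, rng):
    """Random-walk Metropolis: propose x' = x + step * z with z ~ N(0, 1),
    accept with probability min(1, target(x') / target(x))."""
    x, lp = x0, log_target(x0)
    out = np.empty(n_iter)
    for i in range(n_iter):
        prop = x + step * rng.standard_normal()
        lp_prop = log_target(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject step
            x, lp = prop, lp_prop
        out[i] = x                                  # rejected proposals repeat x
    return out

rng = np.random.default_rng(1)
# Target: standard normal, via its log density up to a constant.
draws = metropolis(lambda t: -0.5 * t**2, x0=0.0,
                   n_iter=50_000, step=1.0, rng=rng)[1_000:]  # drop burn-in
```

The same dozen lines work for any target whose unnormalized log density you can evaluate, which is exactly why these algorithms are so central in the book.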
Up to this point the book is a solid overview of Bayesian inference, model checking, simulation, and approximation techniques. The remaining chapters are more mixed in their level of presentation and content.
The second half of the book deals with regression. The chapters here become terser and the language less precise. The level of presentation deteriorates towards the end, where in my opinion the chapters on non-parametric models are almost impossible to understand without some prior exposure. More sections require multiple re-readings, and in places I feel that reading the references before the book is a good idea (as with Dirichlet processes). That said, I thought the chapters on robust inference and finite mixture models were exceptionally good.
I was disappointed that only two pages were devoted to regularization and variable selection in linear regression. In my opinion Bayesian techniques provide powerful alternatives to classical regularization methods: instead of choosing the regularization hyperparameters through cross-validation, we marginalize over them, effectively averaging over all possible regularizations. Although the authors do spend more time on regularization in the context of basis-function selection in chapter 20, it's a pity they didn't devote more space to it in the linear regression setting.
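The marginalization idea can be illustrated in a few lines (my toy construction, not the book's): for a conjugate normal linear model with unit noise variance, the marginal likelihood of each penalty value `lam` is available in closed form, so instead of picking one `lam` by cross-validation we can average the ridge solutions over a grid of `lam` values weighted by their marginal likelihoods.

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 40, 10
X = rng.standard_normal((n, p))
y = X @ rng.standard_normal(p) + rng.standard_normal(n)

def log_marginal(lam):
    # With beta ~ N(0, I/lam) and unit noise variance, marginally
    # y ~ N(0, C) with C = I + X X'/lam.
    C = np.eye(n) + X @ X.T / lam
    _, logdet = np.linalg.slogdet(C)
    return -0.5 * (n * np.log(2 * np.pi) + logdet + y @ np.linalg.solve(C, y))

lams = np.logspace(-2, 2, 25)                    # grid standing in for a hyperprior
logw = np.array([log_marginal(l) for l in lams])
w = np.exp(logw - logw.max())
w /= w.sum()                                     # normalized posterior weights on lam

# Average the ridge solutions over lam instead of picking one by CV.
betas = np.stack([np.linalg.solve(X.T @ X + l * np.eye(p), X.T @ y) for l in lams])
beta_avg = w @ betas
```

Every candidate regularization contributes in proportion to how well it explains the data, which is the averaging the reviewer describes.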
Some other small negatives about the book, in my opinion:
- constantly referring to later chapters in the book
- various small typos/mistakes that detract from reading
- the presentation of expectation propagation in chapter 13 is confusing, with no mention that it is related to minimizing the Kullback-Leibler divergence
- no mention of relevance vector machines for basis function selection in chapter 20
- no mention of Bayesian dimensionality reduction and factor models
However, I think the excellent presentation in the first half of the book alone makes it well worth studying. Its usefulness as a reference far outweighs its shortcomings as an introduction, and I'm sure I'll be picking it up countless times when reading other Bayesian material. I highly recommend this book to anyone with a classical statistics background looking to understand Bayesian methods in depth.
The book has a lot of good content and assumes previous knowledge of basic probability and statistics.
Definitely recommended as a starter, refresher, self-study guide, textbook, or even a reference for anyone interested in Bayesian modelling.