Bayesian Computation with R (Use R!) 2nd ed. 2009 Edition
From the Back Cover
There has been dramatic growth in the development and application of Bayesian inferential methods. Some of this growth is due to the availability of powerful simulation-based algorithms for summarizing posterior distributions. There has also been growing interest in the use of the system R for statistical analyses. R's open-source nature, free availability, and large number of contributed packages have made R the software of choice for many statisticians in education and industry.
Bayesian Computation with R introduces Bayesian modeling through computation with the R language. The early chapters present the basic tenets of Bayesian thinking using familiar one- and two-parameter inferential problems. Bayesian computational methods such as Laplace's method, rejection sampling, and the SIR algorithm are illustrated in the context of a random effects model. The construction and implementation of Markov chain Monte Carlo (MCMC) methods is introduced. These simulation-based algorithms are implemented for a variety of Bayesian applications such as normal and binary response regression, hierarchical modeling, order-restricted inference, and robust modeling. Algorithms written in R are used to develop Bayesian tests and assess Bayesian models via the posterior predictive distribution. The use of R to interface with WinBUGS, a popular MCMC computing language, is described with several illustrative examples.
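To give a flavor of the simulation-based approach the blurb describes, here is a minimal rejection-sampling sketch in base R. This is an illustrative example only, not the book's LearnBayes implementation; the Beta(3, 7) target and the envelope constant M = 3 are assumptions chosen for this sketch (the Beta(3, 7) density peaks at about 2.8, so M = 3 is a valid bound over a uniform proposal).

```r
# Rejection sampling sketch (base R; not the book's code).
# Draw from a target density f by proposing from g and accepting
# with probability f(x) / (M * g(x)), where f(x) <= M * g(x) for all x.
set.seed(1)
target <- function(x) dbeta(x, 3, 7)   # assumed target: Beta(3, 7)
M <- 3                                  # envelope constant; g = Uniform(0, 1) density = 1
n <- 10000
x <- runif(n)                           # n proposals from Uniform(0, 1)
accept <- runif(n) < target(x) / M      # acceptance test
draws <- x[accept]                      # accepted draws follow the target
mean(draws)                             # close to the Beta(3, 7) mean, 3/(3 + 7) = 0.3
```

The acceptance rate is roughly 1/M, so a tighter envelope constant wastes fewer proposals.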
This book is a suitable companion book for an introductory course on Bayesian methods and is valuable to the statistical practitioner who wishes to learn more about the R language and Bayesian methodology. The LearnBayes package, written by the author and available from the CRAN website, contains all of the R functions described in the book.
The second edition contains several new topics such as the use of mixtures of conjugate priors and the use of Zellner’s g priors to choose between models in linear regression. There are more illustrations of the construction of informative prior distributions, such as the use of conditional means priors and multivariate normal priors in binary regressions. The new edition contains changes in the R code illustrations according to the latest edition of the LearnBayes package.
Jim Albert is Professor of Statistics at Bowling Green State University. He is Fellow of the American Statistical Association and is past editor of The American Statistician. His books include Ordinal Data Modeling (with Val Johnson), Workshop Statistics: Discovery with Data, A Bayesian Approach (with Allan Rossman), and Bayesian Computation using Minitab.
- ASIN : 0387922970
- Publisher : Springer; 2nd ed. 2009 edition (May 15, 2009)
- Language : English
- Paperback : 312 pages
- ISBN-10 : 0387922970
- ISBN-13 : 978-0387922973
- Item Weight : 2.14 pounds
- Dimensions : 6.1 x 0.71 x 9.25 inches
Top reviews from the United States
You can rather easily follow the code and modify it, which helps you understand what is going on.
The drawback, however, is that the author is not a good programmer, which makes the code more difficult to understand than necessary.
In some of his examples, he uses the same or a very similar variable name up to three times with different meanings, simply by overwriting the values.
Similarly, the formulas in the book are sometimes written in a rather sloppy way, omitting conditioning variables through a "simplified notation," while the more explicit code would be more instructive to learners. I would also have wished for some more intermediate steps when deriving formulas, but the author's focus was clearly on the R implementation.
Two stars seem a fair summary: yes, the book is somewhat useful when read (and applied) in conjunction with other introductory books on Bayesian methods, but it is unfortunately not as instructive as it could be.
The author uses his own built-in data sets and his own program package, "LearnBayes," in this book. No clues are given on how he got this data into R.
The main problem with R is getting the data into R.
The same thing can be said about Casella's books: utterly useless.
All these books are overpriced. Being professors, they can force the students to buy these books.
Albert could have posted his version of the book on his web site as a pdf document and let the readers download it for $1 or $2.
A somewhat useful book is the one by Michael Crawley ("The R Book").
What R needs is a simple command to get the data into R.
R may be free, but it is not of much practical use for real problems unless you are already a programmer or a PhD mathematician.
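On the data-import complaint above: base R does provide one-line import commands, and the book's data sets ship inside the LearnBayes package on CRAN. A minimal sketch (the CSV file name here is hypothetical):

```r
# Reading an external file into a data frame (file name is hypothetical):
dat <- read.csv("mydata.csv")       # base R, no extra packages needed
str(dat)                            # inspect the imported structure

# The book's data sets come bundled with its CRAN package:
# install.packages("LearnBayes")    # one-time install from CRAN
library(LearnBayes)                 # load the package
data(package = "LearnBayes")        # list the data sets it provides
```

`read.csv` is part of base R's utils package, so importing a plain text table requires nothing beyond a standard installation.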