- Series: Springer Series in Statistics
- Paperback: 632 pages
- Publisher: Springer; Corrected edition (June 11, 1993)
- Language: English
- ISBN-10: 0387940375
- ISBN-13: 978-0387940373
- Product Dimensions: 6.1 x 1.5 x 9.2 inches
- Shipping Weight: 2.6 pounds
- Average Customer Review: 2 customer reviews
- Amazon Best Sellers Rank: #1,069,250 in Books
001: Breakthroughs in Statistics: Foundations and Basic Theory (Springer Series in Statistics) Corrected Edition
Top customer reviews
Statistics is an amazingly young field. Probability theory goes back to the 17th and 18th centuries, when famous mathematicians like Pascal, the Bernoullis, De Moivre, Bayes, Laplace, Gauss and others developed it, many of them motivated by games of chance and, in Gauss's case, by astronomy. But significant advances in the development of statistics can only be traced back to the 19th century; its true birth came in the early 20th century, and it continues to grow now in the 21st.
The editors (Johnson and Kotz) lead us through these developments in a series of three volumes, presenting the works that they and others considered, in retrospect, to be the "breakthroughs" in the development of statistics. Each article is introduced by a current statistical expert who is very familiar with it and with its connection to a particular research area or branch of statistics. Very good justification is given for each selection. My only contention with the series is that some papers I think belong are missing. But such is always the case when people get together to rank the best works in almost any endeavor.
This review is about volume 1 which concentrates on the foundations of statistics and the development of the basic theory. Most of this took place between 1890 and 1950 with the likes of Karl Pearson, Francis Galton, Egon Pearson, Jerzy Neyman and most notably Sir Ronald Fisher. The foundations of statistics are still not unified as the Bayesian, Fisherian and frequentist schools of inference all developed in the 20th century and led to controversies among the founders. This book covers all that in a scholarly fashion with the expert introductions and the selected articles.
The first article in the book is Fisher's 1922 paper, which gives the first account of his foundational theory of statistical inference. It is the place where the concept of maximum likelihood is introduced. Seymour Geisser gives the introduction.
Other breakthrough articles include Hotelling's 1931 article, which introduces the multivariate generalization of Student's t distribution, the statistic we now call Hotelling's T-square; Neyman and Pearson's 1933 paper, where their theory of hypothesis testing first arose; De Finetti's 1937 paper, which helped establish the Bayesian school of inference; Gnedenko's 1943 paper, which established the three limiting distributions for the maximum of an independent, identically distributed sequence, dotted the i's and crossed the t's of the earlier work of Fisher, Tippett and others, and established extreme value theory as a discipline in statistics; and Wald's 1945 paper, which made public the secret developments in sequential analysis that he and Barnard made during World War II. There are many others. Another paper by Wald established statistical decision theory; Jack Kiefer's paper formalized optimality theory for statistical experimental designs; and the paper by James and Stein astounded the world of statistics by showing the inadmissibility, under quadratic loss, of the maximum likelihood estimator of a multivariate normal mean vector in three or more dimensions. The 1961 James and Stein paper proved this by constructing an estimator (now called the James-Stein or shrinkage estimator) that shrinks the estimate of the vector toward zero. This estimator dominates the maximum likelihood estimator over the entire parameter space for the mean vector! Later it was shown to be a Bayes estimator under a normal prior, and it relates to the empirical Bayes estimators discussed in another seminal paper, by Robbins (1955), that is also included in this volume.
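To make the shrinkage idea concrete, here is a minimal sketch (my own illustration, not from the book) of the James-Stein estimator for a single observation x ~ N(theta, I_p) with p >= 3, together with a small Monte Carlo comparison of its average squared error against the maximum likelihood estimator (which is just x itself). The function name `james_stein` and the simulation settings are my choices for illustration.

```python
import numpy as np

def james_stein(x):
    """James-Stein estimator of a multivariate normal mean.

    For one observation x ~ N(theta, I_p) with p >= 3, shrink the
    estimate toward the origin by the factor (1 - (p - 2) / ||x||^2).
    """
    p = x.shape[0]
    if p < 3:
        raise ValueError("James-Stein shrinkage requires dimension p >= 3")
    shrink = 1.0 - (p - 2) / np.dot(x, x)
    return shrink * x

# Monte Carlo check of the risk (expected squared error) of JS vs. the MLE.
# Shrinkage helps most when the true mean is near the origin.
rng = np.random.default_rng(0)
theta = np.zeros(5)
xs = rng.normal(theta, 1.0, size=(20000, 5))
mle_risk = np.mean(np.sum((xs - theta) ** 2, axis=1))
js_est = np.apply_along_axis(james_stein, 1, xs)
js_risk = np.mean(np.sum((js_est - theta) ** 2, axis=1))
print(mle_risk, js_risk)  # the JS risk is noticeably smaller here
```

Note that the shrinkage factor can go negative when ||x||^2 is small; the "positive-part" variant clips it at zero and does even better, but the plain version above already suffices to dominate the MLE.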
I have omitted discussion of a number of other articles, as a review should give the reader a flavor for the book and not tell the whole story. Anyone serious about statistical research or its historical developments and foundational issues should read volume 1.
The original papers are included. Do you remember what Abel said? "Read the masters, not the pupils."
For each important paper there is an introduction written by real experts in the domain.
This is a really awesome book that gives you an overview of the history of statistics.
You can find surprisingly good prices on campusi.com if you need all three volumes.