- Paperback: 416 pages
- Publisher: Cambridge University Press; 1 edition (January 28, 2008)
- Language: English
- ISBN-10: 0521717701
- ISBN-13: 978-0521717700
- Product Dimensions: 7.4 x 0.8 x 9.7 inches
- Shipping Weight: 2 pounds
- Average Customer Review: 10 customer reviews
- Amazon Best Sellers Rank: #1,138,226 in Books
- #125 in Books > Computers & Technology > Computer Science > AI & Machine Learning > Natural Language Processing
- #204 in Books > Computers & Technology > Computer Science > AI & Machine Learning > Neural Networks
- #218 in Books > Computers & Technology > Computer Science > AI & Machine Learning > Computer Vision & Pattern Recognition
Pattern Recognition and Neural Networks 1st Edition
This book uses tools from statistical decision theory and computational learning theory to create a rigorous foundation for the theory of neural networks. On the theoretical side, Pattern Recognition and Neural Networks emphasizes probability and statistics: almost all the results are given with proofs, many of them original. On the application side, the emphasis is on pattern recognition, and most of the examples come from real-world problems. In addition to the more common types of networks, the book has chapters on decision trees and belief networks from the machine-learning field. It is intended for use in graduate courses in statistics and engineering; a strong background in statistics is needed to fully appreciate the theoretical developments and proofs, but undergraduate-level linear algebra, calculus, and probability are sufficient to follow the book.
"...an excellent text on the statistics of pattern classifiers and the application of neural network techniques...Ripley has managed...to produce an altogether accessible text...[it] will be rightly popular with newcomers to the area for its ability to present the mathematics of statistical pattern recognition and neural networks in an accessible format and engaging style." Nature
"...a valuable reference for engineers and science researchers." Optics & Photonics News
"The combination of theory and examples makes this a unique and interesting book." International Statistical Institute Journal
Top customer reviews
...from a strong statistical basis. For anyone wanting a quick way to access the broad spectrum of literature covering neural networks and find the seminal papers, thoughts, and developments of the field, the literature references are worth the price. This is essentially a literature survey, not a "how-to" book. It is not excessively heavy on the mathematics, but it uses verbiage to enhance the math that is necessary for such a topic. It handles a number of significant but often overlooked issues, such as the need for an ordering scheme for the partial derivatives in backpropagation. Most authors don't address the obscure but important points that will make or break your work if you aren't aware of them. Ripley makes the reader cognizant of where the minefields lie. This book is a Rosetta stone into...
One thing did surprise me: there is one page with color, describing clustering (I think). I almost died laughing. I showed it to other stat friends familiar with Ripley and we chuckled.
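The reviewer's point about ordering the partial derivatives is, in modern terms, reverse-mode differentiation: each layer's gradient must be computed after, and from, the gradient of the layer above it. Below is a minimal NumPy sketch of that ordering for a small two-layer network; the shapes, names, and data are illustrative assumptions, not code from the book.

```python
# Minimal sketch (not from the book): reverse-mode ordering in backpropagation.
# Gradients for a two-layer network are accumulated output-to-input.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))          # 8 samples, 3 features (assumed shapes)
y = rng.integers(0, 2, size=(8, 1)).astype(float)

W1, b1 = rng.normal(size=(3, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Forward pass, saving the intermediates the backward pass will need.
h = np.tanh(X @ W1 + b1)             # hidden activations
p = sigmoid(h @ W2 + b2)             # predicted probabilities
loss = -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

# Backward pass: partial derivatives are taken in the reverse order of the
# forward computation, so each gradient reuses the one computed just before it.
d_logits = (p - y) / len(X)          # dL/d(pre-sigmoid output)
dW2 = h.T @ d_logits
db2 = d_logits.sum(axis=0)
d_h = d_logits @ W2.T                # propagate back through W2 first...
d_pre1 = d_h * (1 - h ** 2)          # ...then through the tanh nonlinearity
dW1 = X.T @ d_pre1
db1 = d_pre1.sum(axis=0)

print(loss, dW1.shape, dW2.shape)
```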
However, a statistical theory of nonlinear classification algorithms shows that these methods have nice properties and a sound mathematical justification. Statistical pattern recognition research is well over 30 years old and very well established, so these connections are important for putting neural networks on firm ground and providing greater acceptability from the statistical as well as the engineering community.
Ripley provides a theoretical treatment of the state of the art in statistical pattern recognition. His treatment is thorough, covering all the important developments, and he provides a large bibliography and a nice glossary of terms in the back of the book.
Recent papers on neural networks and data mining are often quick to generate results but not very good at providing useful validation techniques that show that perceived performance is not just an artifact of overfitting a model. This is an area where statisticians play a very important role, as they are keenly aware, through their experience with regression modeling and prediction, of the crucial need for cross-validation. Ripley covers this quite clearly in Section 2.6, titled "How complex a model do we need?"
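As a concrete illustration of the kind of check the reviewer describes (my own sketch, not an example from the book), here is a small snippet that uses 5-fold cross-validation to compare models of increasing complexity; the synthetic data, the polynomial-degree knob, and the scikit-learn pipeline are all assumptions made for the example.

```python
# Minimal sketch: held-out folds answer "how complex a model do we need?"
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(1)
x = rng.uniform(-3, 3, size=(60, 1))
y = np.sin(x).ravel() + rng.normal(scale=0.3, size=60)   # noisy nonlinear target

for degree in (1, 3, 9, 15):
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    # Mean squared error is estimated on held-out folds, not on the training
    # data, so apparent gains from overfitting do not inflate the score.
    mse = -cross_val_score(model, x, y, cv=5,
                           scoring="neg_mean_squared_error").mean()
    print(f"degree {degree:2d}: cross-validated MSE = {mse:.3f}")
```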
It is nice to see the thoroughness of this work. For example, in error rate estimation, many know of the advances of Lachenbruch and Mickey in discriminant analysis and the further advances of Efron and others with the bootstrap. But in between there was also significant progress by Glick on smooth estimators. This work has been overlooked by many statisticians, probably because some of it appears in the engineering literature (although one important paper was in the Journal of the American Statistical Association [JASA] in 1972). To some extent the oversight may be due to the fact that it was not mentioned in Efron's famous 1983 JASA paper and hence is usually missed in the bootstrap literature. Bootstrap methods and cross-validation play a prominent role in this text.
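To make the bootstrap error rate idea concrete (again a sketch on synthetic data, not the book's code), the snippet below contrasts the optimistic resubstitution error of a linear discriminant classifier with a leave-one-out bootstrap estimate and Efron's .632 combination; the classifier, sample sizes, and number of resamples are arbitrary choices for illustration.

```python
# Minimal sketch: bootstrap estimation of a classifier's error rate,
# contrasted with the downward-biased resubstitution estimate.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(2)
n = 100
X = np.vstack([rng.normal(0, 1, (n // 2, 2)), rng.normal(1.5, 1, (n // 2, 2))])
y = np.repeat([0, 1], n // 2)

clf = LinearDiscriminantAnalysis().fit(X, y)
apparent = np.mean(clf.predict(X) != y)      # resubstitution (apparent) error

oob_errors = []
for _ in range(200):                         # bootstrap resamples
    idx = rng.integers(0, n, n)
    oob = np.setdiff1d(np.arange(n), idx)    # cases left out of this resample
    if oob.size == 0:
        continue
    boot_clf = LinearDiscriminantAnalysis().fit(X[idx], y[idx])
    oob_errors.append(np.mean(boot_clf.predict(X[oob]) != y[oob]))

e0 = np.mean(oob_errors)                     # leave-one-out bootstrap error
err_632 = 0.368 * apparent + 0.632 * e0      # Efron's .632 estimator
print(f"apparent {apparent:.3f}  bootstrap {e0:.3f}  .632 estimate {err_632:.3f}")
```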
This is an excellent reference book for anyone seriously interested in pattern recognition research. For applied and theoretical statisticians who want a good account of the theory behind neural networks it is a must.
5 stars for the seller too. Early delivery. Book in great shape, as promised.