- Paperback: 406 pages
- Publisher: Chapman and Hall/CRC; 1 edition (April 1, 2009)
- Language: English
- ISBN-10: 1420067184
- ISBN-13: 978-1420067187
- Product Dimensions: 9.3 x 6.3 x 0.9 inches
- Shipping Weight: 1.6 pounds
- Average Customer Review: 4.0 out of 5 stars (41 customer reviews)
- Amazon Best Sellers Rank: #1,041,062 in Books
Machine Learning: An Algorithmic Perspective (Chapman & Hall/Crc Machine Learning & Pattern Recognition) 1st Edition
… liberally illustrated with many programming examples, using Python. It includes a basic primer on Python and has an accompanying website.
It has excellent breadth, and is comprehensive in terms of the topics it covers, both in terms of methods and in terms of concepts and theory. …
I think the author has succeeded in his aim: the book provides an accessible introduction to machine learning. It would be excellent as a first exposure to the subject, and would put the various ideas in context …
This book also includes the first occurrence I have seen in print of a reference to a zettabyte of data (10^21 bytes) ― a reference to "all the world's computers" being estimated to contain almost a zettabyte by 2010.
―David J. Hand, International Statistical Review (2010), 78
If you are interested in learning enough AI to understand the sort of new techniques being introduced into Web 2 applications, then this is a good place to start. … it covers the subject matter of many an introductory course on AI and it has references to the source material and further reading but it is written in a fairly casual style. Overall it works and much of the mathematics is explained in ways that make it fairly clear what is going on … . This is a suitable introduction to AI if you are studying the subject on your own and it would make a good course text for an introduction and overview of AI.
―I-Programmer, November 2009
About the Author
Massey University, Palmerston North, New Zealand
Top Customer Reviews
Pros: Unlike the authors of too many technical books, this author is not trying to display his brilliance. This is a nuts-and-bolts book - not a cookbook, but not a Springer-style book of axioms, either. Reading this book reminded me of school. There seem to be two kinds of professors: one kind tends to prefer examples and problems that display some first principle or other basic and fundamental issue. These professors are good at teaching new concepts, but sometimes fail at teaching the practicalities. The other kind of professor favors practical problems and how-do-I-do-this-for-real issues. This book sits firmly in the latter camp.
Cons: As mentioned above, this book is not a cookbook, and yet it is also not rigorous. For such a practical book, I would have wanted more pseudocode and algorithms. For example, near the end of the book, the author goes over Kalman filters and particle filters. He gives one algorithm for a Kalman filter (with no treatment of the different kinds and uses of Kalman filters), and only a slight description of a particle filter (and no pseudocode). To be fair, whole volumes have been written on the subject. But I was left wanting more.
This book is a good introduction to how to use various aspects and techniques of machine learning. If you are looking for mathematical rigour, look elsewhere. If you are looking for cookbook-style algorithms, use this book as a supplement. If you want a practical overview of machine learning methods before setting out on your course, buy this book.
Recommended, but with qualification.
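To the reviewer's point about the Kalman filter coverage: the core update really does fit in a few lines. Here is a minimal one-dimensional Kalman filter sketch - not the book's code; the constant-signal model, variable names, and noise variances are my own illustrative assumptions:

```python
# Minimal 1-D Kalman filter tracking a constant signal from noisy readings.
# Illustrative sketch only; real applications need a proper state-space model.

def kalman_1d(measurements, process_var=1e-4, measurement_var=0.25):
    """Return the running state estimates for a scalar state."""
    x, p = 0.0, 1.0                      # initial estimate and its variance
    estimates = []
    for z in measurements:
        p += process_var                 # predict: uncertainty grows
        k = p / (p + measurement_var)    # Kalman gain
        x += k * (z - x)                 # update: move toward the measurement
        p *= (1 - k)                     # uncertainty shrinks after the update
        estimates.append(x)
    return estimates

noisy = [0.9, 1.1, 1.0, 0.95, 1.05, 1.02]
print(kalman_1d(noisy)[-1])   # converges toward the true value near 1.0
```

The gain `k` is what distinguishes this from a simple moving average: it weighs each new measurement by how uncertain the current estimate is.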
Like the title says, this book takes an algorithmic approach to teaching machine learning - as opposed to an applied or example-based approach. The expectation is that you get a tutorial on all the main algorithms rather than on how to put various algorithms together to solve a particular problem in, say, fraud detection.
The contents reveal the algorithmic basis:
1. Introduction (types of machine learning, why you would want to do it in the first place and a quick introduction to supervised learning)
2. Preliminaries (key ideas about the problem of overfitting and what I consider the most important topic: how to test and know when you have a program that has learned something other than the noise). Here the author also covers some ideas about the role of probability. Calling it "turning data into probabilities" is a bit odd, but that's really what we do. Early on he gets the key ideas of the ROC curve out of the way - something many texts just gloss over.
I think the secret to understanding machine learning is understanding the idea behind the bias-variance trade-off (it is also handled very well in The Elements of Statistical Learning: Data Mining, Inference, and Prediction, Second Edition (Springer Series in Statistics), which I used to teach a class and read before I read this book).
3. Coverage of Artificial Neural Networks starting with the perceptron and why you would want to go beyond linear discriminators
4. The multilayer ANN
5. Radial Basis Functions and Splines - this is interesting because Andrew Ng presents linear regression as the most basic learning algorithm in his Coursera course, which means all of the fitting methods, even when not used for classification, are relevant.
6. A section on dimensionality reduction - feature selection and other methods like PCA and even factor analysis (most people stop with PCA, which I personally think is a mistake, because you can accidentally end up keeping the features with all the noise and throwing out the meaningful linear combinations).
7. This is a cool section not seen in basic books on probabilistic methods - sure, everyone teaches k-NN, but this one has a nice discussion of Gaussian mixture models.
8. Talks about the support vector machine. Most people don't get introduced to the idea that ANNs and SVMs are actually very similar - they are both large-margin classifiers, so knowing something about SVMs will help even if you end up with some other large-margin classifier (with or without kernels).
9. This section talks about search and optimization. The way Ng teaches machine learning, you always begin with the error surface, take the derivative, and then search for a minimum. You quit training when you have minimized the error on the training set without driving the error too high on the validation set - so in a way all these approaches are optimization methods.
10. A whole section on genetic algorithms (which I jumped to first) - a very clear explanation and a good example that really ran, so I could see what was going on.
11. Reinforcement learning
12. Learning with trees - the chapter ends with CART trees, something everyone working in this area should know something about. He saves random forests for the next section (where I suppose they really belong).
13. This section is on bagging and boosting, and then presents the idea of a collection of weak learners (like stubby random trees) as a powerful tool - the idea behind random forests.
14. Unsupervised learning. People tend to focus on supervised learning for a very good reason, but there are lots of examples where the cost of putting a label on a data example is too high, so an unsupervised method is a good call.
15. Coverage of Markov chain Monte Carlo methods (MCMC) - again, this does not get covered in every applied book.
16. Graphical models - Bayesian networks and probabilistic network models, along with hidden Markov models
17. Deep belief networks
18. Gaussian process regression and classification
The book concludes with an appendix on Python - getting started, etc. I don't think this is quite enough Python unless you are already pretty familiar with the language.
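The genetic-algorithm chapter praised above as "a good example that really ran" is easy to appreciate once you see how little code a working GA needs. Here is my own minimal sketch for the classic one-max problem (maximize the number of 1-bits in a string), assuming tournament selection, single-point crossover, and bit-flip mutation - not the book's example:

```python
import random

random.seed(0)                  # fixed seed so the run is reproducible

def one_max(bits):              # fitness: count of 1-bits
    return sum(bits)

def tournament(pop, k=3):       # pick the fittest of k random individuals
    return max(random.sample(pop, k), key=one_max)

def crossover(a, b):            # single-point crossover of two parents
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(bits, rate=0.02):    # flip each bit with small probability
    return [b ^ 1 if random.random() < rate else b for b in bits]

def evolve(n_bits=20, pop_size=30, generations=40):
    pop = [[random.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop = [mutate(crossover(tournament(pop), tournament(pop)))
               for _ in range(pop_size)]
    return max(pop, key=one_max)

best = evolve()
print(one_max(best))   # close to the all-ones optimum of 20
```

Watching the fitness climb generation by generation is exactly the "see what is going on" experience the reviewer describes.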
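The chapter 13 idea - that a committee of weak learners can outperform any single one - can also be sketched compactly. This is my own toy illustration of bagging with decision stumps (bootstrap resamples plus majority voting), not code from the book:

```python
import random

random.seed(1)

# Toy 1-D data: the true label is 1 when x > 5; one label is noisy.
data = [(x, int(x > 5)) for x in range(11)]
data[3] = (3, 1)                          # inject label noise at x = 3

def train_stump(sample):
    """Weak learner: the threshold with the fewest errors on the sample."""
    best_t, best_err = 0, len(sample) + 1
    for t in range(11):
        err = sum(int(x > t) != y for x, y in sample)
        if err < best_err:
            best_t, best_err = t, err
    return best_t

def bagged_predict(stumps, x):
    votes = sum(int(x > t) for t in stumps)
    return int(votes * 2 > len(stumps))   # majority vote of all stumps

# Bagging: train each stump on a bootstrap resample of the data.
stumps = [train_stump([random.choice(data) for _ in data])
          for _ in range(25)]
print([bagged_predict(stumps, x) for x in [1, 4, 8]])
```

Individual stumps trained on noisy resamples disagree about the threshold, but the majority vote recovers the underlying boundary - the intuition behind random forests.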
A critic of my first review suggested that I just bashed R and didn't talk about the book - not a completely unfair statement. R keeps data in data frames, while Python is much more list- and dictionary-based. Data frames and collections are related, and there are ways to do list comprehension in both languages; Python also has a data frame package, called Pandas, that makes R-like constructs easier if you happen to be coming from R and like them. Both are good languages, but I will stand by my original statement that R is a statistical language at its core. Many of the packages are written in C, so they are fast (like the ones written for Python).

It has been my experience that the open source tools for R development are just what the commentator said: they are adequate. In my humble opinion, RStudio has a lot of catching up to do to be as good as professional tools like the JetBrains stuff (PyCharm). Look at MATLAB compared to Octave. At least the community version of PyCharm is free. RStudio is not fast, and the dirty secret of R is that everything you do has to fit in memory at once, so you have to be very careful with memory management or you will run out and R will crash - it happens to me every day. Almost all of the ML methods are statistically based, so R and all the books (like An Introduction to Statistical Learning: with Applications in R (Springer Texts in Statistics)) are totally brilliant. But if you want to see what is under the hood, I suggest you look at Advanced R (Chapman & Hall/CRC The R Series). This will give you a deep dive on the internals. Compare it to Python Scripting for Computational Science (Texts in Computational Science and Engineering) and make the call for yourself.
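The R-versus-Python data-frame comparison above is easy to make concrete. A sketch using only the standard library - in practice a `pandas.DataFrame` gives you the same idea with R-like operations, but the plain list-of-dicts form shows what the reviewer means by Python being list- and dictionary-based:

```python
# A data frame, R-style, is roughly a list of records in plain Python.
rows = [
    {"name": "a", "x": 1, "y": 10},
    {"name": "b", "x": 2, "y": 20},
    {"name": "c", "x": 3, "y": 30},
]

# Column selection and row filtering via list comprehensions:
xs = [r["x"] for r in rows]               # roughly df$x in R
big = [r for r in rows if r["y"] > 15]    # roughly subset(df, y > 15)

print(xs)          # [1, 2, 3]
print(len(big))    # 2
```

Because records are ordinary objects, memory is managed per row, which is part of why Python sidesteps the whole-dataset-in-one-frame pressure the review describes in R.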
I have used both R and Python, both for prototyping advanced algorithms and for putting code live in production. What tipped the scale for me was productivity. Now that the data science community has started building state-of-the-art tools for Python (not to say anything negative about the statistics community, who put all of machine learning on a solid footing), I prefer a rapid development language with good tools, test-first frameworks, and solid software engineering practices as part of the culture. The book reviewed here lets you learn almost all of the algorithms used for machine learning, and in the end you will be able to produce fast, readable, testable code.