Ensemble Methods in Data Mining: Improving Accuracy Through Combining Predictions (Synthesis Lectures on Data Mining and Knowledge Discovery) Paperback


Formats and editions
  • Kindle
  • Paperback: $31.50 (new from $29.21; used from $24.50)

Frequently Bought Together

  • Ensemble Methods in Data Mining: Improving Accuracy Through Combining Predictions (Synthesis Lectures on Data Mining and Knowledge Discovery)
  • Ensemble Methods: Foundations and Algorithms (Chapman & Hall/CRC Machine Learning & Pattern Recognition)
  • Machine Learning: A Probabilistic Perspective (Adaptive Computation and Machine Learning series)

Price for all three: $186.50


Product Details

  • Series: Synthesis Lectures on Data Mining and Knowledge Discovery
  • Paperback: 126 pages
  • Publisher: Morgan and Claypool Publishers (February 24, 2010)
  • Language: English
  • ISBN-10: 1608452840
  • ISBN-13: 978-1608452842
  • Product Dimensions: 7.5 x 9.2 x 0.3 inches
  • Shipping Weight: 8.5 ounces
  • Average Customer Review: 4.9 out of 5 stars (7 customer reviews)
  • Amazon Best Sellers Rank: #503,189 in Books

Editorial Reviews

From the Inside Flap

"This book by Seni and Elder provides a timely, concise introduction to this topic. After an intuitive, highly accessible sketch of the key concerns in predictive learning, the book takes the readers through a shortcut into the heart of the popular tree-based ensemble creation strategies, and follows that with a compact yet clear presentation of the developments in the frontiers of statistics, where active attempts are being made to explain and exploit the mysteries of ensembles through conventional statistical theory and methods." 
-- Tin Kam Ho, Bell Labs, Alcatel-Lucent

"The practical implementations of ensemble methods are enormous. Most current implementations of them are quite primitive and this book will definitely raise the state of the art. Giovanni Seni's thorough mastery of the cutting-edge research and John Elder's practical experience have combined to make an extremely readable and useful book." 
-- Jaffray Woodriff, Quantitative Investment Management

About the Author

The authors are industry experts in data mining and machine learning who are also adjunct professors and popular speakers. Although early pioneers in discovering and using ensembles, they here distill and clarify the recent groundbreaking work of leading academics (such as Jerome Friedman) to bring the benefits of ensembles to practitioners.


Customer Reviews

4.9 out of 5 stars
  • 5 star: 6
  • 4 star: 1
  • 3 star: 0
  • 2 star: 0
  • 1 star: 0

Review highlights:
  • "This relatively short book is very well organized." (Moni Neradilek)
  • "It is a bit light on how to tell if the final product is really working." (Yun Liu)
  • "I am enjoying reading it so far, and I highly recommend it." (Amazon Customer)

Most Helpful Customer Reviews

8 of 9 people found the following review helpful By Yun Liu on June 11, 2010
Format: Paperback
During my 10-plus years of modeling experience, I have always paid most of my attention to variable selection, predictive power, and the effectiveness and efficiency of a single model form such as a logistic model, ordinary regression, a tree, etc. From time to time, I also segment my sample space into pieces and then apply different modeling techniques. I was never really aware of the concept of 'model selection' or 'model combination'. That classical approach has served me well, but I always suspected there was a better approach: combining different methods to get better predictions.
The ensemble methods detailed in this book gave me the 'aha'. It strikes a nice balance between easy implementation and difficult concepts. I really enjoyed the book. I was able to finish it quickly and will keep it for reference.
If there is anything I would want to see in more detail, it is the treatment of evaluating model predictions. It is a bit light on how to tell if the final product is really working. Given that the book is an intro, that is not really a fault.
Overall, an awesome small book.
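
The reviewer's point about combining different model forms can be made concrete with a small sketch (an illustration only, not code from the book or the review): average the predicted class probabilities of a logistic regression and a single classification tree, here on the kyphosis example data that ships with the rpart package.

    # Illustration only: average the class probabilities of two different model forms.
    library(rpart)
    data(kyphosis)   # small example data set that ships with rpart

    logit_fit <- glm(Kyphosis ~ Age + Number + Start,
                     family = binomial, data = kyphosis)
    tree_fit  <- rpart(Kyphosis ~ Age + Number + Start,
                       data = kyphosis, method = "class")

    p_logit <- predict(logit_fit, type = "response")            # P(present) from the logistic model
    p_tree  <- predict(tree_fit,  type = "prob")[, "present"]   # P(present) from the tree
    p_combined <- (p_logit + p_tree) / 2                         # simple two-model average
    head(p_combined)

Even a naive average like this often improves on either model alone; the book's ensemble methods are far more systematic ways of generating and weighting the component models.
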
6 of 7 people found the following review helpful By Nina Zumel on July 31, 2011
Format: Paperback
This book is an accessible introduction to the theory and practice of ensemble methods in machine learning. It is a quick read, has sufficient detail for a novice to begin experimenting, and copious references for those who are interested in digging deeper. The authors also provide a nice discussion of cross-validation, and their section on regularization techniques is much more straightforward, in my opinion, than the equivalent sections in The Elements of Statistical Learning (Elements is a wonderful, necessary book, but a hard read).

The heart of the text is the chapter on Importance Sampling. The authors frame the classic ensemble methods (bagging, boosting, and random forests) as special cases of the Importance Sampling methodology. This not only clarifies the explanations of each approach, but also provides a principled basis for finding improvements to the original algorithms. They have one of the clearest descriptions of AdaBoost that I've ever read.
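
As a rough idea of what the simplest of these ensembles looks like in code, here is a minimal bagging sketch (an illustration only, not the authors' code): fit B trees to bootstrap samples and average their class-probability predictions, again using rpart's built-in kyphosis data.

    # Minimal bagging sketch: B trees fit to bootstrap samples, predictions averaged.
    library(rpart)
    data(kyphosis)

    B <- 25
    n <- nrow(kyphosis)
    probs <- replicate(B, {
      boot <- kyphosis[sample(n, n, replace = TRUE), ]               # bootstrap sample
      fit  <- rpart(Kyphosis ~ Age + Number + Start,
                    data = boot, method = "class")
      predict(fit, newdata = kyphosis, type = "prob")[, "present"]   # this tree's P(present)
    })
    bagged_prob <- rowMeans(probs)   # ensemble average over the B trees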

The penultimate chapter is on "Rule Ensembles": an attempt at a more interpretable ensemble learner. They also discuss measures for variable importance and interaction strength. The last chapter discusses Generalized Degrees of Freedom as an alternative complexity measure; it is probably of more interest to researchers and mathematicians than to practitioners.

Overall, I found the book clear and concise, with good attention to practical details. I appreciated the snippets of R code and the references to relevant R packages. One minor nitpick: this book has also been published digitally, presumably with color figures. Because the print version is grayscale, some of the color-coded graphs are now illegible. Usually the major points of the figure are clear from the context in the text; still, the color to grayscale conversion is something for future authors in this series to keep in mind.

Recommended.
6 of 7 people found the following review helpful By Amazon Customer on October 30, 2010
Format: Paperback Verified Purchase
This is a really great (short) book, in my opinion. It contains the best "need to know" information found in The Elements of Statistical Learning and other good books on data mining. The included R code is a big bonus. I am enjoying reading it so far, and I highly recommend it. The only thing that frustrates me is that the online version on the publisher's website is in color, while the print version is not. This is the only reason I did not give it 5 stars. I saw the online version first and thought the print version would be in color as well; I was sadly mistaken. Many graphics in this book reference specific colors, and they just look really crappy in grayscale. If you are familiar with The Elements of Statistical Learning, imagine printing it out in grayscale and you will know what I mean.
3 of 3 people found the following review helpful By Dimitri Shvorob on December 12, 2010
Format: Paperback
For once, "Product Description" is specific and hype-free. (Apart from the claim regarding importance sampling - dealt with on a single page). This is a concise, to-the-point and accessible introduction to the subject, discussing bagging, random-forest and boosting methods, in classification context. Once these methods are explained, the authors move on to measures of variable importance and model complexity, which may be of less interest to practitioners. R snippets, leveraging rpart and gbm packages, are a plus, but the programming is fairly simple.

P.S. Morgan & Claypool sells the book's PDF for $20, or for free to those affiliated with the publisher's institutional subscribers.
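
For readers wondering what such rpart/gbm snippets might look like, here is a rough sketch on rpart's built-in kyphosis data (an illustration only, not the book's code; note that gbm's bernoulli loss expects a 0/1 response rather than a factor):

    # Sketch of rpart / gbm usage (illustration only, not the book's snippets).
    library(rpart)
    library(gbm)
    data(kyphosis)

    # a single classification tree
    tree_fit <- rpart(Kyphosis ~ Age + Number + Start,
                      data = kyphosis, method = "class")

    # gradient boosted trees; recode the factor response as 0/1 for bernoulli loss
    kyph <- transform(kyphosis, y = as.numeric(Kyphosis == "present"))
    gbm_fit <- gbm(y ~ Age + Number + Start, data = kyph,
                   distribution = "bernoulli",
                   n.trees = 500, interaction.depth = 2, shrinkage = 0.05)

    summary(gbm_fit)   # relative influence of each predictor

summary(gbm_fit) reports each predictor's relative influence, which is the kind of variable-importance measure the review refers to.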
