

Information Theory, Inference and Learning Algorithms 1st Edition

24 customer reviews
ISBN-13: 978-0521642989
ISBN-10: 0521642981
Why is ISBN important?
This bar-code number lets you verify that you're getting exactly the right version or edition of a book. The 13-digit and 10-digit formats both work.
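As a concrete sketch of that verification, the two check-digit rules (ISBN-10 weights digits 10 down to 1 and requires a sum divisible by 11; ISBN-13 alternates weights 1 and 3 and requires a sum divisible by 10) can be written in a few lines of Python. The function names here are illustrative, not part of any Amazon tool:

```python
def isbn10_is_valid(isbn):
    """ISBN-10: weighted sum (weights 10..1, 'X' meaning 10) divisible by 11."""
    digits = [10 if c == 'X' else int(c) for c in isbn if c not in '- ']
    if len(digits) != 10:
        return False
    return sum(w * d for w, d in zip(range(10, 0, -1), digits)) % 11 == 0

def isbn13_is_valid(isbn):
    """ISBN-13: digits weighted alternately 1, 3 sum to a multiple of 10."""
    digits = [int(c) for c in isbn if c.isdigit()]
    if len(digits) != 13:
        return False
    return sum(d * (1 if i % 2 == 0 else 3) for i, d in enumerate(digits)) % 10 == 0

print(isbn10_is_valid("0521642981"))      # True
print(isbn13_is_valid("978-0521642989"))  # True
```

Both identifiers for this edition pass their respective checks, which is how the two formats can be cross-verified against each other.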
Rent: $34.40
Buy used: $53.41
Buy new: $76.76
More Buying Choices
38 New from $63.60 29 Used from $49.42
Free Two-Day Shipping for College Students with Amazon Student

Take an Extra 30% Off Any Book
$76.76 FREE Shipping. Only 18 left in stock (more on the way). Gift-wrap available.

Frequently Bought Together

  • Information Theory, Inference and Learning Algorithms
  • Machine Learning: A Probabilistic Perspective (Adaptive Computation and Machine Learning series)
  • Pattern Recognition and Machine Learning (Information Science and Statistics)
Total price: $242.66
Buy the selected items together

Special Offers and Product Promotions

  • Take an Extra 30% Off Any Book: Use promo code HOLIDAY30 at checkout to get an extra 30% off any book for a limited time. Excludes Kindle eBooks and Audible Audiobooks. Restrictions apply.

Editorial Reviews


"...a valuable reference...enjoyable and highly useful."
American Scientist

"...impressive book, intended as a class text on the subject of the title but having the character and robustness of a focused encyclopedia. The presentation is finely detailed, well documented, and stocked with artistic flourishes."
Mathematical Reviews

"Essential reading for students of electrical engineering and computer science; also a great heads-up for mathematics students concerning the subtlety of many commonsense questions."

"An utterly original book that shows the connections between such disparate fields as information theory and coding, inference, and statistical physics."
Dave Forney, Massachusetts Institute of Technology

"This is an extraordinary and important book, generous with insight and rich with detail in statistics, information theory, and probabilistic modeling across a wide swathe of standard, creatively original, and delightfully quirky topics. David MacKay is an uncompromisingly lucid thinker, from whom students, faculty and practitioners all can learn."
Peter Dayan and Zoubin Ghahramani, Gatsby Computational Neuroscience Unit, University College, London

"An instant classic, covering everything from Shannon's fundamental theorems to the postmodern theory of LDPC codes. You'll want two copies of this astonishing book, one for the office and one for the fireside at home."
Bob McEliece, California Institute of Technology

"An excellent textbook in the areas of information theory, Bayesian inference and learning algorithms. Undergraduate and postgraduate students will find it extremely useful for gaining insight into these topics."

"Most of the theory is accompanied by motivation and explanation, with corresponding examples...the book achieves its goal of being a good textbook on information theory."

Book Description

Information theory and inference, often taught separately, are here united in one entertaining textbook. These topics lie at the heart of many exciting areas of contemporary science and engineering - communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics, and cryptography.

This textbook introduces theory in tandem with applications. Information theory is taught alongside practical communication systems, such as arithmetic coding for data compression and sparse-graph codes for error-correction. A toolbox of inference techniques, including message-passing algorithms, Monte Carlo methods, and variational approximations, is developed alongside applications of these tools to clustering, convolutional codes, independent component analysis, and neural networks.

The final part of the book describes the state of the art in error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes - the twenty-first century standards for satellite communications, disk drives, and data broadcast. Richly illustrated, filled with worked examples and over 400 exercises, some with detailed solutions, David MacKay's groundbreaking book is ideal for self-learning and for undergraduate or graduate courses. Interludes on crosswords, evolution, and sex provide entertainment along the way. In sum, this is a textbook on information, communication, and coding for a new generation of students, and an unparalleled entry point into these subjects for professionals in areas as diverse as computational biology, financial engineering, and machine learning.
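As a taste of the compression side of the description: any lossless code, arithmetic coding included, needs at least the Shannon entropy of the source, in bits per symbol, on average. A minimal sketch of that bound, computed from an empirical symbol distribution (the function name is ours, not code from the book):

```python
from math import log2
from collections import Counter

def entropy(text):
    """Shannon entropy (bits per symbol) of the empirical symbol
    distribution of `text` -- the per-symbol lower bound that any
    lossless compressor can only approach, never beat on average."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * log2(c / n) for c in counts.values())

print(round(entropy("abab"), 3))  # 1.0  (two equiprobable symbols)
print(round(entropy("aaab"), 3))  # 0.811 (skewed source: compressible below 1 bit/symbol)
```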


Product Details

  • Paperback: 640 pages
  • Publisher: Cambridge University Press; 1 edition (October 6, 2003)
  • Language: English
  • ISBN-10: 0521642981
  • ISBN-13: 978-0521642989
  • Product Dimensions: 7.4 x 1.3 x 9.7 inches
  • Shipping Weight: 3.3 pounds (View shipping rates and policies)
  • Average Customer Review: 4.3 out of 5 stars  See all reviews (24 customer reviews)
  • Amazon Best Sellers Rank: #318,545 in Books (See Top 100 in Books)

More About the Author

David MacKay is a professor in the Department of Physics at Cambridge University, a Fellow of the Royal Society, and Chief Scientific Advisor to the Department of Energy and Climate Change, UK.

Customer Reviews

Most Helpful Customer Reviews

61 of 63 people found the following review helpful By Alexander C. Zorach on October 2, 2007
Format: Paperback
I find it interesting that most of the people reviewing this book seem to be reviewing it as they would any other information theory textbook. Such a review, whether positive or critical, could not hope to give a complete picture of what this text actually is. There are many books on information theory, but what makes this book unique (and in my opinion what makes it so outstanding) is the way it integrates information theory with statistical inference. The book covers topics including coding theory, Bayesian inference, and neural networks, but it treats them all as different pieces of a unified puzzle, focusing more on the connections between these areas, and the philosophical implications of these connections, and less on delving into depth in one area or another.

This is a learning text, clearly meant to be read and understood. The presentation of topics is greatly expanded and includes much discussion, and although the book is dense, it is rarely concise. The exercises are absolutely essential to understanding the text. Although the author has made some effort to make certain chapters or topics independent, I think that this is one book for which it is best to more or less work straight through. For this reason and others, this book does not make a very good reference: occasionally nonstandard notation or terminology is used.

The biggest strength of this text, in my opinion, is on a philosophical level. It is my opinion, and in my opinion it is a great shame, that the vast majority of statistical theory and practice is highly arbitrary. This book will provide some tools to (at least in some cases) anchor your thinking to something less arbitrary.
38 of 42 people found the following review helpful By Iain on February 19, 2005
Format: Paperback
I am a PhD student in computer science. Over the last year and a half this book has been invaluable (and parts of it a fun diversion).

For a course I help teach, the introductions to probability theory and information theory save a lot of work. They are accessible to students with a variety of backgrounds (they understand them and can read them online). They also lead directly into interesting problems.

While I am not directly studying data compression or error correcting codes, I found these sections compelling. Incredibly clear exposition; exciting challenges. How can we ever be certain of our data after bouncing it across the world and storing it on error-prone media (things I do every day)? How can we do it without >60 hard-disks sitting in our computer? The mathematics uses very clear notation --- functions are sketched when introduced, theorems are presented alongside pictures and explanations of what's really going on.
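The flavor of error correction the reviewer alludes to can be shown with the repetition code R3, which MacKay uses as his opening example: send each bit three times and decode by majority vote, so any single flipped bit per block is corrected. A minimal sketch with our own function names:

```python
def encode_r3(bits):
    """Repetition code R3: transmit every bit three times."""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode_r3(received):
    """Majority vote over each block of three received bits."""
    return [int(sum(received[i:i + 3]) >= 2) for i in range(0, len(received), 3)]

message = [1, 0, 1, 1, 0]
sent = encode_r3(message)
corrupted = list(sent)
corrupted[1] ^= 1   # flip one bit in the first block
corrupted[6] ^= 1   # and one in the third block
print(decode_r3(corrupted) == message)  # True: both errors corrected
```

The cost is a threefold rate penalty, which is exactly the trade-off the book's sparse-graph codes improve on.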

I should note that a small number (roughly 4 or 5 out of 50) of the chapters on advanced topics are much more terse than the majority of the book. They might not be of interest to all readers, but if they are, they are probably more friendly than finding a journal paper on the same topic.

Most importantly for me, the book is a valuable reference for Bayesian methods, on which MacKay is an authority. Sections IV and V brought me up to speed with several advanced topics I need for my research.
27 of 29 people found the following review helpful By Rich Turner on February 28, 2005
Format: Paperback
Uniting information theory and inference in an interactive and entertaining way, this book has been a constant source of inspiration, intuition and insight for me. It is packed full of stuff - its contents appear to grow the more I look - but the layering of the material means the abundance of topics does not confuse.

This is _not_ just a book for the experts. However, you will need to think and interact when reading it. That is, after all, how you learn, and the book helps and guides you in this with many puzzles and problems.
17 of 18 people found the following review helpful By Edward Donahue on November 20, 2008
Format: Paperback
I am reviewing David MacKay's 'Information Theory, Inference, and Learning Algorithms', but I haven't yet read it completely. It will be years before I finish it, since it contains the material for several advanced undergraduate or graduate courses. However, it is already on my list of favorite texts and references. It is a book I will keep going back to time after time, but don't take my word for it. According to the back cover, Bob McEliece, the author of a 1977 classic on information theory, recommends you buy two copies, one for the office and one for home. There are topics in this book I am aching to find the time to read, work through, and learn.

It can be used as a textbook, a reference, or to fill in gaps in your knowledge of information theory and related material. MacKay outlines several courses for which it can be used, including his Cambridge course on Information Theory, Pattern Recognition and Neural Networks, a short course on information theory, and a course on Bayesian inference and machine learning. As a reference it covers topics not easily accessible in books, including a variety of modern codes (hash codes, low-density parity-check codes, digital fountain codes, and many others) and Bayesian inference techniques (maximum likelihood, Laplace's method, variational methods, and Monte Carlo methods). It has interesting applications, such as information theory applied to genes and evolution and to machine learning.
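As a flavor of the Monte Carlo methods mentioned above: the basic idea is to replace an integral by an average over random samples. A minimal, purely illustrative sketch (not MacKay's code) estimating pi by uniform sampling in the unit square:

```python
import random

def mc_pi(n, seed=0):
    """Monte Carlo estimate of pi: the fraction of uniform points in the
    unit square that land inside the quarter circle approximates pi/4."""
    rng = random.Random(seed)
    inside = sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0 for _ in range(n))
    return 4 * inside / n

# The error shrinks like 1/sqrt(n), the hallmark of Monte Carlo estimators.
print(abs(mc_pi(100_000) - 3.141592653589793) < 0.05)  # True
```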

It is well written, with good problems: some help you understand the theory, and others help you apply it. Many are worked as examples, and some are especially recommended. He works to keep your attention and interest, and knows how to do it. For example, chapter titles include 'Why Have Sex?' and 'Crosswords and Codebreaking'.
