Rent from RentU
  • List Price: $89.99; Save: $68.99 (77%)
  • Due Date: May 30, 2015
  • FREE return shipping at the end of the semester. Access codes and supplements are not guaranteed with rentals.
  • Condition: Used: Good
  • Seller comment: Fast shipping from Amazon! Qualifies for Prime Shipping and FREE standard shipping for orders over $35. Overnight, 2-day, and international shipping available! Excellent customer service. May not include supplements such as CD, access code, or DVD. Access codes and supplements are not guaranteed with used items.

Buy New
  • List Price: $89.99; Save: $15.15 (17%)
  • In Stock. Ships from and sold by Amazon.com. Gift-wrap available.

Information Theory, Inference and Learning Algorithms Hardcover – October 6, 2003

ISBN-13: 978-0521642989; ISBN-10: 0521642981

Buy New: $74.84  |  Rent: $21.00
43 New from $61.07  |  25 Used from $38.00

Format       Rent from Amazon   Price      New from   Used from
Hardcover    $21.00             $74.84     $61.07     $38.00
Paperback    --                 $129.90    --         --

Frequently Bought Together

Information Theory, Inference and Learning Algorithms + Elements of Information Theory 2nd Edition (Wiley Series in Telecommunications and Signal Processing) + Pattern Recognition and Machine Learning (Information Science and Statistics)
Price for all three: $227.30

Product Details

  • Hardcover: 640 pages
  • Publisher: Cambridge University Press (October 6, 2003)
  • Language: English
  • ISBN-10: 0521642981
  • ISBN-13: 978-0521642989
  • Product Dimensions: 7.4 x 1.3 x 9.7 inches
  • Shipping Weight: 3.3 pounds
  • Average Customer Review: 4.2 out of 5 stars (21 customer reviews)
  • Amazon Best Sellers Rank: #116,407 in Books

Editorial Reviews

Review

"...a valuable reference...enjoyable and highly useful."
American Scientist


"...an impressive book, intended as a class text on the subject of the title but having the character and robustness of a focused encyclopedia. The presentation is finely detailed, well documented, and stocked with artistic flourishes."
Mathematical Reviews


"Essential reading for students of electrical engineering and computer science; also a great heads-up for mathematics students concerning the subtlety of many commonsense questions."
Choice


"An utterly original book that shows the connections between such disparate fields as information theory and coding, inference, and statistical physics."
Dave Forney, Massachusetts Institute of Technology


"This is an extraordinary and important book, generous with insight and rich with detail in statistics, information theory, and probabilistic modeling across a wide swathe of standard, creatively original, and delightfully quirky topics. David MacKay is an uncompromisingly lucid thinker, from whom students, faculty and practitioners all can learn."
Peter Dayan and Zoubin Ghahramani, Gatsby Computational Neuroscience Unit, University College, London


"An instant classic, covering everything from Shannon's fundamental theorems to the postmodern theory of LDPC codes. You'll want two copies of this astonishing book, one for the office and one for the fireside at home."
Bob McEliece, California Institute of Technology


"An excellent textbook in the areas of infomation theory, Bayesian inference and learning alorithms. Undergraduate and post-graduate students will find it extremely useful for gaining insight into these topics."
REDNOVA


"Most of the theories are accompanied by motivations, and explanations with the corresponding examples...the book achieves its goal of being a good textbook on information theory."
ACM SIGACT News

Book Description

Information theory and inference, often taught separately, are here united in one entertaining textbook. These topics lie at the heart of many exciting areas of contemporary science and engineering - communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics, and cryptography. This textbook introduces theory in tandem with applications. Information theory is taught alongside practical communication systems, such as arithmetic coding for data compression and sparse-graph codes for error-correction. A toolbox of inference techniques, including message-passing algorithms, Monte Carlo methods, and variational approximations, is developed alongside applications of these tools to clustering, convolutional codes, independent component analysis, and neural networks. The final part of the book describes the state of the art in error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes -- the twenty-first-century standards for satellite communications, disk drives, and data broadcast. Richly illustrated, filled with worked examples and over 400 exercises, some with detailed solutions, David MacKay's groundbreaking book is ideal for self-learning and for undergraduate or graduate courses. Interludes on crosswords, evolution, and sex provide entertainment along the way. In sum, this is a textbook on information, communication, and coding for a new generation of students, and an unparalleled entry point into these subjects for professionals in areas as diverse as computational biology, financial engineering, and machine learning.
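
To make the central quantity concrete: the compression and coding results mentioned above all rest on Shannon entropy. The short Python sketch below is an editorial illustration, not code or notation taken from the book; the entropy helper and the example distributions are hypothetical. It computes the entropy of a discrete distribution in bits and shows why a biased source is more compressible than a fair one.

    # Minimal illustrative sketch (not from the book): Shannon entropy in bits.
    import math

    def entropy(probs):
        """H(X) = -sum_x p(x) log2 p(x) for a discrete distribution given as probabilities."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit per outcome
    print(entropy([0.9, 0.1]))   # biased coin: about 0.47 bits, so its output compresses well

A fair coin carries a full bit per flip, while the biased coin's output can in principle be compressed to about 0.47 bits per flip; this is the kind of limit that the book's treatment of arithmetic coding makes precise.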

More About the Author

David MacKay is a professor in the Department of Physics at Cambridge University, a Fellow of the Royal Society, and Chief Scientific Advisor to the Department of Energy and Climate Change, UK.

Customer Reviews

"Just to chime in that this is one of the best technical books I have ever read."
K. Josic

"Uniting information theory and inference in an interactive and entertaining way, this book has been a constant source of inspiration, intuition and insight for me."
Rich Turner

"It has interesting applications such as information theory applied to genes and evolution and to machine learning."
Edward Donahue

Most Helpful Customer Reviews

47 of 48 people found the following review helpful By Alexander C. Zorach on October 2, 2007
Format: Hardcover
I find it interesting that most of the people reviewing this book seem to be reviewing it as they would any other information theory textbook. Such a review, whether positive or critical, could not hope to give a complete picture of what this text actually is. There are many books on information theory, but what makes this book unique (and in my opinion what makes it so outstanding) is the way it integrates information theory with statistical inference. The book covers topics including coding theory, Bayesian inference, and neural networks, but it treats them all as different pieces of a unified puzzle, focusing more on the connections between these areas, and the philosophical implications of these connections, and less on delving into depth in one area or another.

This is a learning text, clearly meant to be read and understood. The presentation of topics is greatly expanded and includes much discussion, and although the book is dense, it is rarely concise. The exercises are absolutely essential to understanding the text. Although the author has made some effort to make certain chapters or topics independent, I think that this is one book for which it is best to more or less work straight through. For this reason and others, this book does not make a very good reference: occasionally nonstandard notation or terminology is used.

The biggest strength of this text, in my opinion, is on a philosophical level. It is my opinion, and in my opinion it is a great shame, that the vast majority of statistical theory and practice is highly arbitrary. This book will provide some tools to (at least in some cases) anchor your thinking to something less arbitrary.
30 of 34 people found the following review helpful By Iain on February 19, 2005
Format: Hardcover
I am a PhD student in computer science. Over the last year and a half this book has been invaluable (and parts of it a fun diversion).

For a course I help teach, the introductions to probability theory and information theory save a lot of work. They are accessible to students with a variety of backgrounds (they understand them and can read them online). They also lead directly into interesting problems.

While I am not directly studying data compression or error correcting codes, I found these sections compelling. Incredibly clear exposition; exciting challenges. How can we ever be certain of our data after bouncing it across the world and storing it on error-prone media (things I do every day)? How can we do it without >60 hard-disks sitting in our computer? The mathematics uses very clear notation --- functions are sketched when introduced, theorems are presented alongside pictures and explanations of what's really going on.

I should note that a small number (roughly 4 or 5 out of 50) of the chapters on advanced topics are much more terse than the majority of the book. They might not be of interest to all readers, but if they are, they are probably more friendly than finding a journal paper on the same topic.

Most importantly for me, the book is a valuable reference for Bayesian methods, on which MacKay is an authority. Sections IV and V brought me up to speed with several advanced topics I need for my research.
20 of 22 people found the following review helpful By Rich Turner on February 28, 2005
Format: Hardcover
Uniting information theory and inference in an interactive and entertaining way, this book has been a constant source of inspiration, intuition and insight for me. It is packed full of stuff - its contents appear to grow the more I look - but the layering of the material means the abundance of topics does not confuse.

This is _not_ just a book for the experts. However, you will need to think and interact when reading it. That is, after all, how you learn, and the book helps and guides you in this with many puzzles and problems.
10 of 11 people found the following review helpful By Edward Donahue on November 20, 2008
Format: Hardcover
I am reviewing David MacKay's `Information Theory, Inference, and Learning Algorithms', but I haven't yet read it completely. It will be years before I finish it, since it contains the material for several advanced undergraduate or graduate courses. However, it is already on my list of favorite texts and references. It is a book I will keep going back to time after time, but don't take my word for it. According to the back cover, Bob McEliece, the author of a 1977 classic on information theory, recommends you buy two copies, one for the office and one for home. There are topics in this book I am aching to find the time to read, work through, and learn.

It can be used as a textbook, a reference book, or to fill in gaps in your knowledge of information theory and related material. MacKay outlines several courses for which it can be used, including his Cambridge Course on Information Theory, Pattern Recognition and Neural Networks, a Short Course on Information Theory, and a Course on Bayesian Inference and Machine Learning. As a reference it covers topics not easily accessible in books, including a variety of modern codes (hash codes, low-density parity-check codes, digital fountain codes, and many others) and Bayesian inference techniques (maximum likelihood, Laplace's method, variational methods, and Monte Carlo methods). It has interesting applications such as information theory applied to genes and evolution and to machine learning.

It is well written, with good problems: some help you understand the theory, and others help you apply it. Many are worked as examples, and some are especially recommended. He works to keep your attention and interest, and knows how to do it. For example, chapter titles include `Why Have Sex' and `Crosswords and Codebreaking'.
