Price: $77.72 + Free shipping with Amazon Prime


Information Theory, Inference and Learning Algorithms › Customer Reviews

66 people found this helpful

By Alexander C. Zorach on October 2, 2007

I find it interesting that most of the people reviewing this book seem to be reviewing it as they would any other information theory textbook. Such a review, whether positive or critical, could not hope to give a complete picture of what this text actually is. There are many books on information theory, but what makes this book unique (and in my opinion what makes it so outstanding) is the way it integrates information theory with statistical inference. The book covers topics including coding theory, Bayesian inference, and neural networks, but it treats them all as different pieces of a unified puzzle, focusing more on the connections between these areas and the philosophical implications of these connections, and less on delving deeply into any one area.

This is a learning text, clearly meant to be read and understood. The presentation of topics is greatly expanded and includes much discussion, and although the book is dense, it is rarely concise. The exercises are absolutely essential to understanding the text. Although the author has made some effort to make certain chapters or topics independent, I think that this is one book for which it is best to more or less work straight through. For this reason and others, this book does not make a very good reference: occasionally nonstandard notation or terminology is used.

The biggest strength of this text, in my opinion, is on a philosophical level. It is my opinion, and a great shame, that the vast majority of statistical theory and practice is highly arbitrary. This book will provide some tools to (at least in some cases) anchor your thinking to something less arbitrary. It's ironic that much of this is done within the Bayesian paradigm, something often viewed (and criticized) as being more arbitrary, not less so. But MacKay's way of thinking is highly compelling. This is a book that will not just teach you subjects and techniques, but will shape the way you think. It is one of the rare books that is able to teach how, why, and when certain techniques are applicable. It prepares one to "think outside the box".

I would recommend this book to anyone studying any of the topics covered by this book, including information theory, coding theory, statistical inference, or neural networks. This book is especially indispensable to a statistician, as there is no other book that I have found that covers information theory with an eye towards its application in statistical inference so well. This book is outstanding for self-study; it would also make a good textbook for a course, provided the course followed the development of the textbook very closely.

9 people found this helpful

By Magic Mouse on November 1, 2014

This book starts with cute chapter-maps containing arrows describing the various directions a student might want to go depending on their interests. The book seems entertaining, with Dilbert cartoons used to illustrate the addition of noise, and the weighing of babies used to illustrate Shannon's channel coding theorem.

Each chapter contains a preface where the author tells you what exercises you should have done in order to be qualified to read it, and this is where I lost my patience with the book. It looks like a great self-study book, but after I spent a lot of time trying to follow the author's advice, I think the suggested exercises are too hard and the book doesn't contain enough preparation. Either you struggle with some excessively hard and time-consuming problems, or you just go to MacKay's solutions. There are many flattering reviews. I doubt the reviewers studied the book in the way it suggests. I found it much easier to study using Cover and Thomas's information theory book.

Another reason for my scepticism is this. The author makes available lectures online at "videolectures dot net" containing similar content to the book. However, the video lectures are simpler than this book. The lectures are given to undergraduates at Cambridge University. That David MacKay has to simplify the content even for these elite undergraduates accords with my guess that the book's suggested self-study routes are unrealistic.

By Iain on February 19, 2005

I am a PhD student in computer science. Over the last year and a half this book has been invaluable (and parts of it a fun diversion).

For a course I help teach, the introductions to probability theory and information theory save a lot of work. They are accessible to students with a variety of backgrounds (they understand them and can read them online). They also lead directly into interesting problems.

While I am not directly studying data compression or error correcting codes, I found these sections compelling. Incredibly clear exposition; exciting challenges. How can we ever be certain of our data after bouncing it across the world and storing it on error-prone media (things I do every day)? How can we do it without >60 hard-disks sitting in our computer? The mathematics uses very clear notation --- functions are sketched when introduced, theorems are presented alongside pictures and explanations of what's really going on.
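The single-error correction the reviewer alludes to can be illustrated with a standard Hamming(7,4) code (a minimal sketch for context, not code from the book): four data bits are expanded to seven, and any single flipped bit can be located from the parity-check syndrome and corrected.

```python
import random

# Hamming(7,4): encode 4 data bits into 7; any single bit flip is correctable.
G_rows = [  # generator rows; codeword bit i = G_rows[i] · data (mod 2)
    [1, 1, 0, 1],  # parity p1 (position 1)
    [1, 0, 1, 1],  # parity p2 (position 2)
    [1, 0, 0, 0],  # data d1  (position 3)
    [0, 1, 1, 1],  # parity p4 (position 4)
    [0, 1, 0, 0],  # data d2  (position 5)
    [0, 0, 1, 0],  # data d3  (position 6)
    [0, 0, 0, 1],  # data d4  (position 7)
]
H_rows = [  # parity-check rows; column j is the binary expansion of j+1
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def encode(data):
    """Encode 4 data bits as a 7-bit Hamming codeword."""
    return [sum(g[i] * data[i] for i in range(4)) % 2 for g in G_rows]

def decode(word):
    """Correct up to one flipped bit, then return the 4 data bits."""
    syndrome = [sum(h[i] * word[i] for i in range(7)) % 2 for h in H_rows]
    pos = syndrome[0] + 2 * syndrome[1] + 4 * syndrome[2]  # 1-based error position
    if pos:
        word = word[:]
        word[pos - 1] ^= 1
    return [word[2], word[4], word[5], word[6]]

data = [1, 0, 1, 1]
sent = encode(data)
noisy = sent[:]
noisy[random.randrange(7)] ^= 1  # the channel flips one bit at random
assert decode(noisy) == data     # the flip is located and undone
```

The same syndrome-decoding idea, scaled up to the sparse-graph codes (LDPC, turbo) the book treats, is what makes "certain data over an unreliable channel" possible in practice.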

I should note that a small number (roughly 4 or 5 out of 50) of the chapters on advanced topics are much more terse than the majority of the book. They might not be of interest to all readers, but if they are, they are probably more friendly than finding a journal paper on the same topic.

Most importantly for me, the book is a valuable reference for Bayesian methods, on which MacKay is an authority. Sections IV and V brought me up to speed with several advanced topics I need for my research.

39 people found this helpful.

By Rich Turner on February 28, 2005

Uniting information theory and inference in an interactive and entertaining way, this book has been a constant source of inspiration, intuition and insight for me. It is packed full of stuff - its contents appear to grow the more I look - but the layering of the material means the abundance of topics does not confuse.

This is _not_ just a book for the experts. However, you will need to think and interact when reading it. That is, after all, how you learn, and the book helps and guides you in this with many puzzles and problems.

28 people found this helpful.

By Edward Donahue on November 20, 2008

I am reviewing David MacKay's 'Information Theory, Inference, and Learning Algorithms', but I haven't yet read it completely. It will be years before I finish it, since it contains the material for several advanced undergraduate or graduate courses. However, it is already on my list of favorite texts and references. It is a book I will keep going back to time after time, but don't take my word for it. According to the back cover, Bob McEliece, the author of a 1977 classic on information theory, recommends you buy two copies, one for the office and one for home. There are topics in this book I am aching to find the time to read, work through and learn.

It can be used as a textbook, a reference book, or to fill in gaps in your knowledge of information theory and related material. MacKay outlines several courses for which it can be used, including his Cambridge course on Information Theory, Pattern Recognition and Neural Networks, a short course on information theory, and a course on Bayesian inference and machine learning. As a reference it covers topics not easily accessible in books, including a variety of modern codes (hash codes, low-density parity-check codes, digital fountain codes, and many others) and Bayesian inference techniques (maximum likelihood, Laplace's method, variational methods and Monte Carlo methods). It has interesting applications, such as information theory applied to genes and evolution, and to machine learning.
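A toy illustration of the Bayesian techniques that list mentions (an editorial sketch, not an example from the book): for a coin with unknown bias and a uniform prior, h heads in n flips give a Beta(h+1, n−h+1) posterior, whose exact mean (h+1)/(n+2) a simple Monte Carlo estimate, here importance sampling from the prior, should recover.

```python
import random

# Toy Bayesian inference: coin with unknown bias p, uniform prior on [0, 1],
# h heads observed in n flips.  The posterior is Beta(h+1, n-h+1), so the
# exact posterior mean is (h+1)/(n+2) (Laplace's rule of succession).
random.seed(0)
h, n = 7, 10

def likelihood(p):
    return p**h * (1 - p)**(n - h)

# Self-normalized importance sampling: draw from the prior, weight by likelihood.
samples = [random.random() for _ in range(200_000)]
weights = [likelihood(p) for p in samples]
mc_mean = sum(p * w for p, w in zip(samples, weights)) / sum(weights)

exact = (h + 1) / (n + 2)  # = 8/12 ≈ 0.667
assert abs(mc_mean - exact) < 0.01  # Monte Carlo agrees with the exact answer
```

Swapping the analytically tractable likelihood for an intractable one is where the book's variational and Markov chain Monte Carlo chapters take over.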

It is well written, with good problems; some help to understand the theory, and others help to apply the theory. Many are worked as examples, and some are especially recommended. He works to keep your attention and interest, and knows how to do it. For example, chapter titles include 'Why Have Sex' and 'Crosswords and Codebreaking'. His web site ( [...] ) is a wondrous collection of resource material, including code supporting a variety of topics in the book. The book is available online to browse, either through Google books or via a link from his web site, but you need to have it in hand, and spend time with it, to truly appreciate it.

17 people found this helpful.

By Bernie Madoff on March 16, 2009

Maybe it's just that the topic is so fascinating that a superb book such as this is unavoidable (I doubt it); regardless, MacKay has crafted a paragon of science textbook writing. The formula: lead with an irresistible puzzle and let the reader have a go at it; unfold the solution intuitively; then finish by justifying it theoretically. The reader leaves understanding the application, the method of solution, and the theory: why it exists and what it allows one to do.

why aren't all textbooks like this??

if you're a self-learner, DO BUY THIS BOOK! if only so you can see the possibilities of what a good textbook can be!

11 people found this helpful.

By A customer on January 11, 2004

This review concerns only the coding theory part.

If you want to know what's presently going on in the field of coding theory, with solid technical foundations, this is the book. Its importance is that it explains why people have been taking coding theory in new directions, and it provides good information about LDPC codes, turbo codes and decoding algorithms. People have solved some problems that arise in coding without going into the depths of the mathematics. Until the early 1990s, research in coding was intensely mathematical; people thought the packing problem was the answer to the coding problem. MacKay argues that the conventional thinking was wrong when one tries to attain the Shannon limit. He gives an argument based on the GV bound (warning: this argument may not be entirely true).

Now the bad part of the book. MacKay bases his entire argument on the premise that algebraic codes cannot exceed the GV bound. This is wrong. If you look at the MIT notes of Madhu Sudan (the prestigious Nevanlinna Prize winner), he says random codes are not always the best. Specifically, he cites an argument which states that AG codes exceed the GV bound at a faster pace. So the packing problem is still relevant to the coding problem, as it could help attain the Shannon limit at a faster pace than random codes. (Warning: Madhu does not state anything about block sizes, but my feeling is that, since AG codes exceed the GV bound faster than random codes, one could achieve the Shannon limit with comparatively smaller blocks.) So mathematicians can still hope to contribute to practical coding theory while enriching mathematics.

In spite of this, the book is a must-have for engineers and computer scientists.
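For readers without the background: the Shannon limit this review keeps invoking is, for a binary symmetric channel that flips each bit with probability f, the capacity C = 1 − H₂(f), the largest code rate at which arbitrarily reliable communication is possible. A quick editorial sketch (illustrative, not from the book):

```python
from math import log2

def H2(f):
    """Binary entropy, in bits, of a coin with heads probability f."""
    if f in (0.0, 1.0):
        return 0.0
    return -f * log2(f) - (1 - f) * log2(1 - f)

def bsc_capacity(f):
    """Shannon capacity of a binary symmetric channel with flip probability f."""
    return 1 - H2(f)

# At a 10% flip rate, no code of rate above ~0.531 bits per channel use can
# communicate reliably; rates below it are achievable, e.g. by the LDPC and
# turbo codes the review discusses.
assert abs(bsc_capacity(0.1) - 0.531) < 0.001
```

The GV bound debated above is a statement about achievable code rates at a given minimum distance; both it and the capacity are expressed through this same binary entropy function.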

40 people found this helpful.

By S. Matthews on September 26, 2008

This is an unqualified classic, to shelve with the likes of 'Structure and Interpretation of Computer Programs', 'Concrete Mathematics' and 'Mathematical Methods of Classical Mechanics'. If you are involved with, or interested in, high-end data analytics, then you _need_ this.

However 'high-end data analytics' does not even begin to do the book justice, so let me try again.

This is a magnificent compendium of fascinating stuff presented in a coherent information-theoretic framework. It covers everything from how digital television data compression and CD error correction work to a detailed commentary on neural networks, and discussion of principled AI methods such as clustering, Gaussian processes and probabilistic graphical models, together with Monte Carlo techniques and a bunch of statistical physics. It even throws in a complete course in Bayesian statistics. It reads like a really good 'popular' 'science' book (I often wonder where the scare quotes should be) that doesn't bother to try to be popular.

In fact I bought this originally as bedside reading, for pleasure. It was only later that I actually used it for anything.

11 people found this helpful.

By mikemis on September 6, 2013

I used this for a course on Information Theory, and it was much better than Cover & Thomas because it provided more background and motivation for the material.

7 people found this helpful.

By Mark Twain on June 18, 2015

Prof. MacKay's book brings together basically all the background you need to start understanding the latest techniques and research in things like inference and machine learning.

The chapters are accessible, the language is clear, and the amount of math is just right for a student who is starting to learn. You can begin to understand the theory and the mathematical rationale for the techniques without getting bogged down in pages of Greek-letter theorems, as happens in LNCS volumes or math classes. I think the book is very well suited for people with roughly the equivalent of a bachelor's degree in EE, CS, physics or the like.

The book will get you up to date on things like Bayesian networks, error correcting codes, variational methods, stochastic optimization etc. I just wish the good doctor would add a chapter on unsupervised learning of stacked networks (basically, the recent advancements in deep learning).

One person found this helpful.

