This is an extraordinary and important book, generous with insight and rich with detail in statistics, information theory, and probabilistic modeling across a wide swathe of standard, creatively original, and delightfully quirky topics. David MacKay is an uncompromisingly lucid thinker, from whom students, faculty and practitioners all can learn.
An utterly original book that shows the connections between such disparate fields as information theory and coding, inference, and statistical physics.
An instant classic, covering everything from Shannon's fundamental theorems to the postmodern theory of LDPC codes. You'll want two copies of this astonishing book, one for the office and one for the fireside at home.
I am involved in teaching undergraduate information theory at Royal Holloway. We aim to make this book a reference text for our related courses.
I have found the book very enjoyable to read, and I think it is far easier for students to relate to, as the maths is explicit and easy to follow, with excellent examples.
Congratulations on an inspirational book! This is by far the best book I have read in years!
David Lindsay, Computer Learning Research Centre (www.clrc.rhul.ac.uk)
I am compelled to state categorically that this is one of the finest textbooks I have read on these subjects. I have even pilfered some of the material for use in my classes.
Samir Chettri, UMBC
One of the best technical books ever written, period. It's a joy to read, and students I have shown it to are attracted to it like bees to honey.
Alpan Raval, Keck Graduate Institute & Claremont Graduate University
A quite remarkable work to have come from a single author ... the three topic areas of its title actually fall short of indicating its full breadth of coverage.
...
This magnificent piece of work is valuable in introducing a new integrated viewpoint, and it is clearly an admirable basis for taught courses, as well as for self-study and reference.