Thursday, September 20, 2007

Information theory, inference and learning algorithms

I'd like to recommend the book Information Theory, Inference, and Learning Algorithms by David MacKay, a Cambridge professor of physics. I wish I'd had a course on this material from MacKay when I was a student! Especially nice are the introductory example on Bayesian inference (Ch. 3) and the discussion of Occam's razor from a Bayesian perspective (Ch. 28). I'm sure I'll find other gems in this book, but I'm still working my way through it.
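For readers who haven't seen the Bayesian take on Occam's razor, the gist is that marginalizing over a model's parameters automatically penalizes models that spread their predictions too thinly over possible data. Here is a minimal sketch of that idea, a toy coin-flipping example of my own rather than anything taken from the book: a fair-coin model is compared against a coin with unknown bias (uniform prior) by their marginal likelihoods.

from math import comb

def evidence_fair(heads, tails):
    # P(data | fair coin): every sequence of n flips has probability 0.5**n
    return 0.5 ** (heads + tails)

def evidence_unknown_bias(heads, tails):
    # P(data | unknown bias) = integral_0^1 p**heads * (1-p)**tails dp,
    # which has the closed form 1 / ((n+1) * C(n, heads)) with n = heads + tails.
    n = heads + tails
    return 1.0 / ((n + 1) * comb(n, heads))

for heads, tails in [(5, 5), (9, 1)]:
    e_fair = evidence_fair(heads, tails)
    e_flex = evidence_unknown_bias(heads, tails)
    print(f"{heads} heads, {tails} tails: evidence ratio (fair / flexible) = {e_fair / e_flex:.2f}")

With 5 heads and 5 tails the simpler fair-coin model wins (ratio about 2.7), while with 9 heads and 1 tail the more flexible model wins (ratio about 0.1): unremarkable data favors the simpler hypothesis, which is the Occam's razor effect.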

I learned about the book through Nerdwisdom, which I also recommend highly. Nerdwisdom is the blog of Jonathan Yedidia, a brilliant polymath (theoretical physicist turned professional chess player turned computer scientist) with whom I consumed a lot of French wine at dinners of the Harvard Society of Fellows.

1 comment:

Martin Griffiths said...

I was fortunate enough to take MacKay's course. In fact, I commented about it recently on Cosmic Variance's best college courses post!
