**Information Theory, Inference, and Learning Algorithms**

by David J. C. MacKay

**Publisher**: Cambridge University Press 2003
**ISBN/ASIN**: 0521642981
**ISBN-13**: 9780521642989
**Number of pages**: 640

**Description**:

Information theory and inference, often taught separately, are here united in one entertaining textbook. These topics lie at the heart of many exciting areas of contemporary science and engineering - communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics, and cryptography. This textbook introduces theory in tandem with applications. Information theory is taught alongside practical communication systems, such as arithmetic coding for data compression and sparse-graph codes for error-correction. A toolbox of inference techniques, including message-passing algorithms, Monte Carlo methods, and variational approximations, is developed alongside applications of these tools to clustering, convolutional codes, independent component analysis, and neural networks.

Download or read it online for free here:

**Download link** (multiple formats)

## Similar books

**Lecture Notes on Network Information Theory**

by **Abbas El Gamal, Young-Han Kim** - **arXiv**

Network information theory deals with the fundamental limits on information flow in networks and the optimal coding techniques and protocols for approaching those limits. These notes provide a broad coverage of key results, techniques, and open problems in network information theory.

(8985 views)

**Information Theory and Statistical Physics**

by **Neri Merhav** - **arXiv**

Lecture notes for a graduate course focusing on the relations between Information Theory and Statistical Physics. The course is aimed at EE graduate students in the area of Communications and Information Theory, or graduate students in Physics.

(7866 views)

**Essential Coding Theory**

by **Venkatesan Guruswami, Atri Rudra, Madhu Sudan** - **University at Buffalo**

Error-correcting codes are clever ways of representing data so that one can recover the original information even if parts of it are corrupted. The basic idea is to introduce redundancy so that the original information can be recovered ...

(2949 views)
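The "basic idea" of redundancy described in the Essential Coding Theory blurb above can be made concrete with a minimal sketch. The Python snippet below (an illustrative assumption, not taken from any of the listed books) implements a rate-1/3 repetition code: each bit is transmitted three times and the decoder takes a majority vote, so a single corrupted copy within each group of three can be undone.

```python
# Illustrative sketch only: a rate-1/3 repetition code.
# Each bit is repeated three times; decoding is by majority vote,
# so one corrupted copy within each group of three is tolerated.

def encode(bits):
    """Repeat every bit three times."""
    return [b for b in bits for _ in range(3)]

def decode(received):
    """Majority-vote each consecutive group of three received bits."""
    return [1 if sum(received[i:i + 3]) >= 2 else 0
            for i in range(0, len(received), 3)]

if __name__ == "__main__":
    message = [1, 0, 1, 1]
    codeword = encode(message)          # [1,1,1, 0,0,0, 1,1,1, 1,1,1]
    codeword[4] = 1                     # corrupt one copy of the second bit
    assert decode(codeword) == message  # original message recovered
```

Practical codes, such as the sparse-graph codes covered in MacKay's book, achieve far better trade-offs between redundancy and error tolerance, but the repetition code shows the principle in its simplest form.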

**Around Kolmogorov Complexity: Basic Notions and Results**

by **Alexander Shen** - **arXiv.org**

Algorithmic information theory studies description complexity and randomness. This text covers the basic notions of algorithmic information theory: Kolmogorov complexity, Solomonoff universal a priori probability, effective Hausdorff dimension, etc.

(1287 views)