**A Short Course in Information Theory**

by David J. C. MacKay

**Publisher**: University of Cambridge, 1995

**Description**:

Is it possible to communicate reliably from one point to another if we have only a noisy communication channel? How can the information content of a random variable be measured? This course will discuss the remarkable theorems of Claude Shannon, starting from the source coding theorem, which motivates entropy as the measure of information, and culminating in the noisy channel coding theorem. Along the way we will study simple examples of codes for data compression and error correction.
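As a taste of the starting point, here is a minimal sketch, not taken from the course itself, of Shannon entropy in bits; the source coding theorem says a source with this symbol distribution cannot be losslessly compressed below H bits per symbol on average (the example distribution is made up):

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum_x p(x) * log2 p(x), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A biased four-symbol source: the less uniform the distribution,
# the fewer bits per symbol an optimal compressor needs.
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits per symbol
print(entropy([0.25] * 4))                 # 2.0 bits (uniform maximizes entropy)
```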

Download or read it online for free here:

**Download link**

(multiple PDF/PS files)

## Similar books

**Information Theory, Excess Entropy and Statistical Complexity**

by **David Feldman** - **College of the Atlantic**

This e-book is a brief tutorial on information theory, excess entropy and statistical complexity. From the table of contents: Background in Information Theory; Entropy Density and Excess Entropy; Computational Mechanics.

(7772 views)
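To make this entry's two central quantities concrete, here is a minimal sketch, my own illustration rather than anything from the tutorial, that estimates entropy density (entropy rate) and excess entropy from block entropies H(L); the finite-L approximations h ≈ H(L) - H(L-1) and E ≈ H(L) - L·h are standard, and the period-4 sequence is made up:

```python
import math
from collections import Counter

def block_entropy(seq, L):
    """H(L): Shannon entropy in bits of the length-L blocks observed in seq."""
    blocks = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
    n = sum(blocks.values())
    return -sum(c / n * math.log2(c / n) for c in blocks.values())

# Period-4 sequence: entropy rate ~ 0, excess entropy ~ log2(4) = 2 bits,
# i.e. no new randomness per symbol, but 2 bits of stored phase information.
seq = [0, 1, 1, 0] * 2500
H4, H3 = block_entropy(seq, 4), block_entropy(seq, 3)
h_est = H4 - H3          # entropy-rate estimate
E_est = H4 - 4 * h_est   # excess-entropy estimate
print(round(h_est, 3), round(E_est, 3))  # ~0.0 and ~2.0
```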

**Around Kolmogorov Complexity: Basic Notions and Results**

by **Alexander Shen** - **arXiv.org**

Algorithmic information theory studies description complexity and randomness. This text covers its basic notions: Kolmogorov complexity, Solomonoff's universal a priori probability, effective Hausdorff dimension, etc.

(1270 views)
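Kolmogorov complexity itself is uncomputable, but any real compressor yields a computable upper bound (up to an additive constant depending on the decompressor), which is a common way to build intuition for the notions this text covers. A minimal sketch of that idea using zlib, my own illustration rather than anything from the text:

```python
import os
import zlib

def kc_upper_bound(s: bytes) -> int:
    """Compressed length in bits: an upper bound, up to an additive
    constant, on the Kolmogorov complexity of s."""
    return 8 * len(zlib.compress(s, 9))

patterned = b"01" * 500        # 1000 bytes with obvious structure
random_ish = os.urandom(1000)  # 1000 bytes; incompressible with high probability
print(kc_upper_bound(patterned))   # small: a short description suffices
print(kc_upper_bound(random_ish))  # close to the raw 8000 bits
```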

**Network Coding Theory**

by **Raymond Yeung, S.-Y. Li, N. Cai** - **Now Publishers Inc**

A tutorial on the basics of the theory of network coding. It presents network coding for transmission from a single source node, then treats the more general setting in which there are multiple source nodes.

(11637 views)
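The single-source multicast case mentioned in this entry has a classic two-bit illustration, the butterfly network: plain routing cannot deliver both source bits to both sinks over unit-capacity edges, but XOR coding at the bottleneck node can. A minimal sketch, my own rather than the book's:

```python
def butterfly(b1: int, b2: int):
    """Source multicasts bits b1 and b2; the bottleneck node forwards
    their XOR instead of choosing one bit to route."""
    x = b1 ^ b2            # coded packet on the shared middle edge
    sink1 = (b1, b1 ^ x)   # gets b1 directly, recovers b2 = b1 XOR x
    sink2 = (b2 ^ x, b2)   # gets b2 directly, recovers b1 = b2 XOR x
    return sink1, sink2

# Every sink decodes the full pair (b1, b2), which routing alone cannot achieve.
for b1 in (0, 1):
    for b2 in (0, 1):
        assert butterfly(b1, b2) == ((b1, b2), (b1, b2))
```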

**Information Theory and Coding**

by **John Daugman** - **University of Cambridge**

The aim of this course is to introduce the principles and applications of information theory. The course will study how information is measured in terms of probability and entropy, and the relationships among conditional and joint entropies; etc.

(13254 views)
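For the "relationships among conditional and joint entropies" this entry mentions, the central identity is the chain rule H(X,Y) = H(X) + H(Y|X); here is a minimal sketch verifying it on a made-up joint distribution (my own illustration, not course material):

```python
import math

def H(probs):
    """Shannon entropy in bits of a probability assignment."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A toy joint distribution p(x, y) over binary X and Y.
p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

H_XY = H(p.values())                              # joint entropy H(X,Y)
p_x = {x: p[(x, 0)] + p[(x, 1)] for x in (0, 1)}  # marginal p(x)
H_X = H(p_x.values())                             # marginal entropy H(X)
H_Y_given_X = sum(                                # H(Y|X) = sum_x p(x) H(Y|X=x)
    p_x[x] * H([p[(x, y)] / p_x[x] for y in (0, 1)]) for x in (0, 1)
)
assert abs(H_XY - (H_X + H_Y_given_X)) < 1e-12    # chain rule: H(X,Y) = H(X) + H(Y|X)
```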