A Short Course in Information Theory
by David J. C. MacKay
Publisher: University of Cambridge, 1995
Description:
Is it possible to communicate reliably from one point to another if we only have a noisy communication channel? How can the information content of a random variable be measured? This course discusses the remarkable theorems of Claude Shannon, starting from the source coding theorem, which motivates entropy as the measure of information, and culminating in the noisy channel coding theorem. Along the way we study simple examples of codes for data compression and error correction.
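As a quick illustration of entropy as a measure of information (a minimal Python sketch, not part of the course materials): the entropy of a discrete random variable X is H(X) = -sum over x of p(x) log2 p(x), measured in bits.

    import math

    def entropy(probs):
        # Shannon entropy H(X) = -sum p log2 p, in bits.
        # probs: probabilities of a discrete distribution (summing to 1);
        # zero-probability outcomes contribute nothing, so they are skipped.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(entropy([0.5, 0.5]))  # fair coin: 1.0 bit per toss
    print(entropy([0.9, 0.1]))  # biased coin: about 0.47 bits

A fair coin toss carries one full bit; the more predictable the outcome, the less information each observation conveys.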
Download or read it online for free here:
Download link
(multiple PDF, PS files)
Similar books

by Venkatesan Guruswami, Atri Rudra, Madhu Sudan - University at Buffalo
Error-correcting codes are clever ways of representing data so that one can recover the original information even if parts of it are corrupted. The basic idea is to introduce redundancy so that the original information can be recovered, as the sketch below illustrates ...
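To make the redundancy idea concrete (an illustrative sketch, not drawn from the book itself): a 3-fold repetition code corrects any single bit flip within a block by majority vote.

    def encode(bits, n=3):
        # Repetition code: send each bit n times (the redundancy).
        return [b for b in bits for _ in range(n)]

    def decode(received, n=3):
        # Majority vote over each block of n received bits.
        return [int(sum(received[i:i + n]) > n // 2)
                for i in range(0, len(received), n)]

    msg = [1, 0, 1]
    sent = encode(msg)           # [1, 1, 1, 0, 0, 0, 1, 1, 1]
    sent[1] ^= 1                 # the channel flips one bit
    print(decode(sent) == msg)   # True: the error is corrected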

by John Daugman - University of Cambridge
The aims of this course are to introduce the principles and applications of information theory. The course studies how information is measured in terms of probability and entropy, and the relationships among conditional and joint entropies, among other topics.

by Peter D. Grünwald, Paul M.B. Vitányi - CWI
We introduce algorithmic information theory, also known as the theory of Kolmogorov complexity. We explain this quantitative approach to defining information and discuss the extent to which Kolmogorov's and Shannon's theories have a common purpose.

by Felix Effenberger - arXiv
This chapter gives a short introduction to the fundamentals of information theory, aimed especially at readers with a less firm background in mathematics and probability theory. The focus is on neuroscientific topics.