A Short Course in Information Theory
by David J. C. MacKay
Publisher: University of Cambridge 1995
Is it possible to communicate reliably from one point to another if we only have a noisy communication channel? How can the information content of a random variable be measured? This course will discuss the remarkable theorems of Claude Shannon, starting from the source coding theorem, which motivates the entropy as the measure of information, and culminating in the noisy channel coding theorem. Along the way we will study simple examples of codes for data compression and error correction.
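As a concrete illustration of the entropy measure the course builds on, here is a minimal sketch in Python (illustrative, not material from the course itself) computing the Shannon entropy H(X) = -sum_x p(x) log2 p(x) of a discrete distribution:

import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete distribution.

    probs: probabilities summing to 1. Zero-probability outcomes
    contribute nothing, since p * log2(p) -> 0 as p -> 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit per toss; a biased coin carries less,
# which is what makes compression of its output stream possible.
print(shannon_entropy([0.5, 0.5]))  # 1.0
print(shannon_entropy([0.9, 0.1]))  # ~0.469

The source coding theorem says this quantity is the average number of bits per symbol that any lossless compressor needs, which is why entropy serves as the measure of information.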
by Inder Jeet Taneja - Universidade Federal de Santa Catarina
Contents: Shannon's Entropy; Information and Divergence Measures; Entropy-Type Measures; Generalized Information and Divergence Measures; M-Dimensional Divergence Measures and Their Generalizations; Unified (r,s)-Multivariate Entropies; etc.
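To make the divergence measures in this contents list concrete, here is a minimal sketch (Python, illustrative only) of the Kullback-Leibler divergence D(P||Q) = sum_x p(x) log2(p(x)/q(x)), the basic quantity from which many of the generalized measures are built:

import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P||Q) in bits.

    p, q: probability sequences over the same alphabet.
    Assumes q[i] > 0 wherever p[i] > 0; otherwise the
    divergence is infinite.
    """
    return sum(pi * math.log2(pi / qi)
               for pi, qi in zip(p, q) if pi > 0)

# D(P||Q) >= 0, with equality if and only if P equals Q.
print(kl_divergence([0.5, 0.5], [0.5, 0.5]))  # 0.0
print(kl_divergence([0.9, 0.1], [0.5, 0.5]))  # ~0.531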
by Gregory J. Chaitin - Cambridge University Press
The book presents the strongest possible version of Gödel's incompleteness theorem, using an information-theoretic approach based on the size of computer programs. The author aims to present the material as directly as possible.
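The program-size (Kolmogorov) complexity at the heart of this argument is uncomputable, but it can be bounded from above by any compressor. A minimal sketch using zlib as a stand-in for a shortest description (an assumption for illustration, not Chaitin's construction):

import os
import zlib

def complexity_upper_bound(data: bytes) -> int:
    """Length in bytes of a zlib-compressed description of `data`.

    True program-size complexity is uncomputable; a compressor
    only gives an upper bound on it.
    """
    return len(zlib.compress(data, 9))

regular = b"ab" * 500        # highly regular: compresses to a few bytes
random_ = os.urandom(1000)   # incompressible with high probability

print(complexity_upper_bound(regular))   # small (tens of bytes)
print(complexity_upper_bound(random_))   # close to 1000 bytes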
by Renato Renner - ETH Zurich
Information processing is necessarily a physical process, so it is not surprising that physics and the theory of information are inherently connected. Quantum information theory is a research area whose goal is to explore this connection.
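One concrete point of contact between the two fields is the von Neumann entropy S(rho) = -Tr(rho log2 rho), the quantum generalization of Shannon entropy. A minimal sketch in Python with NumPy (the density matrices below are illustrative assumptions):

import numpy as np

def von_neumann_entropy(rho):
    """Von Neumann entropy S(rho) = -Tr(rho log2 rho), in bits.

    rho: Hermitian, positive semidefinite matrix with trace 1.
    Computed from the eigenvalues, since S equals the Shannon
    entropy of rho's spectrum.
    """
    eigvals = np.linalg.eigvalsh(rho)
    eigvals = eigvals[eigvals > 1e-12]   # drop numerical zeros
    return float(-np.sum(eigvals * np.log2(eigvals)))

pure = np.array([[1.0, 0.0], [0.0, 0.0]])    # pure state: S = 0
mixed = np.array([[0.5, 0.0], [0.0, 0.5]])   # maximally mixed qubit: S = 1
print(von_neumann_entropy(pure))   # 0.0
print(von_neumann_entropy(mixed))  # 1.0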
by Abbas El Gamal, Young-Han Kim - arXiv
Network information theory deals with the fundamental limits on information flow in networks, and with the coding schemes and protocols that achieve or approach these limits. These notes provide a broad coverage of key results, techniques, and open problems in network information theory.
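For a flavor of the fundamental limits these notes characterize, here is a minimal sketch of the capacity region of the two-user Gaussian multiple-access channel, one of the simplest network settings (the power and noise values are illustrative assumptions, not values from the notes):

import math

def c(snr):
    """Gaussian capacity C(x) = 0.5 * log2(1 + x), bits per channel use."""
    return 0.5 * math.log2(1.0 + snr)

def mac_region(P1, P2, N):
    """Constraints defining the two-user Gaussian MAC capacity region:
    R1 <= C(P1/N), R2 <= C(P2/N), R1 + R2 <= C((P1 + P2)/N)."""
    return c(P1 / N), c(P2 / N), c((P1 + P2) / N)

# Illustrative transmit powers P1, P2 and noise power N.
r1_max, r2_max, sum_max = mac_region(P1=10.0, P2=5.0, N=1.0)
print(f"R1 <= {r1_max:.3f}, R2 <= {r2_max:.3f}, R1+R2 <= {sum_max:.3f}")
# The sum-rate constraint binds here: the two users cannot both
# transmit at their individual maxima at the same time.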