
A Short Course in Information Theory

by

Publisher: University of Cambridge

Description:
Is it possible to communicate reliably from one point to another if we only have a noisy communication channel? How can the information content of a random variable be measured? This course discusses the remarkable theorems of Claude Shannon, starting from the source coding theorem, which motivates entropy as the measure of information, and culminating in the noisy channel coding theorem. Along the way we study simple examples of codes for data compression and error correction.
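The entropy that the source coding theorem motivates is straightforward to compute for a discrete random variable. As a small illustrative sketch (not taken from the course itself), the following assumed helper `entropy` computes Shannon entropy in bits from a list of probabilities:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum p*log2(p), in bits.

    Terms with p == 0 contribute nothing (lim p->0 of p*log2(p) is 0).
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: 1 bit per toss.
print(entropy([0.5, 0.5]))

# A heavily biased coin carries far less information per toss,
# so (by the source coding theorem) its outcomes can be
# compressed to fewer than 1 bit each on average.
print(entropy([0.9, 0.1]))
```

The second value comes out to roughly 0.47 bits, which is the lower bound on the average number of bits per symbol any lossless compressor can achieve for that source.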


Download or read it online for free here:
Download link (multiple PDF/PS files)

Similar books

Information Theory, Excess Entropy and Statistical Complexity
by - College of the Atlantic
This e-book is a brief tutorial on information theory, excess entropy and statistical complexity. From the table of contents: Background in Information Theory; Entropy Density and Excess Entropy; Computational Mechanics.
Around Kolmogorov Complexity: Basic Notions and Results
by - arXiv.org
Algorithmic information theory studies description complexity and randomness. This text covers the basic notions of algorithmic information theory: Kolmogorov complexity, Solomonoff universal a priori probability, effective Hausdorff dimension, etc.
Network Coding Theory
by - Now Publishers Inc
A tutorial on the basics of the theory of network coding. It presents network coding for transmission from a single source node, and then treats the more general setting in which there are multiple source nodes.
Information Theory and Coding
by - University of Cambridge
The aim of this course is to introduce the principles and applications of information theory. It studies how information is measured in terms of probability and entropy, and the relationships among conditional and joint entropies, among other topics.