Information Theory and Coding
by John Daugman
Publisher: University of Cambridge 2009
Number of pages: 75
Description:
The aims of this course are to introduce the principles and applications of information theory. The course studies how information is measured in terms of probability and entropy, and the relationships among conditional and joint entropies; how these are used to calculate the capacity of a communication channel, with and without noise; coding schemes, including error-correcting codes; how discrete channels and measures of information generalize to their continuous forms; and related topics.
Download or read it online for free here:
Download link
(1.4MB, PDF)
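To give a flavour of the course's starting point, here is a minimal sketch (illustrative only, not taken from the book) of Shannon's entropy measure H(X) = -sum_i p_i log2 p_i, computed in Python for two coin-toss distributions:

import math

def entropy(probs):
    # Shannon entropy in bits: H = -sum p * log2(p), skipping zero terms
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit per toss
print(entropy([0.9, 0.1]))   # biased coin: about 0.47 bits per toss

A fair coin maximizes the entropy of a binary source; any bias reduces the average information per toss, which is the quantity the channel-capacity results in the notes build on.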
Similar books
Information Theory and Statistical Physics
by Neri Merhav - arXiv
Lecture notes for a graduate course focusing on the relations between Information Theory and Statistical Physics. The course is aimed at EE graduate students in the area of Communications and Information Theory, or graduate students in Physics.
(13005 views)
Theory of Quantum Information
by John Watrous - University of Calgary
The focus is on the mathematical theory of quantum information. We will begin with basic principles and methods for reasoning about quantum information, and then move on to a discussion of various results concerning quantum information.
(11887 views)
Algorithmic Information Theory
by Peter D. Gruenwald, Paul M.B. Vitanyi - CWI
We introduce algorithmic information theory, also known as the theory of Kolmogorov complexity. We explain this quantitative approach to defining information and discuss the extent to which Kolmogorov's and Shannon's theories have a common purpose.
(10780 views)
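Kolmogorov complexity itself is uncomputable, but a common classroom illustration (an assumption of this note, not a method from the book) uses compressed length as a computable upper-bound proxy: a string with simple structure compresses far better than one without. A minimal Python sketch:

import os
import zlib

regular = b"ab" * 500            # 1000 bytes with an obvious pattern
random_bytes = os.urandom(1000)  # 1000 bytes with (almost surely) no pattern

print(len(zlib.compress(regular)))       # small: the regularity is exploited
print(len(zlib.compress(random_bytes)))  # close to 1000: nothing to exploit

This mirrors the link the text draws between Kolmogorov's and Shannon's approaches: both equate information content with incompressibility.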
Information Theory, Excess Entropy and Statistical Complexity
by David Feldman - College of the Atlantic
This e-book is a brief tutorial on information theory, excess entropy and statistical complexity. From the table of contents: Background in Information Theory; Entropy Density and Excess Entropy; Computational Mechanics.
(14118 views)