Information Theory and Coding
by John Daugman
Publisher: University of Cambridge 2009
Number of pages: 75
Description:
This course introduces the principles and applications of information theory. It studies how information is measured in terms of probability and entropy; the relationships among conditional and joint entropies; how these are used to calculate the capacity of a communication channel, with and without noise; coding schemes, including error-correcting codes; and how discrete channels and measures of information generalize to their continuous forms.
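The course's starting point, measuring information in terms of probability and entropy, can be sketched in a few lines of Python. This is a minimal illustration of Shannon entropy in bits, not material from the book itself:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)), in bits.

    Terms with p == 0 are skipped, following the convention
    0 * log 0 = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit per toss; a biased coin carries less.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))  # about 0.469
```

Entropy is maximized by the uniform distribution, which is why the fair coin yields a full bit while the biased one does not.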
Download or read it online for free here:
Download link
(1.4MB, PDF)
Similar books
Theory of Quantum Information
by John Watrous - University of Calgary
The focus is on the mathematical theory of quantum information. We will begin with basic principles and methods for reasoning about quantum information, and then move on to a discussion of various results concerning quantum information.
(12029 views)
Around Kolmogorov Complexity: Basic Notions and Results
by Alexander Shen - arXiv.org
Algorithmic information theory studies description complexity and randomness. This text covers the basic notions of algorithmic information theory: Kolmogorov complexity, Solomonoff universal a priori probability, effective Hausdorff dimension, etc.
(6885 views)
A primer on information theory, with applications to neuroscience
by Felix Effenberger - arXiv
This chapter gives a short introduction to the fundamentals of information theory, especially suited to readers with a less firm background in mathematics and probability theory. The focus is on neuroscientific topics.
(9304 views)
Information Theory, Excess Entropy and Statistical Complexity
by David Feldman - College of the Atlantic
This e-book is a brief tutorial on information theory, excess entropy and statistical complexity. From the table of contents: Background in Information Theory; Entropy Density and Excess Entropy; Computational Mechanics.
(14295 views)