Information Theory and Coding
by John Daugman
Publisher: University of Cambridge 2009
Number of pages: 75
The aim of this course is to introduce the principles and applications of information theory. The course studies how information is measured in terms of probability and entropy, and the relationships among conditional and joint entropies; how these quantities are used to calculate the capacity of a communication channel, with and without noise; coding schemes, including error-correcting codes; and how discrete channels and measures of information generalize to their continuous forms.
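As a brief illustration of the first topic the course names, measuring information by entropy, here is a minimal Python sketch (not taken from the course notes) of the Shannon entropy of a discrete distribution:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum_i p_i * log2(p_i), in bits.

    Terms with p_i == 0 contribute nothing (lim p log p = 0).
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: 1 bit per toss.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so it carries less information.
print(shannon_entropy([0.9, 0.1]))   # about 0.469 bits
```

The same quantity underlies the channel-capacity calculations the course goes on to cover.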
by Alexander Shen - arXiv.org
Algorithmic information theory studies description complexity and randomness. This text covers the basic notions of algorithmic information theory: Kolmogorov complexity, Solomonoff universal a priori probability, effective Hausdorff dimension, etc.
by Martin Tomlinson, et al. - Springer
This book discusses both the theory and practical applications of error-correcting codes, which enable transmitted or stored data to detect and correct its own errors. The applications included demonstrate the importance of these codes in a wide range of everyday technologies.
by Matt Mahoney - mattmahoney.net
This book is for the reader who wants to understand how data compression works, or who wants to write data compression software. Prior programming ability and some math skills will be needed. The book is intended to be self-contained.
by Karl Petersen - AMS
The aim is to review the many facets of information, coding, and cryptography, including their uses throughout history and their mathematical underpinnings. Prerequisites include high-school mathematics and a willingness to deal with unfamiliar ideas.