
Information Theory, Excess Entropy and Statistical Complexity
by David Feldman
Publisher: College of the Atlantic 2002
Number of pages: 49
Description:
This e-book is a brief tutorial on information theory, excess entropy and statistical complexity. From the table of contents: Background in Information Theory; Entropy Density and Excess Entropy; Computational Mechanics.
Download or read it online for free here:
Download link
(multiple formats)
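The central quantities the tutorial treats can be sketched numerically. The short Python illustration below (my own sketch, not taken from the book) estimates the entropy density h and the excess entropy E of a period-2 binary sequence from its block entropies H(L), using the standard finite-L estimates h ≈ H(L) − H(L−1) and E ≈ H(L) − L·h:

```python
import math
from collections import Counter

def block_entropy(seq, L):
    """Shannon entropy (bits) of the empirical distribution of length-L blocks."""
    blocks = [seq[i:i + L] for i in range(len(seq) - L + 1)]
    n = len(blocks)
    return -sum((c / n) * math.log2(c / n) for c in Counter(blocks).values())

# A period-2 sequence: perfectly predictable once one symbol is known,
# so the entropy density should be ~0 and the excess entropy ~1 bit.
seq = "01" * 500
H = [block_entropy(seq, L) for L in range(1, 6)]

h_est = H[4] - H[3]        # entropy-density estimate h(L) = H(L) - H(L-1)
E_est = H[4] - 5 * h_est   # excess-entropy estimate   E ~ H(L) - L*h
print(h_est, E_est)        # h_est near 0 bits/symbol, E_est near 1 bit
```

For a truly random binary sequence the same estimates would instead give h ≈ 1 and E ≈ 0, which is exactly the distinction between randomness and structure that excess entropy captures.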
Similar books
Data Compression - Wikibooks
Data compression is useful in many situations because compressed data saves time (in reading and transmission) and space compared to the unencoded information it represents. In this book, we describe the decompressor first.
(11012 views)
Essential Coding Theory by Venkatesan Guruswami, Atri Rudra, Madhu Sudan - University at Buffalo
Error-correcting codes are clever ways of representing data so that one can recover the original information even if parts of it are corrupted. The basic idea is to introduce redundancy so that the original information can be recovered ...
(11451 views)
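The redundancy idea in the blurb can be made concrete with the simplest error-correcting code: a 3-fold repetition code, which survives any single corrupted copy per bit. A minimal sketch (illustrative only, not taken from the book):

```python
from collections import Counter

def encode(bits, r=3):
    """Introduce redundancy by repeating each bit r times."""
    return [b for b in bits for _ in range(r)]

def decode(coded, r=3):
    """Recover each original bit by majority vote over its r copies."""
    return [Counter(coded[i:i + r]).most_common(1)[0][0]
            for i in range(0, len(coded), r)]

msg = [1, 0, 1, 1]
sent = encode(msg)          # [1,1,1, 0,0,0, 1,1,1, 1,1,1]
sent[1] ^= 1                # corrupt one copy of the first bit
assert decode(sent) == msg  # majority vote still recovers the message
```

Real codes (Hamming, Reed-Solomon, etc.) achieve the same protection with far less redundancy, which is the subject the book develops.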
Information Theory and Coding by John Daugman - University of Cambridge
The aims of this course are to introduce the principles and applications of information theory. The course will study how information is measured in terms of probability and entropy, and the relationships among conditional and joint entropies; etc.
(26375 views)
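One of the relationships the blurb mentions, the chain rule linking joint and conditional entropy, H(Y|X) = H(X,Y) − H(X), can be checked on a toy joint sample. The sketch below (my own illustration, not course material) computes H(Y|X) both via the chain rule and directly from the conditional distributions, and confirms they agree:

```python
import math
from collections import Counter

def H(events):
    """Shannon entropy (bits) of the empirical distribution of events."""
    n = len(events)
    return -sum((c / n) * math.log2(c / n) for c in Counter(events).values())

# A toy joint sample of (X, Y) pairs.
pairs = [(0, 0), (0, 0), (0, 1), (0, 0), (1, 1), (1, 1), (1, 0), (1, 1)]

HX  = H([x for x, _ in pairs])  # marginal entropy H(X)
HXY = H(pairs)                  # joint entropy H(X,Y)

# Conditional entropy via the chain rule: H(Y|X) = H(X,Y) - H(X).
HY_given_X = HXY - HX

# The same quantity computed directly: average of H(Y | X=x) weighted by P(x).
direct = sum(
    (sum(1 for a, _ in pairs if a == x) / len(pairs))
    * H([b for a, b in pairs if a == x])
    for x in {a for a, _ in pairs}
)
assert abs(HY_given_X - direct) < 1e-12
```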
Algorithmic Information Theory by Peter D. Gruenwald, Paul M.B. Vitanyi - CWI
We introduce algorithmic information theory, also known as the theory of Kolmogorov complexity. We explain this quantitative approach to defining information and discuss the extent to which Kolmogorov's and Shannon's theories have a common purpose.
(12090 views)
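Kolmogorov complexity itself is uncomputable, but a common classroom illustration (sketched below; it is not from the book) uses the length of a compressed string as a crude, computable upper bound: regular data admits a short description, while random data does not.

```python
import random
import zlib

def compressed_size(data: bytes) -> int:
    """Length of the zlib-compressed data: a rough, computable
    upper bound on the Kolmogorov complexity of the string."""
    return len(zlib.compress(data, 9))

regular = b"01" * 500  # highly regular: a short program could print it
random.seed(0)
noisy = bytes(random.getrandbits(8) for _ in range(1000))  # incompressible

print(compressed_size(regular), compressed_size(noisy))
```

The regular string compresses to a small fraction of its length, while the pseudo-random one stays close to 1000 bytes, mirroring the Kolmogorov distinction between structured and random strings.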