Data Compression
Publisher: Wikibooks 2011
Description:
Data compression is useful in some situations because compressed data saves space, and time in reading and transmission, compared to the unencoded information it represents. In this book, we describe the decompressor first, for several reasons...
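As a quick illustration of the space savings the description refers to, here is a minimal sketch using Python's standard zlib module (the choice of zlib and the sample text are assumptions for illustration, not taken from the book):

import zlib

# Highly repetitive sample text compresses well; real savings depend on the data.
original = b"to be or not to be " * 100
compressed = zlib.compress(original)

print(len(original))    # 1900 bytes before compression
print(len(compressed))  # typically a few dozen bytes after compression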
Download or read it online for free here: Read online (HTML)
Similar books
Error-Correction Coding and Decoding
by Martin Tomlinson, et al. - Springer
This book discusses both the theory and practical applications of self-correcting data, commonly known as error-correcting codes. The applications included demonstrate the importance of these codes in a wide range of everyday technologies.
(7005 views)
Entropy and Information Theory
by Robert M. Gray - Springer
The book covers the theory of probabilistic information measures and application to coding theorems for information sources and noisy channels. This is an up-to-date treatment of traditional information theory emphasizing ergodic theory.
(17386 views)
Around Kolmogorov Complexity: Basic Notions and Results
by Alexander Shen - arXiv.org
Algorithmic information theory studies description complexity and randomness. This text covers the basic notions of algorithmic information theory: Kolmogorov complexity, Solomonoff universal a priori probability, effective Hausdorff dimension, etc.
(6688 views)
Information Theory and Statistical Physics
by Neri Merhav - arXiv
Lecture notes for a graduate course focusing on the relations between Information Theory and Statistical Physics. The course is aimed at EE graduate students in the area of Communications and Information Theory, or graduate students in Physics.
(12968 views)