Entropy and Information Theory
by Robert M. Gray
Publisher: Springer 2008
ISBN/ASIN: 1441979697
Number of pages: 313
Description:
This book is devoted to the theory of probabilistic information measures and their application to coding theorems for information sources and noisy channels. The eventual goal is a general development of Shannon's mathematical theory of communication, but much of the space is devoted to the tools and methods required to prove the Shannon coding theorems. This is the only up-to-date treatment of traditional information theory emphasizing ergodic theory.
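As an illustrative aside (not taken from the book), the most basic of the probabilistic information measures the description mentions is Shannon entropy, H(p) = -Σ pᵢ log₂ pᵢ. A minimal sketch in Python:

```python
import math

def shannon_entropy(probs):
    """Entropy in bits of a discrete distribution; zero-probability terms are skipped."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit of uncertainty.
print(shannon_entropy([0.5, 0.5]))  # -> 1.0
```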
Download or read it online for free (1.5MB, PDF).
Similar books

by Matt Mahoney - mattmahoney.net
This book is for the reader who wants to understand how data compression works, or who wants to write data compression software. Prior programming ability and some math skills are needed. The book is intended to be self-contained.

by Claude Shannon
Shannon presents results previously found nowhere else, and many professors still regard this work as the best exposition of the mathematical limits on communication. It laid the modern foundations for what is now called information theory.

by David Feldman - College of the Atlantic
This e-book is a brief tutorial on information theory, excess entropy and statistical complexity. From the table of contents: Background in Information Theory; Entropy Density and Excess Entropy; Computational Mechanics.

by Peter D. Gruenwald, Paul M.B. Vitanyi - CWI
We introduce algorithmic information theory, also known as the theory of Kolmogorov complexity. We explain this quantitative approach to defining information and discuss the extent to which Kolmogorov's and Shannon's theories have a common purpose.