Entropy and Information Theory
by Robert M. Gray
Publisher: Springer 2008
ISBN/ASIN: 1441979697
Number of pages: 313
Description:
This book is devoted to the theory of probabilistic information measures and their application to coding theorems for information sources and noisy channels. The eventual goal is a general development of Shannon's mathematical theory of communication, but much of the space is devoted to the tools and methods required to prove the Shannon coding theorems. This is the only up-to-date treatment of traditional information theory emphasizing ergodic theory.
Download or read it online for free here:
Download link (1.5MB, PDF)
Similar books
A primer on information theory, with applications to neuroscience by Felix Effenberger - arXiv
This chapter gives a short introduction to the fundamentals of information theory, aimed especially at readers with a less firm background in mathematics and probability theory. The focus is on neuroscientific topics.
Information Theory, Inference, and Learning Algorithms by David J. C. MacKay - Cambridge University Press
A textbook on information theory, Bayesian inference, and learning algorithms, useful for undergraduate and postgraduate students, and as a reference for researchers. Essential reading for students of electrical engineering and computer science.
Data Compression Explained by Matt Mahoney - mattmahoney.net
This book is for the reader who wants to understand how data compression works, or who wants to write data compression software. Prior programming ability and some math skills are needed. The book is intended to be self-contained.
Error-Correction Coding and Decoding by Martin Tomlinson, et al. - Springer
This book discusses both the theory and practical applications of self-correcting data, commonly known as error-correcting codes. The applications included demonstrate the importance of these codes in a wide range of everyday technologies.