Entropy and Information Theory
by Robert M. Gray
Publisher: Springer 2008
Number of pages: 313
This book is devoted to the theory of probabilistic information measures and their application to coding theorems for information sources and noisy channels. The eventual goal is a general development of Shannon's mathematical theory of communication, but much of the space is devoted to the tools and methods required to prove the Shannon coding theorems. This is the only up-to-date treatment of traditional information theory emphasizing ergodic theory.
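As a point of reference for the probabilistic information measures the book develops, the following minimal Python sketch (an illustration, not code from the book) computes the Shannon entropy of a discrete distribution in bits:

    import math

    def shannon_entropy(probs):
        # H(X) = -sum p * log2(p) over outcomes with p > 0, measured in bits.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit per toss
    print(shannon_entropy([0.9, 0.1]))  # biased coin: about 0.469 bits

A fair coin attains the maximum entropy for two outcomes; any bias lowers the average number of bits needed to describe a toss, which is exactly what source coding exploits.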
Shannon Information and Kolmogorov Complexity
by Peter D. Gruenwald, Paul M.B. Vitanyi - CWI
We introduce algorithmic information theory, also known as the theory of Kolmogorov complexity. We explain this quantitative approach to defining information and discuss the extent to which Kolmogorov's and Shannon's theories have a common purpose.
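Kolmogorov complexity itself is uncomputable, so a common practical stand-in (a sketch under that assumption, not a method from this survey) is to use the output length of a real compressor as a computable upper bound:

    import os
    import zlib

    def compressed_length(data: bytes) -> int:
        # Length of the zlib-compressed data: a crude, computable
        # upper bound on the Kolmogorov complexity of the string.
        return len(zlib.compress(data, 9))

    regular = b"ab" * 500       # highly regular, so highly compressible
    random_ = os.urandom(1000)  # incompressible with high probability
    print(compressed_length(regular))  # far below 1000 bytes
    print(compressed_length(random_))  # close to (or above) 1000 bytes

The gap between the two lengths mirrors the distinction both theories care about: regular strings carry little information, while random strings carry nearly as much as their raw length.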
A Primer on Information Theory, with Applications to Neuroscience
by Felix Effenberger - arXiv
This chapter gives a short introduction to the fundamentals of information theory, especially suited to readers with a less firm background in mathematics and probability theory. The focus is on neuroscientific topics.
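One quantity that recurs throughout such neuroscientific applications is mutual information, which measures how much observing a neural response tells us about the stimulus. A minimal sketch for discrete joint distributions (an illustration, not code from the chapter):

    import math

    def mutual_information(joint):
        # I(X;Y) = sum over (x, y) of p(x,y) * log2( p(x,y) / (p(x) * p(y)) ).
        # `joint` is a 2-D list of joint probabilities p(x, y).
        px = [sum(row) for row in joint]
        py = [sum(col) for col in zip(*joint)]
        return sum(
            pxy * math.log2(pxy / (px[i] * py[j]))
            for i, row in enumerate(joint)
            for j, pxy in enumerate(row)
            if pxy > 0
        )

    print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))     # 1.0: response determines stimulus
    print(mutual_information([[0.25, 0.25], [0.25, 0.25]])) # 0.0: independent, nothing transmitted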
Information, Entropy and Their Geometric Structures
by Frederic Barbaresco, Ali Mohammad-Djafari - MDPI AG
The aim of this book is to provide an overview of current research exploring the geometric structures of information and entropy. This survey will motivate readers to explore the emerging domain of the Science of Information.
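One of the simplest structures in this domain is relative entropy (Kullback-Leibler divergence), the asymmetric "distance" between probability distributions on which much of information geometry is built. A minimal sketch, assuming finite discrete distributions with matching support (not drawn from the book):

    import math

    def kl_divergence(p, q):
        # D(P || Q) = sum p * log2(p / q): relative entropy in bits.
        # Nonnegative, zero only when P = Q; note it is not symmetric.
        return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

    print(kl_divergence([0.5, 0.5], [0.9, 0.1]))  # about 0.737 bits
    print(kl_divergence([0.5, 0.5], [0.5, 0.5]))  # 0.0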
Algorithmic Information Theory
by Gregory J. Chaitin - Cambridge University Press
The book presents the strongest possible version of Gödel's incompleteness theorem, using an information-theoretic approach based on the size of computer programs. The author tried to present the material in the most direct fashion possible.