Algorithmic Information Theory
by Peter D. Grünwald, Paul M.B. Vitányi
Publisher: CWI 2007
Number of pages: 37
We introduce algorithmic information theory, also known as the theory of Kolmogorov complexity. We explain the main concepts of this quantitative approach to defining 'information'. We discuss the extent to which Kolmogorov's and Shannon's theories of information share a common purpose, and where they are fundamentally different.
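Kolmogorov complexity itself is uncomputable, but the length of a compressed encoding gives a computable upper bound on it. A minimal Python sketch of that idea (the choice of zlib as the compressor is an illustration, not something taken from the book):

```python
import random
import zlib

def compressed_length(s: bytes) -> int:
    # Upper-bound proxy for the Kolmogorov complexity of s:
    # the length of a zlib-compressed description of s.
    return len(zlib.compress(s, 9))

regular = b"ab" * 500  # a highly regular 1000-byte string

random.seed(0)
noise = bytes(random.randrange(256) for _ in range(1000))  # incompressible-looking

# The regular string compresses far better than the random-looking one,
# mirroring the intuition that it carries less 'information'.
print(compressed_length(regular) < compressed_length(noise))  # True
```

The gap between the two compressed lengths is what the quantitative approach measures: a short program (here, "repeat 'ab' 500 times") suffices to describe the regular string, while the random one admits no description much shorter than itself.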
by Gregory J. Chaitin - Springer
This book presents the core of Chaitin's theory of program-size complexity, also known as algorithmic information theory. LISP is used to present the key algorithms and to enable computer users to interact with the author's proofs.
by Frédéric Barbaresco, Ali Mohammad-Djafari - MDPI AG
The aim of this book is to provide an overview of current work addressing topics of research that explore the geometric structures of information and entropy. This survey will motivate readers to explore the emerging domain of Science of Information.
by Keith Devlin - ESSLLI
An introductory, comparative account of three mathematical approaches to information: the classical quantitative theory of Claude Shannon, a qualitative theory developed by Fred Dretske, and a qualitative theory introduced by Barwise and Perry.
by Martin Tomlinson, et al. - Springer
This book discusses both the theory and practical applications of self-correcting data, commonly known as error-correcting codes. The applications included demonstrate the importance of these codes in a wide range of everyday technologies.
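A small, self-contained example of the idea behind error-correcting codes is the classic Hamming(7,4) code, which stores 4 data bits in 7 bits and can repair any single flipped bit. This sketch is a generic illustration of the technique, not code from the book:

```python
def hamming74_encode(nibble):
    """Encode 4 data bits d1..d4 into a 7-bit codeword with 3 parity bits."""
    d1, d2, d3, d4 = nibble
    p1 = d1 ^ d2 ^ d4  # covers codeword positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4  # covers codeword positions 2,3,6,7
    p3 = d2 ^ d3 ^ d4  # covers codeword positions 4,5,6,7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(code):
    """Correct at most one flipped bit via the parity syndrome, return d1..d4."""
    c = list(code)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3  # syndrome gives the 1-based error position; 0 = clean
    if pos:
        c[pos - 1] ^= 1  # flip the corrupted bit back
    return [c[2], c[4], c[5], c[6]]

word = [1, 0, 1, 1]
sent = hamming74_encode(word)
received = list(sent)
received[4] ^= 1  # a single bit flipped in transit
print(hamming74_decode(received) == word)  # True
```

The three parity bits are placed at positions 1, 2, and 4 so that the syndrome, read as a binary number, directly names the position of the corrupted bit; this self-locating property is what makes the data "self-correcting" in the sense the book describes.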