Logic and Information
by Keith Devlin
Publisher: ESSLLI 2001
An introductory, comparative account of three mathematical approaches to information: Claude Shannon's classical quantitative theory, developed in the 1940s and 1950s; the quantitatively grounded, qualitative theory developed by Fred Dretske in the 1970s; and the qualitative theory introduced by Jon Barwise and John Perry in the early 1980s and pursued by Barwise, Israel, Devlin, Seligman and others through the 1990s.
by Inder Jeet Taneja - Universidade Federal de Santa Catarina
Contents: Shannon's Entropy; Information and Divergence Measures; Entropy-Type Measures; Generalized Information and Divergence Measures; M-Dimensional Divergence Measures and Their Generalizations; Unified (r,s)-Multivariate Entropies; etc.
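The first two topics listed, Shannon's entropy and divergence measures, can be illustrated with a minimal sketch (the function names here are illustrative, not taken from the book):

```python
import math

def shannon_entropy(p):
    """H(p) = -sum_i p_i * log2(p_i); terms with p_i = 0 contribute 0."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """D(p || q) = sum_i p_i * log2(p_i / q_i), a basic divergence measure."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

print(shannon_entropy([0.5, 0.5]))            # 1.0 bit: a fair coin
print(kl_divergence([0.5, 0.5], [0.9, 0.1]))  # positive whenever p != q
```

Generalized entropies such as the unified (r,s)-multivariate families treated later in the book reduce to these quantities as special cases.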
by David J. C. MacKay - University of Cambridge
This text discusses the theorems of Claude Shannon, starting from the source coding theorem, and culminating in the noisy channel coding theorem. Along the way we will study simple examples of codes for data compression and error correction.
Data compression is useful in some situations because compressed data saves time (in reading and in transmission) and space compared to the unencoded information it represents. In this book, we describe the decompressor first.
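The space saving is easy to see on redundant data; a minimal sketch using Python's standard `zlib` (not a method from the book itself):

```python
import zlib

# Highly redundant text compresses well; near-random data would not.
text = b"information theory " * 100

compressed = zlib.compress(text, level=9)
print(len(text), len(compressed))  # the compressed form is far smaller

# Decompression recovers the original exactly: lossless source coding.
assert zlib.decompress(compressed) == text
```

The lossless round trip is the essential point: the source coding theorem bounds how small `compressed` can be in terms of the entropy of the source.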
by Robert H. Schumann - arXiv
A short review of ideas in quantum information theory. Quantum mechanics is presented together with some useful tools for quantum mechanics of open systems. The treatment is pedagogical and suitable for beginning graduates in the field.