E-books in the Information & Coding Theory category
by Karl Petersen - AMS, 2018
The aim is to review the many facets of information, coding, and cryptography, including their uses throughout history and their mathematical underpinnings. Prerequisites include high-school mathematics and a willingness to deal with unfamiliar ideas.
(6124 views)
by Martin Tomlinson, et al. - Springer, 2017
This book discusses both the theory and practical applications of self-correcting data, commonly known as error-correcting codes. The applications included demonstrate the importance of these codes in a wide range of everyday technologies.
(7056 views)
by Alexander Shen - arXiv.org, 2015
Algorithmic information theory studies description complexity and randomness. This text covers the basic notions of algorithmic information theory: Kolmogorov complexity, Solomonoff universal a priori probability, effective Hausdorff dimension, etc.
(6742 views)
by Frederic Barbaresco, Ali Mohammad-Djafari - MDPI AG, 2015
The aim of this book is to provide an overview of current work addressing topics of research that explore the geometric structures of information and entropy. This survey will motivate readers to explore the emerging domain of the Science of Information.
(7959 views)
by Venkatesan Guruswami, Atri Rudra, Madhu Sudan - University at Buffalo, 2014
Error-correcting codes are clever ways of representing data so that one can recover the original information even if parts of it are corrupted. The basic idea is to introduce redundancy so that the original information can be recovered ...
(9658 views)
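To make the redundancy idea above concrete, here is a minimal Python sketch (illustrative only, not taken from these notes) of a 3-fold repetition code: each bit is transmitted three times, and a majority vote over each block corrects any single flipped bit.

```python
# Illustrative sketch of redundancy-based error correction: a 3-fold
# repetition code decoded by majority vote (one flip per block is corrected).
def encode(bits):
    """Repeat every bit three times."""
    return [b for b in bits for _ in range(3)]

def decode(received):
    """Take a majority vote over each block of three received bits."""
    return [int(sum(received[i:i + 3]) >= 2) for i in range(0, len(received), 3)]

message = [1, 0, 1, 1]
codeword = encode(message)            # [1,1,1, 0,0,0, 1,1,1, 1,1,1]
codeword[4] ^= 1                      # corrupt one bit of the second block
assert decode(codeword) == message    # the original message is still recovered
```

Practical codes achieve the same protection with far less redundancy; the repetition code simply makes the principle visible.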
by Matt Mahoney - mattmahoney.net, 2013
This book is for the reader who wants to understand how data compression works, or who wants to write data compression software. Prior programming ability and some math skills will be needed. This book is intended to be self-contained.
(10810 views)
by Felix Effenberger - arXiv, 2013
This chapter gives a short introduction to the fundamentals of information theory, especially suited for readers with a less firm background in mathematics and probability theory. The focus is on neuroscientific topics.
(9202 views)
- Wikibooks, 2011
Data compression is useful because compressed data saves time (in reading and in transmission) and space compared to the unencoded information it represents. In this book, we describe the decompressor first.
(9672 views)
by Mark M. Wilde - arXiv, 2012
The aim of this book is to develop 'from the ground up' many of the major developments in quantum Shannon theory. We study quantum mechanics as needed for quantum information theory and present the important unit protocols of teleportation, super-dense coding, etc.
(11314 views)
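As a small companion to the unit protocols mentioned above, the following state-vector simulation (an illustrative sketch, not code from the book) runs super-dense coding: two classical bits are encoded into one qubit of a shared Bell pair and both are recovered by the receiver.

```python
# Illustrative state-vector simulation of super-dense coding.
import numpy as np

# Single-qubit gates and the two-qubit CNOT (qubit 0 = control = leftmost
# tensor factor, qubit 1 = target).
I = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

def superdense(b1, b0):
    """Send the classical bits (b1, b0) using one qubit of a shared Bell pair."""
    # Prepare the Bell state (|00> + |11>)/sqrt(2): H on qubit 0, then CNOT.
    state = CNOT @ np.kron(H, I) @ np.array([1.0, 0.0, 0.0, 0.0])
    # Alice encodes on her qubit (qubit 0): X for the low bit, Z for the high bit.
    if b0:
        state = np.kron(X, I) @ state
    if b1:
        state = np.kron(Z, I) @ state
    # Bob decodes: CNOT, then H on qubit 0, then a measurement of both qubits.
    state = np.kron(H, I) @ CNOT @ state
    outcome = int(np.argmax(np.abs(state)))   # deterministic for these states
    return outcome >> 1, outcome & 1          # (b1, b0)

for bits in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    assert superdense(*bits) == bits
print("all four two-bit messages are recovered from a single transmitted qubit")
```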
by Keith Devlin - ESSLLI, 2001
An introductory, comparative account of three mathematical approaches to information: the classical quantitative theory of Claude Shannon, a qualitative theory developed by Fred Dretske, and a qualitative theory introduced by Barwise and Perry.
(12955 views)
by Robert M. Gray - Information Systems Laboratory, 1972
The conditional rate-distortion function has proved useful in source coding problems involving the possession of side information. This book represents an early work on conditional rate distortion functions and related theory.
(9603 views)
by Peter D. Gruenwald, Paul M.B. Vitanyi - CWI, 2007
We introduce algorithmic information theory, also known as the theory of Kolmogorov complexity. We explain this quantitative approach to defining information and discuss the extent to which Kolmogorov's and Shannon's theories have a common purpose.
(10788 views)
by Gregory J. Chaitin - Springer, 2003
The final version of a course on algorithmic information theory and the epistemology of mathematics. The book discusses the nature of mathematics in the light of information theory, and sustains the thesis that mathematics is quasi-empirical.
(13126 views)
by Renato Renner - ETH Zurich, 2015
Processing of information is necessarily a physical process. It is not surprising that physics and the theory of information are inherently connected. Quantum information theory is a research area whose goal is to explore this connection.
(12762 views)
by John Watrous - University of Calgary, 2004
The focus is on the mathematical theory of quantum information. We will begin with basic principles and methods for reasoning about quantum information, and then move on to a discussion of various results concerning quantum information.
(11893 views)
by Inder Jeet Taneja - Universidade Federal de Santa Catarina, 2001
Contents: Shannon's Entropy; Information and Divergence Measures; Entropy-Type Measures; Generalized Information and Divergence Measures; M-Dimensional Divergence Measures and Their Generalizations; Unified (r,s)-Multivariate Entropies; etc.
(11039 views)
by David Feldman - College of the Atlantic, 2002
This e-book is a brief tutorial on information theory, excess entropy and statistical complexity. From the table of contents: Background in Information Theory; Entropy Density and Excess Entropy; Computational Mechanics.
(14130 views)
by Neri Merhav - arXiv, 2010
Lecture notes for a graduate course focusing on the relations between Information Theory and Statistical Physics. The course is aimed at EE graduate students in the area of Communications and Information Theory, or graduate students in Physics.
(13014 views)
by Robert H. Schumann - arXiv, 2000
A short review of ideas in quantum information theory. Quantum mechanics is presented together with some useful tools for quantum mechanics of open systems. The treatment is pedagogical and suitable for beginning graduates in the field.
(16766 views)
by Abbas El Gamal, Young-Han Kim - arXiv, 2010
Network information theory deals with the fundamental limits on information flow in networks and optimal coding and protocols. These notes provide a broad coverage of key results, techniques, and open problems in network information theory.
(14835 views)
by David J. C. MacKay - University of Cambridge, 1995
This text discusses the theorems of Claude Shannon, starting from the source coding theorem, and culminating in the noisy channel coding theorem. Along the way we will study simple examples of codes for data compression and error correction.
(13932 views)
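As one concrete number behind the noisy channel coding theorem these notes build up to, the snippet below (an illustrative sketch, not taken from the text) evaluates the capacity of a binary symmetric channel, C = 1 - H2(p) bits per channel use, for a few crossover probabilities p.

```python
# Capacity of a binary symmetric channel: C = 1 - H2(p) bits per channel use.
import numpy as np

def h2(p):
    """Binary entropy function H2(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return float(-p * np.log2(p) - (1 - p) * np.log2(1 - p))

for p in (0.0, 0.01, 0.1, 0.5):
    print(f"BSC(p={p}): capacity = {1 - h2(p):.4f} bits per channel use")
```

At p = 0.5 the channel output is independent of the input and the capacity drops to zero, which is exactly what the formula reports.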
by Robert M. Gray - Springer, 2008
The book covers the theory of probabilistic information measures and application to coding theorems for information sources and noisy channels. This is an up-to-date treatment of traditional information theory emphasizing ergodic theory.
(17481 views)
by John Daugman - University of Cambridge, 2009
The aims of this course are to introduce the principles and applications of information theory. The course will study how information is measured in terms of probability and entropy, and the relationships among conditional and joint entropies; etc.
(24043 views)
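The entropy relationships mentioned in this course description can be checked numerically; the toy example below (not from the course materials) computes H(X), H(X,Y), and H(Y|X) for a small joint distribution and verifies the chain rule H(X,Y) = H(X) + H(Y|X).

```python
# Entropy, joint entropy, and conditional entropy for a toy joint distribution.
import numpy as np

# Joint distribution p(x, y) over X in {0, 1} (rows) and Y in {0, 1} (columns).
p_xy = np.array([[0.4, 0.1],
                 [0.2, 0.3]])

def entropy(p):
    """Shannon entropy in bits of a probability array (zero entries contribute 0)."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

p_x = p_xy.sum(axis=1)                  # marginal distribution of X
H_X = entropy(p_x)
H_XY = entropy(p_xy.flatten())          # joint entropy H(X, Y)
H_Y_given_X = H_XY - H_X                # chain rule: H(Y|X) = H(X, Y) - H(X)

# Direct computation of H(Y|X) as the average entropy of the conditional rows.
H_Y_given_X_direct = sum(p_x[i] * entropy(p_xy[i] / p_x[i]) for i in range(2))

print(f"H(X)   = {H_X:.4f} bits")
print(f"H(X,Y) = {H_XY:.4f} bits")
print(f"H(Y|X) = {H_Y_given_X:.4f} bits (via the chain rule)")
print(f"H(Y|X) = {H_Y_given_X_direct:.4f} bits (computed directly)")
```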
by Raymond Yeung, S-Y Li, N Cai - Now Publishers Inc, 2006
A tutorial on the basics of the theory of network coding. It presents network coding for the transmission from a single source node, and deals with the problem under the more general circumstances when there are multiple source nodes.
(17160 views)
by Claude Shannon, 1948
Shannon presents results previously found nowhere else, and today many professors refer to it as the best exposition on the subject of the mathematical limits on communication. It laid the modern foundations for what is now known as information theory.
(61933 views)
by David J. C. MacKay - Cambridge University Press, 2003
A textbook on information theory, Bayesian inference and learning algorithms, useful for undergraduate and postgraduate students, and as a reference for researchers. Essential reading for students of electrical engineering and computer science.
(29672 views)