A primer on information theory, with applications to neuroscience
by Felix Effenberger
Publisher: arXiv 2013
Number of pages: 58
This chapter gives a short introduction to the fundamentals of information theory; it is suited especially, though not exclusively, for readers with a less firm background in mathematics and probability theory. The applications discussed focus on neuroscientific topics.
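As a small illustration of the fundamentals such an introduction covers, the Shannon entropy of a discrete source can be estimated from sample frequencies. A minimal sketch in Python (the function name is my own, not from the book):

```python
import math
from collections import Counter

def shannon_entropy(data):
    """Estimate H(X) = -sum_x p(x) * log2 p(x), in bits, from a sample."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A fair coin carries one bit of uncertainty per flip;
# a constant source carries none.
print(shannon_entropy("HTHT"))  # 1.0
print(shannon_entropy("HHHH"))  # 0.0
```

Empirical estimates like this underlie many of the neuroscientific applications (e.g. quantifying the information a spike train carries about a stimulus).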
by Alexander Shen - arXiv.org
Algorithmic information theory studies description complexity and randomness. This text covers the basic notions of algorithmic information theory: Kolmogorov complexity, Solomonoff's universal a priori probability, effective Hausdorff dimension, etc.
by Mark M. Wilde - arXiv
The aim of this book is to develop 'from the ground up' many of the major results of quantum Shannon theory. It covers the quantum mechanics needed for quantum information theory and presents important unit protocols such as teleportation and super-dense coding.
by Frederic Barbaresco, Ali Mohammad-Djafari - MDPI AG
This book provides an overview of current research exploring the geometric structures of information and entropy, aiming to motivate readers to explore the emerging domain of the Science of Information.
by Robert M. Gray - Information Systems Laboratory
The conditional rate-distortion function has proved useful in source coding problems involving side information. This book is an early work on conditional rate-distortion functions and the related theory.