A primer on information theory, with applications to neuroscience
by Felix Effenberger
Publisher: arXiv 2013
Number of pages: 58
This chapter gives a short introduction to the fundamentals of information theory, suited especially (but not only) for readers with a less firm background in mathematics and probability theory. Regarding applications, the focus is on neuroscientific topics.
by Robert M. Gray - Springer
The book covers the theory of probabilistic information measures and their application to coding theorems for information sources and noisy channels. This is an up-to-date treatment of traditional information theory, with an emphasis on ergodic theory.
by David J. C. MacKay - University of Cambridge
This text discusses the theorems of Claude Shannon, starting from the source coding theorem and culminating in the noisy channel coding theorem. Along the way, it studies simple examples of codes for data compression and error correction.
by Neri Merhav - arXiv
Lecture notes for a graduate course focusing on the relations between information theory and statistical physics. The course is aimed at EE graduate students in the area of communications and information theory, as well as graduate students in physics.
by Gregory J. Chaitin - World Scientific
In this mathematical autobiography, Gregory Chaitin presents a technical survey of his work and a non-technical discussion of its significance. The technical survey contains many new results, including a detailed discussion of LISP program size.