**Information Theory and Statistical Physics**

by Neri Merhav

**Publisher**: arXiv, 2010

**Number of pages**: 176

**Description**:

This document consists of lecture notes for a graduate course on the relations between Information Theory and Statistical Physics. The course is aimed at EE graduate students in the area of Communications and Information Theory, as well as at graduate students in Physics who have a basic background in Information Theory. Strong emphasis is placed on the analogy and parallelism between Information Theory and Statistical Physics, and on the insights, analysis tools, and techniques that can be borrowed from Statistical Physics and 'imported' to certain problem areas in Information Theory.

Download or read it online for free here:

**Download link**

(1.2MB, PDF)

## Similar books

**Quantum Information Theory**

by **Robert H. Schumann** - **arXiv**

A short review of ideas in quantum information theory. Quantum mechanics is presented together with some useful tools for quantum mechanics of open systems. The treatment is pedagogical and suitable for beginning graduates in the field.

(**9600** views)

**Information Theory, Inference, and Learning Algorithms**

by **David J. C. MacKay** - **Cambridge University Press**

A textbook on information theory, Bayesian inference, and learning algorithms, useful for undergraduate and postgraduate students, and as a reference for researchers. Essential reading for students of electrical engineering and computer science.

(**13130** views)

**Information Theory and Coding**

by **John Daugman** - **University of Cambridge**

The aims of this course are to introduce the principles and applications of information theory. The course studies how information is measured in terms of probability and entropy, the relationships among conditional and joint entropies, and related topics.

(**11383** views)
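As a quick illustration of the idea in the blurb above, that information is measured in terms of probability and entropy, here is a minimal sketch (not part of any listed book; the function name and probabilities are illustrative):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H(X) = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: exactly 1 bit per toss.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so its entropy is lower (about 0.47 bits).
print(entropy([0.9, 0.1]))
```

Conditional and joint entropies, which the course blurb also mentions, are built from the same sum by applying it to joint and conditional distributions.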

**Around Kolmogorov Complexity: Basic Notions and Results**

by **Alexander Shen** - **arXiv.org**

Algorithmic information theory studies description complexity and randomness. This text covers the basic notions of the field: Kolmogorov complexity, Solomonoff's universal a priori probability, effective Hausdorff dimension, and more.

(**587** views)