Information Theory, Inference, and Learning Algorithms

Publisher: Cambridge University Press
ISBN/ASIN: 0521642981
ISBN-13: 9780521642989
Number of pages: 640

Information theory and inference, often taught separately, are here united in one entertaining textbook. These topics lie at the heart of many exciting areas of contemporary science and engineering: communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics, and cryptography. This textbook introduces theory in tandem with applications. Information theory is taught alongside practical communication systems, such as arithmetic coding for data compression and sparse-graph codes for error correction. A toolbox of inference techniques, including message-passing algorithms, Monte Carlo methods, and variational approximations, is developed alongside applications of these tools to clustering, convolutional codes, independent component analysis, and neural networks.
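
As a small taste of the kind of material covered (this is an illustrative sketch, not code from the book), the following Python snippet computes the Shannon entropy of a discrete source, the quantity that sets the compression limits studied in the text; the four-symbol distribution is invented for the example.

    from math import log2

    def entropy(probs):
        # Shannon entropy H(X) = -sum_x p(x) * log2 p(x), in bits per symbol.
        return -sum(p * log2(p) for p in probs if p > 0)

    # Hypothetical four-symbol source; no lossless code can average fewer than H bits per symbol.
    source = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
    print(f"H(X) = {entropy(source.values()):.3f} bits")   # prints H(X) = 1.750 bits

An arithmetic coder, one of the compression schemes treated in the book, approaches this bound in practice.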

Download or read it online for free (multiple formats).

Similar books

Exploring Randomness
by - Springer
This book presents the core of Chaitin's theory of program-size complexity, also known as algorithmic information theory. LISP is used to present the key algorithms and to enable computer users to interact with the author's proofs.
Lecture Notes on Network Information Theory
by - arXiv
Network information theory deals with the fundamental limits on information flow in networks and optimal coding and protocols. These notes provide a broad coverage of key results, techniques, and open problems in network information theory.
A Short Course in Information Theory
by - University of Cambridge
This text discusses the theorems of Claude Shannon, starting from the source coding theorem, and culminating in the noisy channel coding theorem. Along the way we will study simple examples of codes for data compression and error correction.
Logic and Information
An introductory, comparative account of three mathematical approaches to information: the classical quantitative theory of Claude Shannon, a qualitative theory developed by Fred Dretske, and a qualitative theory introduced by Barwise and Perry.