Gaussian Processes for Machine Learning
by Carl E. Rasmussen, Christopher K. I. Williams
Publisher: The MIT Press 2005
Number of pages: 266
Gaussian processes (GPs) provide a principled, practical, probabilistic approach to learning in kernel machines. The book deals with the supervised-learning problem for both regression and classification, and includes detailed algorithms. A wide variety of covariance (kernel) functions is presented and their properties discussed. Model selection is treated from both a Bayesian and a classical perspective. Many connections to other well-known techniques from machine learning and statistics are discussed, including support vector machines, neural networks, splines, regularization networks, relevance vector machines and others.
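As a taste of the detailed algorithms the book provides, here is a minimal NumPy sketch of GP regression with a squared-exponential kernel, following the standard Cholesky-based predictive recipe (Algorithm 2.1 in the book). The kernel hyperparameters, noise level, and test inputs below are illustrative choices, not values from the text:

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance: k(a, b) = s^2 * exp(-(a - b)^2 / (2 l^2))."""
    d2 = (A[:, None] - B[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_posterior(X, y, Xs, noise=1e-4):
    """Posterior predictive mean and variance of a zero-mean GP at test inputs Xs."""
    K = rbf_kernel(X, X) + noise * np.eye(len(X))   # training covariance + noise
    Ks = rbf_kernel(X, Xs)                          # train-test cross-covariance
    Kss = rbf_kernel(Xs, Xs)                        # test covariance
    # Cholesky factorization for a stable solve of K alpha = y
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(Kss - v.T @ v)
    return mean, var

# Toy 1-D regression problem (illustrative data)
X = np.array([-2.0, 0.0, 1.5])
y = np.sin(X)
Xs = np.linspace(-3, 3, 7)
mean, var = gp_posterior(X, y, Xs)
```

With a small noise term, the posterior mean interpolates the training targets almost exactly, while the predictive variance grows away from the data, which is the behavior the book's regression chapter analyzes in detail.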
by David Barber - Cambridge University Press
The book is designed for final-year undergraduate students with limited background in linear algebra and calculus. Comprehensive and coherent, it develops everything from basics to advanced techniques within the framework of graphical models.
by Jonas Buchli, et al. - arXiv.org
The starting point is the formulation of an optimal control problem, from which the different types of solutions and algorithms are derived. These lecture notes aim to support this unified view with a unified notation wherever possible.
by A. Goldenberg, A.X. Zheng, S.E. Fienberg, E.M. Airoldi - arXiv
We begin with the historical development of statistical network modeling and then we introduce some examples in the network literature. Our subsequent discussion focuses on prominent static and dynamic network models and their interconnections.
by David J. C. MacKay - Cambridge University Press
A textbook on information theory, Bayesian inference and learning algorithms, useful for undergraduate and postgraduate students, and as a reference for researchers. Essential reading for students of electrical engineering and computer science.