Boosting: Foundations and Algorithms
by Robert E. Schapire, Yoav Freund
Publisher: The MIT Press 2014
Number of pages: 544
Boosting is an approach to machine learning based on the idea of creating a highly accurate predictor by combining many weak and inaccurate 'rules of thumb'. A remarkably rich theory has evolved around boosting, with connections to a range of topics, including statistics, game theory, convex optimization, and information geometry.
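The idea of combining weak "rules of thumb" into an accurate predictor can be illustrated with a minimal sketch of AdaBoost, the algorithm at the heart of the book. This is an assumption-laden toy example, not code from the book: the dataset is invented, and simple threshold classifiers ("decision stumps") stand in for the weak rules.

```python
import math

# Tiny illustrative dataset (not from the book): 1-D points with +/-1 labels.
# No single threshold rule classifies it perfectly, so boosting must combine
# several weak "rules of thumb".
X = [1.0, 2.0, 3.0, 4.0]
y = [-1, 1, 1, -1]

def stumps(X):
    """Candidate weak rules: threshold classifiers in both orientations."""
    for t in sorted(set(X)):
        for s in (1, -1):
            yield lambda x, t=t, s=s: s if x < t else -s

def adaboost(X, y, rounds):
    n = len(X)
    w = [1.0 / n] * n          # start with uniform example weights
    ensemble = []              # list of (alpha, weak_rule) pairs
    for _ in range(rounds):
        # Choose the weak rule with the smallest weighted training error.
        best, best_err = None, None
        for h in stumps(X):
            err = sum(wi for wi, xi, yi in zip(w, X, y) if h(xi) != yi)
            if best_err is None or err < best_err:
                best, best_err = h, err
        best_err = max(best_err, 1e-12)          # guard against log(0)
        alpha = 0.5 * math.log((1 - best_err) / best_err)
        ensemble.append((alpha, best))
        # Re-weight: misclassified points gain weight, correct ones lose it.
        w = [wi * math.exp(-alpha * yi * best(xi))
             for wi, xi, yi in zip(w, X, y)]
        total = sum(w)
        w = [wi / total for wi in w]
    return ensemble

def predict(ensemble, x):
    """Weighted majority vote of the weak rules."""
    return 1 if sum(a * h(x) for a, h in ensemble) >= 0 else -1

model = adaboost(X, y, rounds=3)
errors = sum(predict(model, xi) != yi for xi, yi in zip(X, y))
print(errors)  # 0: three weak rules combine into a perfect classifier here
```

Each round re-weights the training examples so the next weak rule focuses on the points the current ensemble gets wrong, which is exactly the mechanism the book develops in depth.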
by Aaron Hertzmann - University of Toronto
Contents: Introduction to Machine Learning; Linear Regression; Nonlinear Regression; Quadratics; Basic Probability Theory; Probability Density Functions; Estimation; Classification; Gradient Descent; Cross Validation; Bayesian Methods; and more.
by Ratnadip Adhikari, R. K. Agrawal - arXiv
This work presents a concise description of some popular time series forecasting models used in practice, along with their salient features. We describe three important classes of time series models, viz. the stochastic, neural network, and SVM-based models.
by A. Goldenberg, A.X. Zheng, S.E. Fienberg, E.M. Airoldi - arXiv
We begin with the historical development of statistical network modeling and then introduce some examples from the network literature. Our subsequent discussion focuses on prominent static and dynamic network models and their interconnections.
by Jonas Buchli, et al. - arXiv.org
The starting point is the formulation of an optimal control problem, from which the different types of solutions and algorithms are derived. These lecture notes aim to support this unified view with a unified notation wherever possible.