Adaptive Control: Stability, Convergence, and Robustness
by Shankar Sastry, Marc Bodson
Publisher: Prentice Hall 1994
Number of pages: 378
The objective of this book is to present the major results, analysis techniques, and new research directions in adaptive systems. The authors give a clear, conceptual presentation of adaptive methods that enables a critical evaluation of these techniques and suggests avenues of further development. The book presents the deterministic theory of identification and adaptive control, focusing on linear, continuous-time, single-input single-output systems.
Download or read it online for free (multiple PDF files).
by Eitan Altman, Bruno Gaujal, Arie Hordijk - Springer
Opening new directions in stochastic-control research, this book focuses on a wide class of control and optimization problems over sequences of integers. The theory is applied to the control of stochastic discrete-event dynamic systems.
by Andrew Whitworth - Wikibooks
An interdisciplinary engineering text that analyzes the effects and interactions of mathematical systems, intended for third- and fourth-year undergraduates in an engineering program. It covers both classical and modern control methods.
by R. Timman
These lectures present an introduction to modern control theory. The calculus of variations is used to study the problem of determining the optimal control for a deterministic system, both without constraints and with them.
by Hugh Jack
Dynamic System Modeling and Control introduces the basic concepts of system modeling with differential equations. Supplemental materials at the end of the book include a writing guide, a summary of math topics, and a table of useful engineering units.