Linear Optimal Control
by B.D.O. Anderson, J.B. Moore
Publisher: Prentice Hall, 1971
Number of pages: 413
The aim of this book is to build one of the many bridges still needed between the familiar results of classical control and those of modern control theory, for both students and practicing control engineers. Many modern control results have genuine practical engineering significance, as distinct from purely mathematical significance. The emphasis throughout is heavily on linear systems.
by Hans Zwart, Birgit Jacob - CIMPA
Topics from the table of contents: Introduction; Homogeneous differential equation; Boundary Control Systems; Transfer Functions; Well-posedness; Stability and Stabilizability; Systems with Dissipation; Mathematical Background.
by Richard Weber - University of Cambridge
Topics: Dynamic Programming; Dynamic Programming Examples; Dynamic Programming over the Infinite Horizon; Positive Programming; Negative Programming; Bandit Processes and Gittins Index; Average-cost Programming; LQ Regulation; Controllability; etc.
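Two of the topics listed above, dynamic programming and LQ regulation, meet in the classic backward Riccati recursion. As a minimal sketch (not taken from the notes themselves, and using illustrative numbers), here is the finite-horizon LQ regulator for a scalar linear system, computed by dynamic programming backward from the terminal time:

```python
# Finite-horizon LQ regulation for the scalar system x[t+1] = a*x[t] + b*u[t],
# minimizing the cost  sum_t (q*x[t]**2 + r*u[t]**2),
# solved by the backward Riccati recursion of dynamic programming.
# All parameter values below are illustrative, not from the lecture notes.

def lq_gains(a, b, q, r, horizon, p_final=0.0):
    """Return the feedback gains K[0..horizon-1], where u[t] = -K[t]*x[t]."""
    p = p_final          # P at the terminal time
    gains = []
    for _ in range(horizon):
        k = (b * p * a) / (r + b * p * b)                        # stage gain
        p = q + a * p * a - (a * p * b) ** 2 / (r + b * p * b)   # Riccati step
        gains.append(k)
    gains.reverse()      # the recursion runs backward in time
    return gains

gains = lq_gains(a=1.0, b=1.0, q=1.0, r=1.0, horizon=50)
```

For a long horizon the early gains converge to the steady-state (infinite-horizon) gain; with these numbers the limiting Riccati solution is the golden ratio, giving a gain near 0.618.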
by Dimitri P. Bertsekas, Steven E. Shreve - Athena Scientific
This research monograph is an authoritative and comprehensive treatment of the mathematical foundations of stochastic optimal control of discrete-time systems, including the intricate measure-theoretic issues involved.
by B.D.O. Anderson, J.B. Moore - Prentice-Hall
Numerous examples illustrate this treatment of linear-quadratic-Gaussian (LQG) methods for control system design. It presents linear optimal control theory from an engineering viewpoint, with illustrations of practical applications.