An Introduction to Mathematical Optimal Control Theory
by Lawrence C. Evans
Publisher: University of California, Berkeley 2010
Number of pages: 126
Description: Lecture notes offering an introduction to the mathematical theory of optimal control, from controllability and the Pontryagin Maximum Principle to dynamic programming, game theory, and stochastic control.
Contents: Introduction; Controllability, bang-bang principle; Linear time-optimal control; The Pontryagin Maximum Principle; Dynamic programming; Game theory; Introduction to stochastic control theory; Proofs of the Pontryagin Maximum Principle.
Download or read it online for free here:
Download link (690KB, PDF)
Similar books
Distributed-Parameter Port-Hamiltonian Systems
by Hans Zwart, Birgit Jacob - CIMPA
Topics from the table of contents: Introduction; Homogeneous differential equation; Boundary Control Systems; Transfer Functions; Well-posedness; Stability and Stabilizability; Systems with Dissipation; Mathematical Background.
(10835 views)
Optimization and Control
by Richard Weber - University of Cambridge
Topics: Dynamic Programming; Dynamic Programming Examples; Dynamic Programming over the Infinite Horizon; Positive Programming; Negative Programming; Bandit Processes and Gittins Index; Average-cost Programming; LQ Regulation; Controllability; etc.
(12986 views)
Optimal Control: Linear Quadratic Methods
by B.D.O. Anderson, J.B. Moore - Prentice-Hall
Numerous examples highlight this treatment of linear quadratic Gaussian methods for control system design. It explores linear optimal control theory from an engineering viewpoint, with illustrations of practical applications.
(22573 views)
Stochastic Optimal Control: The Discrete-Time Case
by Dimitri P. Bertsekas, Steven E. Shreve - Athena Scientific
This research monograph is the authoritative and comprehensive treatment of the mathematical foundations of stochastic optimal control of discrete-time systems, including the intricate measure-theoretic issues involved.
(14079 views)