**Stochastic Optimal Control: The Discrete-Time Case**

by Dimitri P. Bertsekas, Steven E. Shreve

**Publisher**: Athena Scientific (1996)

**ISBN/ASIN**: 1886529035

**Number of pages**: 331

**Description**:

This research monograph is an authoritative and comprehensive treatment of the mathematical foundations of stochastic optimal control of discrete-time systems, including the intricate measure-theoretic issues involved.

Download or read it online for free here: **Download link** (multiple PDF files)

## Similar books

**An Introduction to Mathematical Optimal Control Theory**

by **Lawrence C. Evans** - **University of California, Berkeley**

Contents: Introduction; Controllability, bang-bang principle; Linear time-optimal control; The Pontryagin Maximum Principle; Dynamic programming; Game theory; Introduction to stochastic control theory; Proofs of the Pontryagin Maximum Principle.

(8717 views)

**Distributed-Parameter Port-Hamiltonian Systems**

by **Hans Zwart, Birgit Jacob** - **CIMPA**

Topics from the table of contents: Introduction; Homogeneous differential equation; Boundary Control Systems; Transfer Functions; Well-posedness; Stability and Stabilizability; Systems with Dissipation; Mathematical Background.

(6325 views)

**Optimal Control: Linear Quadratic Methods**

by **B.D.O. Anderson, J.B. Moore** - **Prentice-Hall**

Numerous examples highlight this treatment of the use of linear quadratic Gaussian methods for control system design. It explores linear optimal control theory from an engineering viewpoint, with illustrations of practical applications.

(14292 views)

**Linear Optimal Control**

by **B.D.O. Anderson, J.B. Moore** - **Prentice Hall**

This book constructs a bridge between the familiar classical control results and those of modern control theory. Many modern control results do have practical engineering significance, as distinct from applied mathematical significance.

(9738 views)