**Optimization and Control**

by Richard Weber

**Publisher**: University of Cambridge, 2010

**Number of pages**: 70

**Description**:

Topics: Dynamic Programming; Examples of Dynamic Programming; Dynamic Programming over the Infinite Horizon; Positive Programming; Negative Programming; Bandit Processes and Gittins Index; Average-cost Programming; LQ Regulation; Controllability; Stabilizability and Observability; Kalman Filter and Certainty Equivalence; etc.

Download or read it online for free here:

**Download link** (500KB, PDF)

## Similar books

**An Introduction to Mathematical Optimal Control Theory**

by **Lawrence C. Evans** - **University of California, Berkeley**

Contents: Introduction; Controllability, bang-bang principle; Linear time-optimal control; The Pontryagin Maximum Principle; Dynamic programming; Game theory; Introduction to stochastic control theory; Proofs of the Pontryagin Maximum Principle.

(14929 views)

**Distributed-Parameter Port-Hamiltonian Systems**

by **Hans Zwart, Birgit Jacob** - **CIMPA**

Topics from the table of contents: Introduction; Homogeneous differential equation; Boundary Control Systems; Transfer Functions; Well-posedness; Stability and Stabilizability; Systems with Dissipation; Mathematical Background.

(10835 views)

**Modeling, Simulation and Optimization: Tolerance and Optimal Control**

by **Shkelzen Cakaj** - **InTech**

Topics covered: parametric representation of shapes, modeling of dynamic continuous fluid flow process, plant layout optimal plot plan, atmospheric modeling, cellular automata simulations, thyristor switching characteristics simulation, etc.

(17000 views)

**Stochastic Optimal Control: The Discrete-Time Case**

by **Dimitri P. Bertsekas, Steven E. Shreve** - **Athena Scientific**

This research monograph is an authoritative and comprehensive treatment of the mathematical foundations of stochastic optimal control of discrete-time systems, including the intricate measure-theoretic issues involved.

(14080 views)