Mathematical control theory

Lecturer:
Course Type: PhD Course
Academic Year: 2017–2018
Period: October–February
Duration: 60 h
Description: 
  1. Vector fields and control systems. Lie brackets.
  2. Linear systems: controllability, optimization, normal forms.
  3. Elements of the chronological calculus.
  4. Intrinsic characterization of linear systems.
  5. Orbits and attainable sets. Nagano, Sussmann, and Krener theorems.
  6. Applications: control of rigid bodies and fluids.
  7. Feedback (gauge) transformations.
  8. Relaxation technique: “hidden convexity”.
  9. Optimal control problem. Existence of solution.
  10. Elements of symplectic geometry and Pontryagin Maximum Principle.
  11. Some model optimal control problems: a particle on the line, oscillator, Markov–Dubins and Euler interpolation problems.
  12. Fields of extremals and sufficient optimality conditions.
  13. Optimal cost: Hamilton–Jacobi–Bellman equation.
  14. Second variation, conjugate points and Jacobi equation.
  15. Singular extremals: Goh and generalized Legendre conditions.
  16. The chattering phenomenon.
  17. Curvature of optimal control problems.

Reference:

A. Agrachev, Yu. Sachkov, Control Theory from the Geometric Viewpoint. Springer, 2004.

Location: A-133