1. Control systems and optimal control problems.

2. Linear systems. Controllability and the energy minimization problem.
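As a concrete illustration of this topic, here is a minimal numerical sketch of the Kalman rank test for controllability of a linear system ẋ = Ax + Bu. The helper name `controllability_matrix` and the double-integrator matrices are illustrative choices, not taken from the reference:

```python
import numpy as np

def controllability_matrix(A, B):
    """Build the Kalman controllability matrix [B, AB, ..., A^(n-1) B]."""
    n = A.shape[0]
    blocks = [B]
    for _ in range(n - 1):
        blocks.append(A @ blocks[-1])
    return np.hstack(blocks)

# Double integrator: x1' = x2, x2' = u
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])

C = controllability_matrix(A, B)
controllable = np.linalg.matrix_rank(C) == A.shape[0]
print(controllable)  # True: the double integrator is controllable
```

The same Gramian machinery behind this test also yields the minimum-energy control steering between two given states, which is the second half of the topic.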

3. Vector fields, flows and Lie brackets. Elements of the chronological calculus.
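A small symbolic sketch of the Lie bracket of two vector fields, computed in coordinates as [f, g] = (Dg)f − (Df)g. The drive-and-turn fields of a unicycle used below are an illustrative choice, not taken from the reference:

```python
import sympy as sp

x, y, th = sp.symbols('x y theta')
q = sp.Matrix([x, y, th])

def lie_bracket(f, g, q):
    """Commutator of vector fields in coordinates: [f, g] = (Dg) f - (Df) g."""
    return g.jacobian(q) * f - f.jacobian(q) * g

# Unicycle on R^2 x S^1: f drives forward, g turns in place
f = sp.Matrix([sp.cos(th), sp.sin(th), 0])
g = sp.Matrix([0, 0, 1])

br = sp.simplify(lie_bracket(f, g, q))
print(br.T)  # Matrix([[sin(theta), -cos(theta), 0]])
```

Note that f, g, and [f, g] span the tangent space at every point, so this pair of fields is bracket-generating; that observation is exactly what the orbit theorems of topic 4 exploit.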

4. Orbits and attainable sets. Nagano–Sussmann and Krener theorems.

5. Relaxation techniques and hidden convexity.

6. Existence of optimal control.

7. Elements of symplectic geometry and Pontryagin Maximum Principle.

8. Some model optimal control problems: a particle on the line, the harmonic oscillator, and the Markov–Dubins and Euler interpolation problems.

9. Linear time-optimal problem.

10. Fields of extremals and sufficient optimality conditions.

11. Optimal cost: the Hamilton–Jacobi–Bellman equation.
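For a linear system with quadratic cost, the Hamilton–Jacobi–Bellman equation admits a quadratic value function and reduces to an algebraic Riccati equation. The sketch below solves it for an infinite-horizon LQR problem; the matrices and weights are illustrative, and SciPy is assumed to be available:

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Double integrator with quadratic state and control costs.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])
Q = np.eye(2)   # state cost weight
R = np.eye(1)   # control cost weight

# P solves A^T P + P A - P B R^{-1} B^T P + Q = 0,
# and V(x) = x^T P x is the optimal cost (value function).
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)  # optimal feedback u = -K x

# The closed-loop matrix A - B K is Hurwitz.
eigs = np.linalg.eigvals(A - B @ K)
print(np.all(eigs.real < 0))
```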

12. Second variation, conjugate points and Jacobi equation.

13. Singular extremals: Goh and generalized Legendre conditions.

14. Bang-bang extremals: discrete Jacobi equation.

15. The chattering phenomenon.

16. Curvature of optimal control problems.

**Reference:**

A. A. Agrachev, Yu. L. Sachkov, *Control Theory from the Geometric Viewpoint*. Springer, 2004.