Mathematics Colloquia and Seminars

Local Solution of the Dynamic Programming Equations in Discrete Time

Student-Run Research Seminar

Speaker: Carmeliza Navasca, UC Davis
Location: 693 Kerr
Start time: Mon, Oct 15 2001, 10:00AM

We present a method for solving the Dynamic Programming Equations (DPE) that arise in an infinite-horizon optimal control problem. The method extends Al'brecht's procedure for locally approximating the solution of the Hamilton-Jacobi-Bellman PDE to discrete time. Assuming that the dynamics and cost are C^(r-1)(R^(n+m)) and C^r(R^(n+m)) smooth, respectively, we explicitly find expansions of the optimal control and the optimal cost in a term-by-term fashion. These finite expansions consist of the first r-1 terms and the first r terms of the power series expansions of the optimal control and the optimal cost, respectively. Once the formal solutions to the DPE are found, we prove the existence of smooth solutions to the DPE that have the same Taylor series expansions as the formal solutions. The Pontryagin Maximum Principle provides the nonlinear Hamiltonian dynamics associated with the optimal control problem. We exploit the eigenstructure of the Hamiltonian matrix and its symplectic properties, which aid in finding the graph of the gradient of the optimal cost. Furthermore, the Local Stable Manifold Theorem, Stokes' Theorem, and the Implicit Function Theorem are among the main tools used to show that the optimal cost and the optimal control exist and satisfy the DPE.
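
To fix notation, here is a minimal sketch of the setting, under the illustrative assumptions of discrete-time dynamics x_{k+1} = f(x_k, u_k), running cost l(x, u), and an equilibrium at the origin (these specific forms are not stated in the abstract and are only meant to indicate the shape of the construction). The infinite-horizon DPE and the associated optimal feedback can be written as

    V(x) = \min_{u} \{\, l(x,u) + V(f(x,u)) \,\}, \qquad
    u^{*}(x) = \operatorname*{arg\,min}_{u} \{\, l(x,u) + V(f(x,u)) \,\},

and a term-by-term construction in the spirit of Al'brecht's procedure posits formal power series about x = 0,

    V(x) = \tfrac{1}{2} x^{\top} P x + V_{3}(x) + \cdots + V_{r}(x) + \cdots, \qquad
    u^{*}(x) = K x + u_{2}(x) + \cdots + u_{r-1}(x) + \cdots,

where V_d and u_d are homogeneous of degree d. Matching terms of like degree in the DPE determines the quadratic/linear pair (P, K) from the Riccati equation of the linearized problem and then yields each higher-degree term in turn; truncating gives finite expansions of the kind described above.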