The maximum principle for discrete-time control systems and applications to dynamic games
Abstract: We study deterministic nonstationary discrete-time optimal control problems over both finite and infinite horizons. Using Gateaux differentials as a natural setting for deriving first-order conditions, we prove a discrete-time maximum principle analogous to the well-known continuous-time maximum principle. We show that this maximum principle, together with a transversality condition, is a necessary condition for optimality, and that it is also sufficient under additional hypotheses. Finally, we apply the discrete-time maximum principle to derive the discrete-time Euler equation and to characterize Nash equilibria of discrete-time dynamic games.
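For orientation, a standard statement of the discrete-time maximum principle takes the following form; the symbols below (dynamics $f_t$, running reward $r_t$, horizon $T$) are generic notation assumed for illustration, not necessarily the paper's own:

```latex
% Sketch of the discrete-time maximum principle (generic notation, not
% necessarily the paper's): maximize \sum_{t=0}^{T-1} r_t(x_t, u_t)
% subject to x_{t+1} = f_t(x_t, u_t), x_0 given.
%
% Define the Hamiltonian with adjoint (costate) sequence (\lambda_t):
\[
  H_t(x, u, \lambda) \;=\; r_t(x, u) \;+\; \lambda^{\top} f_t(x, u).
\]
% Necessary conditions along an optimal pair (x_t^*, u_t^*):
\begin{align*}
  x_{t+1}^* &= f_t(x_t^*, u_t^*),
    && \text{(state equation)} \\
  \lambda_t &= \frac{\partial H_t}{\partial x}\bigl(x_t^*, u_t^*, \lambda_{t+1}\bigr),
    && \text{(adjoint equation)} \\
  u_t^* &\in \arg\max_{u} \; H_t\bigl(x_t^*, u, \lambda_{t+1}\bigr),
    && \text{(maximum condition)} \\
  \lambda_T &= 0 \quad \text{(finite horizon, free endpoint)},
    && \text{(transversality)}
\end{align*}
% In the infinite-horizon case, the terminal condition is replaced by a
% limiting transversality condition, e.g. \lim_{t\to\infty}\lambda_t = 0
% (under suitable hypotheses).
```

Eliminating the adjoint variable from these conditions, when $f_t$ and $r_t$ are smooth, yields the discrete-time Euler equation mentioned in the abstract.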