
Measurized Markov Decision Processes

Published 6 May 2024 in math.OC (arXiv:2405.03888v4)

Abstract: In this paper, we explore lifting Markov Decision Processes (MDPs) to the space of probability measures and consider the so-called measurized MDPs: deterministic processes whose states are probability measures on the original state space and whose actions are stochastic kernels on the original action space. We show that measurized MDPs are a generalization of stochastic MDPs, so the measurized framework can be deployed without loss of fidelity. Bertsekas and Shreve studied similar deterministic MDPs under the discounted infinite-horizon criterion in the context of universally measurable policies. Here, we also consider the long-run average reward case, but we cast lifted MDPs within the semicontinuous-semicompact framework of Hernández-Lerma and Lasserre. This makes the lifted framework more accessible, as it entails (i) optimal Borel-measurable value functions and policies, (ii) reasonably mild assumptions that are easier to verify than those in the universally-measurable framework, and (iii) simpler proofs. In addition, we showcase the untapped potential of lifted MDPs by demonstrating how the measurized framework enables the incorporation of constraints and value function approximations that are not available in the standard MDP setting. Furthermore, we introduce a novel algebraic lifting procedure for any MDP, showing that non-deterministic measure-valued MDPs can emerge from lifting MDPs impacted by external random shocks.
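The central idea of the abstract — that a stochastic MDP becomes a deterministic process once states are replaced by probability measures — can be illustrated on a finite MDP. The sketch below is illustrative only and uses hypothetical data (a random transition tensor `P` and a uniform action kernel `pi`), not anything from the paper: given a measure `mu` over states and a stochastic kernel `pi` over actions, the next measure is the exact pushforward, so the lifted transition is a deterministic map on the simplex.

```python
import numpy as np

# Hypothetical finite MDP: 3 states, 2 actions (illustrative data only).
# P[s, a, t] = probability of moving from state s to state t under action a.
rng = np.random.default_rng(0)
P = rng.dirichlet(np.ones(3), size=(3, 2))   # shape (3, 2, 3); rows sum to 1

def lift_step(mu, pi, P):
    """One deterministic step of the lifted ("measurized") dynamics.

    mu : measure over original states, shape (S,)
    pi : stochastic kernel pi[s, a] = prob. of action a in state s, shape (S, A)
    Returns mu'(t) = sum_{s, a} mu(s) * pi(s, a) * P[s, a, t],
    i.e. the pushforward of mu under the kernel induced by pi and P.
    """
    return np.einsum("s,sa,sat->t", mu, pi, P)

mu0 = np.array([1.0, 0.0, 0.0])              # point mass at state 0
pi = np.full((3, 2), 0.5)                    # uniform kernel over actions
mu1 = lift_step(mu0, pi, P)                  # next measure, fully determined
```

Note that no sampling occurs: the randomness of the original MDP is absorbed into the state itself, which is the sense in which the lifted process is deterministic.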
