
Minimum information Markov model

Published 11 Jan 2026 in stat.ME (arXiv:2601.06900v1)

Abstract: The analysis of high-dimensional time series data has become increasingly important across a wide range of fields. Recently, a method for constructing the minimum information Markov kernel on finite state spaces was established. In this study, we propose a statistical model based on a parametrization of its dependence function, which we call the Minimum Information Markov Model. We show that this parametrization induces an orthogonal structure between the stationary distribution and the dependence function, and that the model arises as the optimal solution to a divergence rate minimization problem. In particular, for the Gaussian autoregressive case, we establish the existence of the optimal solution to this minimization problem, a nontrivial result requiring a rigorous proof. For parameter estimation, our approach exploits the conditional independence structure inherent in the model, which this orthogonality supports. Specifically, we develop several estimators, including conditional likelihood and pseudo-likelihood estimators, for the minimum information Markov model in both univariate and multivariate settings. We demonstrate their practical performance through simulation studies and applications to real-world time series data.
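To give a concrete sense of the conditional-likelihood idea mentioned in the abstract, the sketch below estimates the transition matrix of a plain finite-state Markov chain by maximizing the conditional likelihood given the first observation (equivalently, normalized transition counts). This is a generic illustration only; the paper's minimum information Markov model uses a specific dependence-function parametrization that is not reproduced here.

```python
import numpy as np

def markov_conditional_mle(states, n_states):
    """Conditional-likelihood MLE of a finite-state Markov transition
    matrix: P_hat[i, j] = n_ij / n_i, where n_ij is the number of
    observed i -> j transitions in the sequence `states`."""
    counts = np.zeros((n_states, n_states))
    for s, t in zip(states[:-1], states[1:]):
        counts[s, t] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    # guard against division by zero for states never visited
    row_sums[row_sums == 0] = 1.0
    return counts / row_sums

# simulate a two-state chain from a known transition matrix, then re-estimate it
rng = np.random.default_rng(0)
P = np.array([[0.9, 0.1],
              [0.3, 0.7]])
x = [0]
for _ in range(50_000):
    x.append(rng.choice(2, p=P[x[-1]]))
P_hat = markov_conditional_mle(x, 2)
print(np.round(P_hat, 2))
```

With a long simulated path, the estimated rows are stochastic (each sums to one) and converge to the true transition probabilities, which is the baseline behavior any Markov-model estimator, including the paper's, builds on.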


Authors (2)
