
A multivariate heavy-tailed integer-valued GARCH process with EM algorithm-based inference

Published 30 Jun 2023 in stat.CO (arXiv:2306.17776v1)

Abstract: A new multivariate integer-valued Generalized AutoRegressive Conditional Heteroscedastic (GARCH) process based on a multivariate Poisson generalized inverse Gaussian distribution is proposed. Maximum likelihood estimation of the parameters of this multivariate heavy-tailed count time series model is challenging because the likelihood function involves a Bessel function that depends on the multivariate counts and their dimension; as a consequence, numerical instability is often experienced in optimization procedures. To overcome this computational problem, two feasible variants of the Expectation-Maximization (EM) algorithm are proposed for estimating the model parameters in low- and high-dimensional settings. These EM algorithm variants are computationally convenient and avoid the difficult direct optimization of the likelihood function of the proposed model. The model and the proposed estimation procedures handle several features: modeling of multivariate counts, heavy-tailedness, overdispersion, accommodation of outliers, both positive and negative autocorrelations, estimation of cross-/contemporaneous correlation, and parameter estimation that is efficient from both statistical and computational points of view. Extensive Monte Carlo simulation studies assess the performance of the proposed EM algorithms, and an application to modeling bivariate count time series data on cannabis possession-related offenses in Australia is discussed.
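The numerical instability the abstract alludes to can be illustrated with the modified Bessel function of the second kind, K_ν, which appears in Poisson generalized inverse Gaussian likelihoods: evaluated naively, it underflows to zero for large arguments, so a direct likelihood optimizer sees log-likelihood terms of -inf. The following is a minimal sketch, not code from the paper, assuming SciPy's `kv` and its exponentially scaled variant `kve`:

```python
import numpy as np
from scipy.special import kv, kve  # modified Bessel K_nu; kve is the exp(x)-scaled version


def log_kv(nu, x):
    """Stable log K_nu(x).

    kv(nu, x) decays like exp(-x), so it underflows for large x and
    log(kv(nu, x)) returns -inf. Since kve(nu, x) = exp(x) * kv(nu, x)
    stays representable, log K_nu(x) = log(kve(nu, x)) - x is finite.
    """
    return np.log(kve(nu, x)) - x


x = 800.0
print(kv(2.5, x))      # naive evaluation underflows to 0.0
print(log_kv(2.5, x))  # finite, about -803.1
```

Working with log K_ν terms in this way keeps a Bessel-based log-likelihood finite; the EM variants proposed in the paper are motivated by the same kind of instability in direct optimization.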

Citations (2)
