
Unbiased Estimating Equation on Inverse Divergence and Its Conditions

Published 25 Apr 2024 in cs.IT, cs.LG, math.IT, math.ST, and stat.TH (arXiv:2404.16519v1)

Abstract: This paper focuses on the Bregman divergence generated by the reciprocal function, called the inverse divergence. For the loss function defined by a monotonically increasing function $f$ and the inverse divergence, we clarify the conditions on the statistical model and on $f$ under which the estimating equation is unbiased. Specifically, we characterize two types of statistical models, an inverse Gaussian type and a mixture of generalized inverse Gaussian type distributions, and show that the conditions on $f$ differ between the two. We also define a multi-dimensional Bregman divergence as the coordinate-wise sum of inverse divergences and extend the results to the multi-dimensional case.
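The abstract's central object can be made concrete from the standard Bregman construction. The sketch below is an illustration under an assumption, not code from the paper: it takes the generator to be the reciprocal function $\phi(x) = 1/x$ on $x > 0$, so that $D_\phi(x, y) = \phi(x) - \phi(y) - \phi'(y)(x - y)$ with $\phi'(y) = -1/y^2$, and extends it to the multi-dimensional case as the coordinate-wise sum mentioned in the abstract.

```python
def inverse_divergence(x: float, y: float) -> float:
    """Bregman divergence with generator phi(x) = 1/x, for x, y > 0.

    D(x, y) = 1/x - 1/y - (-1/y^2)(x - y) = 1/x - 1/y + (x - y)/y^2.
    Nonnegative, and zero exactly when x == y (phi is strictly convex on x > 0).
    """
    if x <= 0 or y <= 0:
        raise ValueError("inverse divergence requires positive arguments")
    return 1.0 / x - 1.0 / y + (x - y) / (y * y)


def inverse_divergence_nd(xs, ys):
    """Multi-dimensional version: the coordinate-wise sum of inverse divergences."""
    return sum(inverse_divergence(x, y) for x, y in zip(xs, ys))
```

For example, `inverse_divergence(1.0, 2.0)` evaluates to `1 - 0.5 - 0.25 = 0.25`, while the divergence of any point from itself is zero, matching the usual Bregman properties.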
