
New Derivation for Gaussian Mixture Model Parameter Estimation: MM Based Approach

Published 9 Jan 2020 in eess.SP (arXiv:2001.02923v1)

Abstract: In this letter, we revisit the problem of maximum likelihood estimation (MLE) of the parameters of the Gaussian Mixture Model (GMM) and present a new derivation of its parameter updates. Unlike the classical approach based on expectation-maximization (EM), the new derivation is straightforward: it does not invoke any hidden or latent variables and requires no calculation of a conditional density function. It instead relies on minorization-maximization, constructing a tight lower bound on the log-likelihood criterion. The parameter update steps obtained via the new derivation are the same as those produced by the classical EM algorithm.
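The abstract states that the MM-derived update steps coincide with the classical EM updates for the GMM. As an illustration, the sketch below implements one such update iteration in NumPy: compute per-point component responsibilities, then re-estimate the mixture weights, means, and covariances. This is a minimal, generic implementation of the standard GMM update rules, not code from the paper; all function and variable names are my own.

```python
import numpy as np

def gmm_update_step(X, weights, means, covs):
    """One GMM parameter update. In the MM view, this maximizes a tight
    lower bound on the log-likelihood; the resulting update rules are
    identical to the classical EM algorithm's E-step + M-step."""
    n, d = X.shape
    K = len(weights)

    # Responsibilities: posterior probability of each component per point.
    resp = np.zeros((n, K))
    for k in range(K):
        diff = X - means[k]
        inv = np.linalg.inv(covs[k])
        quad = -0.5 * np.sum(diff @ inv * diff, axis=1)
        const = (2 * np.pi) ** (-d / 2) * np.linalg.det(covs[k]) ** (-0.5)
        resp[:, k] = weights[k] * const * np.exp(quad)
    resp /= resp.sum(axis=1, keepdims=True)

    # Parameter updates (same formulas as EM's M-step).
    Nk = resp.sum(axis=0)
    new_weights = Nk / n
    new_means = (resp.T @ X) / Nk[:, None]
    new_covs = []
    for k in range(K):
        diff = X - new_means[k]
        new_covs.append((resp[:, k, None] * diff).T @ diff / Nk[k])
    return new_weights, new_means, np.array(new_covs)
```

Iterating this step from a rough initialization monotonically increases the log-likelihood, which is exactly the guarantee the minorization-maximization argument provides.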


Authors (2)
