Least informative distributions in Maximum q-log-likelihood estimation
Abstract: We use Maximum $q$-log-likelihood estimation with least informative distributions (LIDs) to estimate the parameters of probability density functions (PDFs) efficiently and robustly when the data include outlier(s). LIDs are derived from convex combinations of two PDFs, $f_\epsilon=(1-\epsilon)f_0+\epsilon f_1$, where $f_1$ represents contamination (the outliers) added to the underlying distribution $f_0$, and $f_\epsilon$ is the resulting contaminated distribution. The optimality criterion is obtained by minimizing the change in the Maximum $q$-log-likelihood function when the data become slightly more contaminated. In this paper, we compare ordinary Maximum likelihood estimation, Maximum $q$-log-likelihood estimation, LIDs based on $\log_q$, and Huber M-estimation. Akaike and Bayesian information criteria (AIC and BIC) based on $\log_q$ and LIDs are proposed to assess the fitting performance of the estimating functions. Real data sets are used to test the fitting performance of estimating functions that include shape, scale and location parameters.
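The abstract's ingredients can be illustrated with a minimal sketch: the Tsallis $q$-logarithm $\log_q(x)=(x^{1-q}-1)/(1-q)$ (which recovers $\log x$ as $q\to 1$), a sample drawn from an $\epsilon$-contaminated model $f_\epsilon=(1-\epsilon)f_0+\epsilon f_1$, and a location estimate that maximizes the $q$-log-likelihood. This is not the paper's exact procedure (the distributions, $q$ value, and optimizer here are illustrative assumptions); it only shows why $q<1$ downweights low-density observations and yields a more robust estimate than the ordinary MLE.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def log_q(x, q):
    """Tsallis q-logarithm; reduces to np.log(x) as q -> 1."""
    if np.isclose(q, 1.0):
        return np.log(x)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def mlqe_normal_location(data, q, sigma=1.0):
    """Maximize the q-log-likelihood of N(mu, sigma^2) over mu.

    For q < 1 the estimating equation weights each point by
    f(x; mu)^(1-q), so low-density points (outliers) count less.
    """
    neg_qll = lambda mu: -np.sum(log_q(norm.pdf(data, loc=mu, scale=sigma), q))
    return minimize(neg_qll, x0=np.median(data), method="Nelder-Mead").x[0]

# epsilon-contaminated sample: f_eps = (1 - eps) N(0, 1) + eps N(10, 1)
rng = np.random.default_rng(0)
eps, n = 0.05, 500
outlier = rng.random(n) < eps
data = np.where(outlier, rng.normal(10.0, 1.0, n), rng.normal(0.0, 1.0, n))

mle = data.mean()                        # ordinary MLE, pulled toward the outliers
mlq = mlqe_normal_location(data, q=0.9)  # illustrative q < 1; stays near 0
```

In this setup the sample mean is dragged roughly $\epsilon\cdot 10 \approx 0.5$ away from the true location, while the $q$-log-likelihood estimate remains close to 0, since the outliers' pdf values enter only through the small weights $f^{1-q}$.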