On Generalized Likelihood Estimation Based on the Logarithmic Norm Relative Entropy
Abstract: Traditional likelihood-based methods for parameter estimation are highly affected when the given data are contaminated by outliers, even in a small proportion. In this paper, we consider a robust parameter estimation method, namely the minimum logarithmic norm relative entropy (LNRE) estimation procedure, and study different (generalized) sufficiency principles associated with it. We introduce a new two-parameter power-law family of distributions (namely, the $\mathcal{M}^{(\alpha,\beta)}$-family), which is shown to have a fixed number of sufficient statistics, independent of the sample size, with respect to the generalized likelihood function associated with the LNRE. We then obtain the generalized minimal sufficient statistic for this family and derive the generalized Rao-Blackwell theorem and the generalized Cram\'{e}r-Rao lower bound for minimum LNRE estimation. We also study the minimum LNRE estimators (MLNREEs) for the family of Student's distributions in particular detail. Our general results reduce to the classical likelihood-based results for the exponential family of distributions at specific choices of the tuning parameters $\alpha$ and $\beta$. Finally, we present simulation studies followed by a real data analysis, which highlight the practical utility of the MLNREEs for data contaminated by possible outliers. Along the way, we also correct a mistake found in a paper on the related theory of generalized likelihoods.
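The abstract's central claim is that likelihood-based estimators break down under even mild contamination, while minimum-divergence estimators do not. The LNRE objective itself is not reproduced here, so the sketch below illustrates the same robustness phenomenon with a different but closely related minimum-divergence method, the density power divergence (DPD) of Basu et al.; the function name `dpd_objective`, the contamination setup, and the grid-search minimization are all illustrative choices, not the paper's procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
# 90 clean N(0, 1) observations contaminated by 10 outliers at 10.
x = np.concatenate([rng.normal(0.0, 1.0, 90), np.full(10, 10.0)])

def dpd_objective(mu, data, alpha=0.5, sigma=1.0):
    # Density power divergence objective for a N(mu, sigma^2) model with
    # known sigma: integral of f^(1+alpha) minus (1 + 1/alpha) times the
    # sample mean of f^alpha. As alpha -> 0 this recovers the negative
    # log-likelihood, i.e. the non-robust MLE criterion.
    f = np.exp(-((data - mu) ** 2) / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
    int_term = 1.0 / ((sigma**alpha) * (2 * np.pi) ** (alpha / 2) * np.sqrt(1 + alpha))
    return int_term - (1 + 1 / alpha) * np.mean(f**alpha)

# MLE of the mean is the sample average, which is dragged toward the outliers.
mle = x.mean()
# Minimize the DPD objective by grid search; the density weights f^alpha
# automatically downweight the outliers, keeping the estimate near 0.
grid = np.linspace(-2.0, 12.0, 2801)
robust = grid[np.argmin([dpd_objective(m, x) for m in grid])]
print(f"MLE: {mle:.2f}, robust minimum-DPD estimate: {robust:.2f}")
```

The MLE lands near 1.0 (pulled by the 10% contamination at 10), while the minimum-divergence estimate stays close to the true center 0, which is the behavior the abstract attributes to the MLNREEs.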