
$\beta$-Divergence loss for kernel density estimation with bias reduction

Published 25 Mar 2019 in stat.ME | (1903.10462v1)

Abstract: Although nonparametric kernel density estimation with bias reduction is nowadays a standard technique in exploratory data analysis, there is still considerable dispute over how to assess the quality of the estimate and which choice of bandwidth is optimal. This article examines the most important bandwidth selection methods for kernel density estimation with bias reduction, in particular normal reference, least squares cross-validation, biased cross-validation, and $\beta$-divergence loss. The methods are described and expressions are presented. We compare these various bandwidth selectors on simulated data. As an example of real data, we use econometric data sets: CO2 per capita in example 1 and the second
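As a rough illustration of one of the selectors named in the abstract, the following sketch implements the normal-reference (Silverman rule-of-thumb) bandwidth together with a plain Gaussian kernel density estimate. This is a generic textbook version, not the paper's bias-reduced estimator; the function names and the choice of NumPy are assumptions for the example.

```python
import numpy as np

def normal_reference_bandwidth(x):
    """Silverman's normal-reference rule of thumb for a Gaussian kernel:
    h = 0.9 * min(sample std, IQR / 1.34) * n**(-1/5).
    """
    n = len(x)
    std = np.std(x, ddof=1)
    q75, q25 = np.percentile(x, [75, 25])
    return 0.9 * min(std, (q75 - q25) / 1.34) * n ** (-1 / 5)

def gaussian_kde(x, data, h):
    """Evaluate a Gaussian-kernel density estimate with bandwidth h
    at the points x (no bias reduction applied)."""
    u = (x[:, None] - data[None, :]) / h
    kernels = np.exp(-0.5 * u ** 2) / np.sqrt(2 * np.pi)
    return kernels.sum(axis=1) / (len(data) * h)

# Toy usage: estimate the density of a standard normal sample.
rng = np.random.default_rng(0)
data = rng.normal(size=500)
h = normal_reference_bandwidth(data)
grid = np.linspace(-3, 3, 61)
density = gaussian_kde(grid, data, h)
```

The cross-validation selectors compared in the paper instead choose h by minimizing an estimated risk over a grid of candidate bandwidths, rather than plugging in a normal-theory constant as above.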
