Large and moderate deviation principles for averaged stochastic approximation method for the estimation of a regression function
Abstract: In this paper we prove large deviations principles for the averaged stochastic approximation method for the estimation of a regression function introduced by A. Mokkadem et al. [Revisiting Révész's stochastic approximation method for the estimation of a regression function, ALEA Lat. Am. J. Probab. Math. Stat. 6 (2009), 63-114]. We show that the averaged stochastic approximation algorithm constructed with the weight sequence that minimizes the asymptotic variance satisfies the same pointwise LDP as the Nadaraya-Watson kernel estimator. Moreover, we establish a moderate deviations principle for these estimators. It turns out that the rate function obtained in the moderate deviations principle for this averaged stochastic approximation algorithm is larger than the one obtained for the Nadaraya-Watson estimator and the one obtained for the semi-recursive estimator.
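To fix ideas, the two estimators compared in the abstract can be sketched as follows. This is a minimal illustrative implementation, not the paper's exact construction: it assumes a Révész-type recursion r_k(x) = r_{k-1}(x) + γ_k h_k^{-1} K((x - X_k)/h_k) (Y_k - r_{k-1}(x)) with Gaussian kernel K, illustrative step sizes γ_k = 1/k and bandwidths h_k = k^{-1/5}, and a simple averaging of the later iterates; the paper's optimal weight sequence differs.

```python
import numpy as np

def gaussian_kernel(u):
    """Standard Gaussian kernel K(u)."""
    return np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)

def nadaraya_watson(x0, X, Y, h):
    """Nadaraya-Watson kernel regression estimate of r(x0) = E[Y | X = x0]."""
    w = gaussian_kernel((x0 - X) / h)
    return np.sum(w * Y) / np.sum(w)

def averaged_recursive_estimate(x0, X, Y):
    """Revesz-type stochastic approximation with iterate averaging.

    Illustrative choices (NOT the paper's optimal weight sequence):
    step size gamma_k = 1/k, bandwidth h_k = k**(-1/5),
    and averaging over the second half of the iterates.
    """
    n = len(X)
    r = 0.0
    running_sum, count = 0.0, 0
    for k in range(1, n + 1):
        h_k = k ** (-0.2)
        gamma_k = 1.0 / k
        w = gaussian_kernel((x0 - X[k - 1]) / h_k) / h_k
        # Move the current iterate toward Y_k when X_k falls near x0.
        r = r + gamma_k * w * (Y[k - 1] - r)
        if k > n // 2:
            running_sum += r
            count += 1
    return running_sum / count

# Example: regress Y = X^2 + noise and estimate r(0.5) = 0.25 both ways.
rng = np.random.default_rng(0)
n = 5000
X = rng.uniform(0.0, 1.0, n)
Y = X**2 + 0.1 * rng.standard_normal(n)
nw = nadaraya_watson(0.5, X, Y, h=0.1)
av = averaged_recursive_estimate(0.5, X, Y)
```

With this many samples both estimates land close to the true value r(0.5) = 0.25; the abstract's point is that, for the variance-optimal weight sequence, the averaged recursive estimator matches the Nadaraya-Watson estimator at the level of pointwise large deviations while improving the moderate-deviations rate function.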