One-step corrected projected stochastic gradient descent for statistical estimation
Published 9 Jun 2023 in math.ST, stat.ML, and stat.TH | (2306.05896v2)
Abstract: A generic, fast, and asymptotically efficient method for parametric estimation is described. It is based on projected stochastic gradient descent on the log-likelihood function, corrected by a single step of the Fisher scoring algorithm. We show, theoretically and by simulations, that it is an attractive alternative to the usual stochastic gradient descent with averaging and to adaptive stochastic gradient descent.
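As a minimal illustrative sketch of the recipe the abstract describes, consider estimating a Poisson rate: run projected SGD on the per-sample log-likelihood score, then apply one Fisher scoring step at the final iterate. The step size 1/n, the projection bounds, and the Poisson model are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
lam_true = 3.0
X = rng.poisson(lam_true, size=10_000)

# Projection onto a compact parameter set [eps, B] (hypothetical bounds).
eps, B = 1e-3, 50.0
def proj(t):
    return min(max(t, eps), B)

# Projected SGD on the Poisson log-likelihood.
# Per-sample score: d/dlam log p(x | lam) = x / lam - 1.
lam = 1.0  # arbitrary starting point
for i, x in enumerate(X, start=1):
    lam = proj(lam + (1.0 / i) * (x / lam - 1.0))

# One-step Fisher scoring correction at the SGD output:
#   lam_bar = lam + I(lam)^{-1} * (mean score at lam),
# where the Fisher information for Poisson is I(lam) = 1 / lam.
score_bar = np.mean(X / lam - 1.0)
lam_bar = lam + lam * score_bar

print(lam_bar)
```

In this toy model the correction step collapses to the sample mean (the exact MLE), which illustrates why a single Fisher scoring step from a consistent pilot estimate can recover asymptotic efficiency.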