
Online Estimation with Rolling Validation: Adaptive Nonparametric Estimation with Streaming Data

Published 18 Oct 2023 in math.ST, stat.ME, stat.ML, and stat.TH | arXiv:2310.12140v3

Abstract: Online nonparametric estimators are gaining popularity due to their efficient computation and competitive generalization abilities. An important example includes variants of stochastic gradient descent. These algorithms often take one sample point at a time and incrementally update the parameter estimate of interest. In this work, we consider model selection/hyperparameter tuning for such online algorithms. We propose a weighted rolling validation procedure, an online variant of leave-one-out cross-validation, that costs minimal extra computation for many typical stochastic gradient descent estimators and maintains their online nature. Similar to batch cross-validation, it can boost base estimators to achieve a better, adaptive convergence rate. Our analysis is straightforward, relying mainly on some general statistical stability assumptions. The simulation study underscores the significance of diverging weights in practice and demonstrates the procedure's favorable sensitivity even when there is only a slim difference between candidate estimators.
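The abstract's core idea can be illustrated with a minimal sketch. This is not the paper's implementation: the estimator class, the squared-error loss, and the polynomial weight `t ** alpha` are illustrative assumptions. Each arriving sample first scores every candidate (prediction before update, as in prequential/rolling validation, with weights that diverge in `t`), then drives one online SGD step for each candidate:

```python
import numpy as np

class SGDLinear:
    """Toy online least-squares estimator updated by one SGD step per sample
    (a stand-in for the paper's base estimators; hypothetical)."""
    def __init__(self, dim, lr):
        self.w = np.zeros(dim)
        self.lr = lr
    def predict(self, x):
        return self.w @ x
    def update(self, x, y):
        # single stochastic gradient step on the squared loss
        self.w += self.lr * (y - self.w @ x) * x

def weighted_rolling_validation(stream, candidates, alpha=0.1):
    """Sketch of weighted rolling validation over a data stream.

    For each sample, every candidate is evaluated on the new point *before*
    updating on it; errors are accumulated with a diverging weight t**alpha
    (the abstract stresses that diverging weights matter in practice).
    Returns the index of the candidate with the smallest weighted error.
    """
    errs = np.zeros(len(candidates))
    for t, (x, y) in enumerate(stream, start=1):
        w_t = t ** alpha            # diverging weight in t (assumed form)
        for j, est in enumerate(candidates):
            errs[j] += w_t * (est.predict(x) - y) ** 2  # validate first
            est.update(x, y)                            # then update online
    return int(np.argmin(errs))
```

For example, on a stream with y = 2x, a candidate with a zero step size never learns, so rolling validation selects the candidate that actually adapts. The key property preserved here is the online nature: each sample is touched once, and the extra cost per step is one prediction per candidate.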

Citations (1)


Authors (2)
