
The Black-Box Optimization Problem: Zero-Order Accelerated Stochastic Method via Kernel Approximation

Published 3 Oct 2023 in math.OC (arXiv:2310.02371v2)

Abstract: In this paper, we study the standard formulation of an optimization problem when the gradient is not available. Such a problem can be classified as a "black-box" optimization problem, since the oracle returns only the value of the objective function at the requested point, possibly corrupted by stochastic noise. Assuming convexity and higher-order smoothness of the objective function, this paper provides a zero-order accelerated stochastic gradient descent (ZO-AccSGD) method for solving this problem, which exploits the higher-order smoothness information via kernel approximation. As theoretical results, we show that the proposed ZO-AccSGD algorithm improves on the convergence results of state-of-the-art (SOTA) algorithms, namely the iteration complexity estimate. In addition, our theoretical analysis provides an estimate of the maximum allowable noise level at which the desired accuracy can still be achieved. Our theoretical results are validated both on a model function and on functions of interest in the field of machine learning. We also provide a discussion explaining the results obtained and the superiority of the proposed algorithm over SOTA algorithms for solving the original problem.
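To make the idea concrete, below is a minimal sketch of a kernel-smoothed zero-order gradient estimator combined with a momentum-accelerated update, as described in the abstract. It is not the paper's exact ZO-AccSGD: the kernel choice, step sizes, momentum schedule, and parameter names (`kernel_grad_estimate`, `zo_acc_sgd`, `h`, `alpha`, `momentum`) are illustrative assumptions standing in for the scheme analyzed in the paper.

```python
import numpy as np

def kernel_grad_estimate(f, x, h, kernel, rng):
    """Two-point kernel-smoothed gradient estimate (sketch).

    Samples a direction e uniformly on the unit sphere and a scalar
    r ~ U[-1, 1]; the weight kernel(r) is what lets such estimators
    exploit higher-order smoothness of f. The exact construction in
    the paper may differ.
    """
    d = x.size
    e = rng.normal(size=d)
    e /= np.linalg.norm(e)              # uniform direction on the unit sphere
    r = rng.uniform(-1.0, 1.0)          # scalar argument of the kernel
    delta = f(x + h * r * e) - f(x - h * r * e)
    return (d / (2.0 * h)) * delta * kernel(r) * e

def zo_acc_sgd(f, x0, steps, h=1e-3, alpha=1e-2, momentum=0.9,
               kernel=lambda r: 3.0 * r, seed=0):
    """Illustrative zero-order method with a momentum (look-ahead) step.

    Placeholder hyperparameters and a simple kernel K(r) = 3r (suitable
    for low-order smoothness) are used here purely for illustration.
    """
    rng = np.random.default_rng(seed)
    x = x0.astype(float).copy()
    v = np.zeros_like(x)
    for _ in range(steps):
        y = x + momentum * v                          # look-ahead point
        g = kernel_grad_estimate(f, y, h, kernel, rng)
        v = momentum * v - alpha * g
        x = x + v
    return x

if __name__ == "__main__":
    # Noisy quadratic as a stand-in black-box oracle.
    quad = lambda z: float(np.sum(z ** 2) + 1e-3 * np.random.randn())
    x_final = zo_acc_sgd(quad, x0=np.ones(10), steps=2000)
    print("final objective:", np.sum(x_final ** 2))
```

In this sketch, only function values of the noisy oracle are queried, matching the black-box setting; higher-order smoothness would be exploited by choosing a higher-degree kernel in place of the placeholder `3.0 * r`.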

Citations (3)
