
Reducing Contextual Stochastic Bilevel Optimization via Structured Function Approximation

Published 25 Mar 2025 in math.OC (arXiv:2503.19991v2)

Abstract: Contextual Stochastic Bilevel Optimization (CSBO) extends standard stochastic bilevel optimization (SBO) by incorporating context-dependent lower-level problems, as in hyperparameter tuning and inverse optimization. This added structure introduces significant computational challenges: solving CSBO requires solving an infinite number of lower-level problems, one for each context realization, and existing approaches either suffer from high sample complexity or rely on impractical conditional sampling oracles. We propose a reduction framework that approximates the lower-level solutions using expressive basis functions, thereby decoupling the lower-level dependence on the context and transforming CSBO into a standard SBO problem solvable using only joint samples from the context and noise distributions. Under mild assumptions, we show that this reduction preserves hypergradient accuracy and yields an $\epsilon$-stationary solution to CSBO. We relate the sample complexity of the reduced problem to simple metrics of the basis and show that using Chebyshev polynomials leads to a near-optimal complexity of $\tilde{\mathcal{O}}(\epsilon^{-3})$, matching the best-known rates for standard SBO. Empirical results on hyperparameter tuning and inverse optimization tasks demonstrate that our approach outperforms CSBO baselines in convergence, sample efficiency, and memory usage, especially in settings without conditional sampling access.
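The core idea of the reduction, approximating the context-dependent lower-level solution map with a fixed basis expansion so that only joint samples of the context are needed, can be illustrated on a toy problem. The sketch below is an assumption-laden simplification, not the paper's algorithm: it takes a one-dimensional context $c \in [-1, 1]$ and a hypothetical lower-level problem whose solution map is $y^*(c) = \sin(3c)$, then fits Chebyshev coefficients from joint samples so that evaluating $y^*$ at any context reduces to a single basis evaluation rather than a fresh lower-level solve.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Hypothetical lower-level problem: min_y (y - sin(3c))^2 for context c,
# whose exact solution map is y*(c) = sin(3c). In the CSBO setting this
# map is unknown and only joint samples (context, noisy solution) exist.
def lower_level_solution(c):
    return np.sin(3 * c)

rng = np.random.default_rng(0)
contexts = rng.uniform(-1.0, 1.0, size=200)   # joint samples of the context
targets = lower_level_solution(contexts)       # would be noisy in practice

# Fit the basis expansion y*(c) ≈ sum_k a_k T_k(c) by least squares.
degree = 8
coeffs = C.chebfit(contexts, targets, degree)

# The fitted coefficients decouple the lower level from the context:
# any new context is handled by one cheap basis evaluation.
approx = C.chebval(0.5, coeffs)
exact = lower_level_solution(0.5)
print(f"approximation error at c=0.5: {abs(approx - exact):.2e}")
```

Because $\sin(3c)$ is smooth, a low-degree Chebyshev fit is already accurate, which is the same mechanism behind the near-optimal complexity the abstract attributes to Chebyshev bases; the actual framework fits the basis coefficients inside the bilevel optimization rather than by a separate regression.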
