
Comparing Algorithm Selection Approaches on Black-Box Optimization Problems

Published 30 Jun 2023 in cs.NE | arXiv:2306.17585v1

Abstract: The performance complementarity of solvers available to tackle black-box optimization problems gives rise to the important task of algorithm selection (AS). Automated AS approaches can replace tedious and labor-intensive manual selection and have already shown promising performance in various optimization domains. Automated AS relies on machine-learning (ML) techniques to recommend the best algorithm given information about the problem instance. Unfortunately, there are no clear guidelines for choosing the most appropriate technique from the many available. Tree-based models such as Random Forest and XGBoost have consistently demonstrated outstanding performance in automated AS, and Transformers and other tabular deep-learning models are increasingly applied in this context as well. In this work we investigate the impact of the choice of ML technique on AS performance. We compare four ML models on the task of predicting the best solver for the BBOB problems across 7 different runtime budgets in 2 dimensions. While our results confirm that per-instance AS indeed has impressive potential, we also show that the particular choice of ML technique is of minor importance.
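The per-instance AS setup the abstract describes — learn a mapping from problem-instance features to the best solver — can be sketched as a standard classification task. The sketch below uses synthetic placeholder features and a made-up two-solver labeling rule purely for illustration; the paper itself uses landscape features of BBOB problems, several real solvers, and multiple runtime budgets, none of which are reproduced here.

```python
# Minimal sketch of per-instance algorithm selection as classification.
# Features and labels are synthetic stand-ins, NOT the paper's data:
# the paper derives features from BBOB problem instances and labels
# from the actually best-performing solver per runtime budget.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_instances, n_features = 200, 5  # stand-ins for instances / landscape features
X = rng.normal(size=(n_instances, n_features))
# Hypothetical rule: which of two solvers is "best" depends on one feature.
y = np.where(X[:, 0] > 0, "CMA-ES", "EGO")

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Tree-based models such as Random Forest are a strong AS baseline.
selector = RandomForestClassifier(n_estimators=100, random_state=0)
selector.fit(X_train, y_train)

accuracy = selector.score(X_test, y_test)
print(f"selection accuracy: {accuracy:.2f}")
```

Swapping `RandomForestClassifier` for another model (e.g. a gradient-boosted or tabular deep-learning classifier) changes only one line here, which mirrors the paper's comparison of four ML techniques on a fixed selection task.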
