
Batch Acquisition Function Optimization

Updated 19 January 2026
  • Batch Acquisition Function Optimization is a strategy for selecting multiple query points in parallel to accelerate Bayesian Optimization processes.
  • It uses complementary objectives like PI, EI, and LCB to ensure high informativeness and diversity in batch evaluations.
  • Frameworks such as MACE show significant speed-ups, achieving up to 74× improvement in unconstrained and 15× in constrained optimization scenarios.

Batch Acquisition Function Optimization is a class of methodologies centered on selecting and optimizing acquisition functions in Bayesian Optimization (BO) for simultaneous, parallel evaluation of multiple query points ("batches") during each optimization round. Batch acquisition is essential in settings with substantial parallel computational resources or expensive function evaluations, such as analog circuit synthesis, where maximizing parallel throughput critically accelerates wall-clock optimization.

1. Parallel Bayesian Optimization and Batching Motivations

In standard sequential BO, the process iterates as follows: fit a Gaussian process (GP) model to the known evaluations, maximize a scalar acquisition function \alpha(\mathbf{x}; D) over candidates \mathbf{x}, evaluate the expensive black-box function at the chosen point, and update the dataset. This sequential protocol, however, is inefficient when modern simulators or experimental platforms enable concurrent evaluations. Batch acquisition optimization generalizes this protocol by proposing a set of B points to be evaluated in parallel at each iteration, aiming to maximize the total information gain while promoting diversity among batch members.

Critical challenges for batch acquisition optimization are:

  • C1: Maximizing informativeness—each batch point should provide high expected improvement or information gain.
  • C2: Promoting diversity—avoid redundant points, ensure coverage over the plausible solution space.
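The batched loop can be sketched in a few lines of Python. Everything here is an illustrative assumption, not the paper's method: a tiny pure-NumPy RBF-kernel GP stands in for a full surrogate, a toy 1-D objective stands in for the expensive simulator, and the batch is formed naively by taking the top-B points of a single acquisition over a random candidate pool, which tends to produce exactly the clustered, redundant batches that challenge C2 warns about.

```python
import numpy as np

def gp_posterior(X, y, Xq, ls=0.3, noise=1e-4):
    """Posterior mean/std of a zero-mean RBF-kernel GP at query points Xq."""
    def k(a, b):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / ls**2)
    K = k(X, X) + noise * np.eye(len(X))          # kernel matrix with jitter
    Ks = k(Xq, X)
    mu = Ks @ np.linalg.solve(K, y)
    # k(x, x) = 1 for the RBF kernel, so the prior variance term is 1.0
    var = np.clip(1.0 - np.einsum('ij,ji->i', Ks, np.linalg.solve(K, Ks.T)),
                  1e-12, None)
    return mu, np.sqrt(var)

def f(x):  # toy "expensive" black-box to minimize (assumption for illustration)
    return np.sin(5 * x[..., 0]) + x[..., 0] ** 2

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, (5, 1))                    # initial design
y = f(X)
for _ in range(5):                                # each round proposes a batch
    cand = rng.uniform(-1, 1, (256, 1))           # random candidate pool
    mu, sd = gp_posterior(X, y, cand)
    lcb = mu - 2.0 * sd                           # one acquisition; MACE uses several
    batch = cand[np.argsort(lcb)[:4]]             # naive top-B (B = 4), then
    X = np.vstack([X, batch])                     # "evaluate in parallel"
    y = np.concatenate([y, f(batch)])
best = y.min()
```

The naive top-B selection addresses C1 only; the MACE framework described below replaces it with sampling from a Pareto front over several acquisition functions to handle C2 as well.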

2. Classical and Multi-objective Acquisition Functions

Central acquisition functions in BO include:

  • Probability of Improvement (PI):

\mathrm{PI}(\mathbf{x}) = \Phi\!\left(\frac{\tau - \xi - \mu(\mathbf{x})}{\sigma(\mathbf{x})}\right)

where \tau is the current best observed value, \xi \ge 0 is an optional exploration margin, \mu(\mathbf{x}) and \sigma(\mathbf{x}) are the GP posterior mean and standard deviation respectively, and \Phi is the standard normal CDF.

  • Expected Improvement (EI):

\mathrm{EI}(\mathbf{x}) = \sigma(\mathbf{x})\left[\lambda\,\Phi(\lambda) + \phi(\lambda)\right], \quad \lambda = \frac{\tau - \xi - \mu(\mathbf{x})}{\sigma(\mathbf{x})}

where \phi is the standard normal PDF.

  • Lower Confidence Bound (LCB):

\mathrm{LCB}(\mathbf{x}) = \mu(\mathbf{x}) - \beta\,\sigma(\mathbf{x})

with \beta set according to the desired exploration level.
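The three acquisition functions above translate directly from the formulas. This is a minimal sketch; the argument names and the defaults \xi = 0 and \beta = 2 are assumptions for illustration, not values prescribed by the source.

```python
import numpy as np
from scipy.stats import norm

def pi(mu, sigma, tau, xi=0.0):
    """Probability of Improvement: Phi((tau - xi - mu) / sigma)."""
    lam = (tau - xi - mu) / sigma
    return norm.cdf(lam)

def ei(mu, sigma, tau, xi=0.0):
    """Expected Improvement: sigma * (lam * Phi(lam) + phi(lam))."""
    lam = (tau - xi - mu) / sigma
    return sigma * (lam * norm.cdf(lam) + norm.pdf(lam))

def lcb(mu, sigma, beta=2.0):
    """Lower Confidence Bound: mu - beta * sigma (lower is more promising)."""
    return mu - beta * sigma
```

All three take the GP posterior mean and standard deviation at a candidate point, plus the incumbent best value tau where relevant, so they can be evaluated cheaply over large candidate sets.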

Optimizing any one of these functions independently for batch selection typically produces correlated or spatially redundant points.

3. Multi-objective Acquisition Ensemble (MACE) Framework

The MACE paradigm (Zhang et al., 2021) casts batch acquisition as a multi-objective optimization over several standard acquisition functions (PI, EI, LCB). Rather than scalarizing, MACE treats

\min_{\mathbf{x}} \bigl\{\mathrm{LCB}(\mathbf{x}),\ -\mathrm{PI}(\mathbf{x}),\ -\mathrm{EI}(\mathbf{x})\bigr\}

as a three-objective minimization problem. At each iteration, MACE applies a multi-objective evolutionary optimizer (DEMO) to approximate the Pareto front \mathcal{P}_n of non-dominated solutions over these objectives. Uniform sampling from \mathcal{P}_n generates a batch of diverse, informative points. This approach eliminates the need for manual repulsion terms or weightings and automatically balances exploration and exploitation without normalization artifacts.

Algorithmic structure (Unconstrained MACE):

  1. Initialize with N_{\rm init} random samples.
  2. For each BO round t = 1, \ldots, T:
    • Fit a GP surrogate to the current dataset.
    • Solve the multi-objective subproblem over PI, EI, and LCB.
    • Sample B points uniformly at random from the Pareto front.
    • Evaluate f at these B points in parallel; update the dataset.
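The Pareto-extraction and batch-sampling steps above can be sketched with plain NumPy. The fixed random candidate pool and its acquisition values are illustrative assumptions; MACE itself evolves candidates with the DEMO optimizer rather than filtering a static pool.

```python
import numpy as np

def pareto_front(objs):
    """Indices of non-dominated rows of `objs` (all objectives minimized)."""
    n = objs.shape[0]
    keep = np.ones(n, dtype=bool)
    for i in range(n):
        if not keep[i]:
            continue
        # Row j dominates row i if j is <= everywhere and < in at least one objective.
        dominated = np.all(objs <= objs[i], axis=1) & np.any(objs < objs[i], axis=1)
        if dominated.any():
            keep[i] = False
    return np.flatnonzero(keep)

def sample_batch(objs, B, rng):
    """Uniformly sample up to B candidate indices from the Pareto front."""
    front = pareto_front(objs)
    return rng.choice(front, size=min(B, front.size), replace=False)

rng = np.random.default_rng(0)
objs = rng.standard_normal((200, 3))   # stand-in for [LCB, -PI, -EI] per candidate
batch = sample_batch(objs, B=15, rng=rng)
```

Because dominance is checked per objective, no weighting or rescaling of LCB, PI, and EI against one another is needed at this step.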

4. Handling Constraints: Feasibility-First Two-Stage MACE Extension

In constrained optimization, where f(\mathbf{x}) must satisfy black-box inequality constraints c_i(\mathbf{x}) < 0, MACE deploys a two-stage adaptation:

  • Stage 1: Focuses exclusively on locating at least one feasible point by optimizing a 3-objective ensemble:

\min\bigl\{-\mathrm{PF}(\mathbf{x}),\ g_1(\mathbf{x}),\ g_2(\mathbf{x})\bigr\}

with \mathrm{PF}(\mathbf{x}) the product of constraint-satisfaction probabilities (via the constraint GPs). This expedites finding the feasible region.

  • Stage 2: Once feasibility is achieved, a 6-objective ensemble is solved, including PI, EI, LCB and constraint-penalization terms. Candidate points with high constraint violation are pruned.

Uniform sampling from the resulting Pareto set concentrates batch points in feasible, informative regions.
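The Stage-1 feasibility objective \mathrm{PF}(\mathbf{x}) follows directly from the GP posteriors of the constraints: each c_i(\mathbf{x}) is Gaussian with mean \mu_i and standard deviation \sigma_i, so P[c_i(\mathbf{x}) < 0] = \Phi(-\mu_i/\sigma_i). A minimal sketch, with the input values as purely illustrative assumptions:

```python
import numpy as np
from scipy.stats import norm

def prob_feasible(mu_c, sigma_c):
    """PF(x): product over constraints i of P[c_i(x) < 0] = Phi(-mu_i / sigma_i)."""
    mu_c, sigma_c = np.asarray(mu_c), np.asarray(sigma_c)
    return float(np.prod(norm.cdf(-mu_c / sigma_c)))

# Hypothetical constraint-GP posteriors at one candidate x: two constraints.
pf = prob_feasible([-1.0, 0.0], [1.0, 1.0])
```

Maximizing \mathrm{PF} (minimizing -\mathrm{PF} in the ensemble above) steers Stage 1 toward candidates most likely to satisfy all constraints simultaneously.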

5. Optimization Algorithms and Diversity

The core batch selection in MACE leverages multi-objective evolutionary optimizers (such as DEMO) for Pareto-front approximation. Random sampling from this front naturally yields diversity without explicit penalization, a common difficulty in traditional scalarized batch acquisition methods. No normalization or scale calibration is required, since Pareto dominance is invariant to monotone rescaling of each individual objective, so the acquisition functions never need to be weighted against one another.

For constrained versions, penalization terms and feasibility probabilities are directly embedded in the multi-objective optimization, with aggressive pruning of infeasible candidates to prioritize efficient batch evaluation.

6. Experimental Performance Assessment

Experimental results for analog circuit synthesis demonstrate:

  • Unconstrained case: MACE with batch size B = 15 achieved up to a 74\times reduction in simulation time relative to differential evolution (DE), with equal or superior final objective values.
  • Constrained case: MACE attained up to a 15\times speedup compared to weighted-EI Bayesian optimization (WEIBO), again with equivalent or improved final solutions.

Batch-size analysis revealed that larger batches increase parallel efficiency and reduce wall-clock time, with negligible quality degradation up to B = 15, owing to the robustness of Pareto-front sampling.

7. Practical Guidelines and Transferability

  • Acquisition function selection: Choose a small, complementary set (e.g., PI, EI, LCB).
  • Multi-objective optimizer: Use a lightweight algorithm to approximate Pareto optimality over acquisition values.
  • Batch formation: Sample batches of size B from the Pareto set, matched to available parallel resources.
  • Constrained problems: Apply a two-stage approach, focusing on feasibility identification before full penalized ensemble optimization.

The MACE framework generalizes robustly to other expensive black-box domains beyond analog circuit synthesis. Its multi-objective, Pareto-ensemble principle and feasibility-prioritized adaptation for constraints enable efficient batch Bayesian optimization with minimal manual tuning.

Summary Table: Core Batch Acquisition Mechanisms

| Method | Objective(s) | Diversity Mechanism | Constraint Handling | Optimizer |
|---|---|---|---|---|
| MACE (Zhang et al., 2021) | PI, EI, LCB (multi-objective) | Pareto set sampling | Two-stage, penalized ensemble | DEMO |
| Scalarized BO | Weighted sum / scalarization | Manual repulsion | Scalarized constraints | Various |

Emphasis on multi-objective acquisition and Pareto-based batch selection distinguishes MACE as a principled approach that effectively balances exploration, exploitation, and diversity in parallel Bayesian optimization, with significant empirical speed-ups and generality for both unconstrained and constrained domains (Zhang et al., 2021).
