
Multifidelity ABC SMC Inference

Updated 3 February 2026
  • Multifidelity ABC SMC is a simulation-based inference technique that combines low-fidelity pre-filtering with high-fidelity evaluations to efficiently approximate complex Bayesian posteriors.
  • It employs adaptive importance sampling, hierarchical thresholding, and MCMC rejuvenation to balance computational cost with estimation accuracy.
  • The method achieves significant speedups and reduced high-fidelity calls, making it valuable for expensive likelihood simulations in probabilistic modeling.

Multifidelity Approximate Bayesian Computation Sequential Monte Carlo (multifidelity ABC SMC) refers to a broad class of computational inference algorithms that leverage a hierarchy of simulation models of varying cost and accuracy to accelerate sequential Monte Carlo (SMC) for likelihood-free Bayesian inference. These methods exploit the availability of both low-fidelity (LF) and high-fidelity (HF) simulators to maximize computational efficiency, particularly in situations where a direct evaluation of the likelihood is intractable and synthetic data must be compared to observations via an approximate Bayesian computation (ABC) framework. Multifidelity ABC SMC has emerged as a critical development in simulation-based inference for probabilistic models with expensive likelihoods, combining principles of model reduction, early rejection, adaptive importance sampling, and rigorous error bounds.

1. Conceptual Framework

Multifidelity ABC SMC augments classic ABC SMC by strategically embedding model hierarchies into the SMC population workflow. At a high level, particles (parameter proposals) are first screened using cheaper, lower-fidelity model evaluations. Only those that pass a pre-determined criterion proceed to expensive high-fidelity simulation and ABC acceptance testing. This hierarchical processing is motivated by the high computational cost typically associated with sufficiently accurate simulators. The essential ingredients are:

  • A prior distribution \pi(\theta) for parameters \theta.
  • A sequence of decreasing ABC tolerance thresholds \varepsilon_1 > \cdots > \varepsilon_T > 0.
  • For each \theta, access to a low-fidelity simulator \tilde{p}(\cdot \mid \theta) and a high-fidelity simulator p(\cdot \mid \theta).
  • An ABC distance metric \Delta(x, y) between simulated data x and observed data y.
  • An SMC kernel, typically adaptive, for propagating particle populations between generations.

Hierarchical importance weighting or continuation probabilities ensure correct ABC targeting despite the two-stage filtering. The ensemble may additionally exploit Markov chain Monte Carlo (MCMC) rejuvenation steps.
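The two-stage screening described above can be sketched for a single particle. This is a minimal illustration with hypothetical toy simulators (`lf_simulate`, `hf_simulate`) and a scalar distance; it is not the implementation from either paper, only the generic pre-filter-then-evaluate pattern.

```python
import random

# Hypothetical toy model: data are noisy observations of theta.
# The LF simulator uses a coarser noise model (cheap); the HF one is finer.
def lf_simulate(theta, rng):
    return theta + rng.gauss(0.0, 1.0)   # low-fidelity: coarse

def hf_simulate(theta, rng):
    return theta + rng.gauss(0.0, 0.5)   # high-fidelity: accurate

def distance(x, y_obs):
    return abs(x - y_obs)

def screen_particle(theta, y_obs, rng, n_lf=5, n_hf=5, eps_lf=2.0, eps=1.0):
    """Two-stage multifidelity ABC screening for one particle.

    Returns the (unnormalized) ABC likelihood estimate: 0 if the LF
    pre-filter rejects, otherwise the count of HF acceptances."""
    # Stage 1: LF pre-filter -- reject if no LF run comes within eps_lf.
    lf_dists = [distance(lf_simulate(theta, rng), y_obs) for _ in range(n_lf)]
    if min(lf_dists) > eps_lf:
        return 0
    # Stage 2: HF evaluation, standard ABC indicator sum.
    hf_dists = [distance(hf_simulate(theta, rng), y_obs) for _ in range(n_hf)]
    return sum(d <= eps for d in hf_dists)

rng = random.Random(0)
y_obs = 0.3
near = screen_particle(0.0, y_obs, rng)    # plausible particle: reaches HF stage
far = screen_particle(50.0, y_obs, rng)    # implausible particle: LF filter rejects it
print(near, far)
```

Only `near` incurs HF simulation cost; `far` is discarded after the cheap LF runs, which is the source of the method's savings.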

2. Algorithmic Strategies

Prominent exemplars of multifidelity ABC SMC algorithms include the MAPS (Multifidelity ABC with Pre-Filtering SMC) method (Cao et al., 2 Feb 2026) and the MF-ABC-SMC of Prescott & Baker (Prescott et al., 2020). Each approach adopts a two-stage paradigm grounded in adaptive importance sampling:

Hierarchical Importance Sampling and Pre-Filtering

  • Proposals \theta are first evaluated with n_L LF model runs. If \Delta(\tilde{x}_k, y) > \tilde{\varepsilon} (the LF threshold) for all k, the proposal is rejected outright (zero weight).
  • If at least one LF simulation satisfies \Delta(\tilde{x}_k, y) \leq \tilde{\varepsilon}, the particle advances, and n_H HF simulations are drawn: x_{1:n_H} \sim p(\cdot \mid \theta).
  • The standard ABC indicator sum, \sum_k \mathbb{I}\{\Delta(x_k, y) \leq \varepsilon\}, defines the HF ABC likelihood at the iteration-specific threshold \varepsilon.
  • The resulting importance weight is

W^{(i)} = \frac{\pi(\theta^{(i)})}{q(\theta^{(i)})} \cdot \mathbb{I}\bigl\{ \min_k \Delta(\tilde{x}_k, y) \leq \tilde{\varepsilon} \bigr\} \sum_{k=1}^{n_H} \mathbb{I}\{\Delta(x_k, y) \leq \varepsilon\}

(Cao et al., 2 Feb 2026).

  • In algorithms using continuation probabilities, per-particle adaptive probabilities \alpha_t(\theta, \tilde{x}) control the frequency of advancing to HF simulation, with optimal values calibrated to minimize MSE or maximize computational efficiency (Prescott et al., 2020).
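The MAPS-style weight can be computed directly from the prior/proposal ratio, the LF pass indicator, and the HF acceptance count. A minimal sketch, with purely hypothetical densities and distances supplied as inputs:

```python
def maps_weight(prior_density, proposal_density,
                lf_distances, hf_distances, eps_lf, eps):
    """MAPS-style importance weight for one particle:
    prior/proposal ratio x LF pass indicator x HF acceptance count."""
    lf_pass = 1 if min(lf_distances) <= eps_lf else 0
    hf_hits = sum(d <= eps for d in hf_distances)
    return (prior_density / proposal_density) * lf_pass * hf_hits

# Hypothetical numbers: the particle passes the LF filter
# (min LF distance 0.8 <= 1.0) and 2 of 3 HF runs are accepted.
w = maps_weight(prior_density=0.5, proposal_density=0.25,
                lf_distances=[0.8, 1.7], hf_distances=[0.2, 0.6, 1.4],
                eps_lf=1.0, eps=1.0)
print(w)  # → 4.0  (ratio 2.0 x indicator 1 x count 2)
```

Note that a particle failing the LF filter receives weight zero without any HF simulation being run.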

Adaptive Thresholds and ESS Monitoring

Adaptive selection of the LF and HF acceptance thresholds \tilde{\varepsilon}_t and \varepsilon_t is central. Quantile-based or efficiency-predicted thresholding is used to control the effective sample size (ESS) and prevent excessive variance growth.
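Quantile-based thresholding and ESS monitoring are simple to state in code. A minimal sketch (generic formulas, not the specific adaptation rules of either paper):

```python
def ess(weights):
    """Effective sample size of a weighted particle population:
    (sum w)^2 / sum w^2, maximal when weights are uniform."""
    s = sum(weights)
    return s * s / sum(w * w for w in weights)

def quantile_threshold(distances, q=0.5):
    """Set the next tolerance at the q-quantile of the current
    distances, so roughly a fraction q of particles survive it."""
    d = sorted(distances)
    return d[min(int(q * len(d)), len(d) - 1)]

# Uniform weights give the maximal ESS, equal to the population size.
weights = [1.0, 1.0, 1.0, 1.0]
print(ess(weights))  # → 4.0

dists = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0]
eps_next = quantile_threshold(dists, q=0.5)
print(eps_next)  # → 0.6
```

In practice one would trigger resampling whenever `ess(weights)` drops below a fixed fraction (e.g. half) of the population size.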

SMC and Rejuvenation

Particles progress across ABC-tolerance levels through iteration. After each propagation step, resampling and MCMC mutation (such as pseudo-marginal MH moves conditioned on LF/HF acceptance) are employed to maintain sample diversity and mixing.

3. Theoretical Guarantees and Error Analysis

Both MAPS and MF-ABC-SMC frameworks provide guarantees of asymptotic posterior concentration and controlled bias as ABC thresholds decrease and sample size increases:

  • Under standard ABC regularity conditions and an explicit LF filter false-rejection constraint (a_L < 1), the bias in the filtered posterior satisfies an L_1 bound:

\bigl\| \pi_{\varepsilon, \tilde{\varepsilon}} - \pi_\varepsilon \bigr\|_1 < \frac{1}{1-a_L} - (1-a_L)

This quantifies the impact of LF pre-filtering on the ABC posterior approximation (Cao et al., 2 Feb 2026).

  • In MF-ABC-SMC, the unbiasedness of ABC importance sampling is preserved through the coupling of weights and continuation probabilities, with explicit error terms dependent on the probabilities of false positives and negatives between the LF and HF models (Prescott et al., 2020).

The rates at which model discrepancies (e.g., P(\tilde{x} \in \Omega_\varepsilon, x \notin \Omega_\varepsilon)) decay are key to ensuring that multifidelity acceleration does not unduly bias inference.
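The L_1 bound above is easy to evaluate numerically; for small a_L it simplifies to a_L(2 - a_L)/(1 - a_L) ≈ 2 a_L, so a false-rejection rate of 10^{-3} caps the pre-filtering bias at roughly 2 × 10^{-3} in total variation terms:

```python
def l1_bias_bound(a_L):
    """Upper bound on || pi_{eps, eps_tilde} - pi_eps ||_1 in terms of
    the LF filter false-rejection constraint a_L < 1."""
    return 1.0 / (1.0 - a_L) - (1.0 - a_L)

# For the recommended a_L ~ 1e-3 the bound is about 2e-3 (~ 2 * a_L).
b = l1_bias_bound(1e-3)
print(b)
```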

4. Efficiency, Computational Tradeoffs, and Surrogate Construction

Efficiency enhancements derive from the early rejection of proposals using LF surrogates:

  • The average per-particle cost is

\text{Cost} \approx n_L c_L + \alpha_L n_H c_H

where c_L and c_H are the costs of LF and HF runs, and \alpha_L is the fraction of proposals reaching the HF stage (Cao et al., 2 Feb 2026).

  • Increasing n_L or relaxing the LF threshold \tilde{\varepsilon} reduces the chance of discarding promising particles, but at the cost of more LF evaluations (larger n_L) or more particles advancing to the expensive HF stage (larger \tilde{\varepsilon}).
  • Surrogate (LF) models are commonly derived by mesh coarsening, time truncation, or dimension-reduction techniques, and should be at least an order of magnitude faster than the HF simulations (Prescott et al., 2020).
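The cost model above can be evaluated against the single-fidelity baseline (all n_H HF runs for every particle). The numbers below are hypothetical, chosen only to illustrate the tradeoff:

```python
def mf_cost_per_particle(n_L, c_L, alpha_L, n_H, c_H):
    """Expected per-particle cost of the two-stage scheme:
    n_L LF runs always, plus n_H HF runs for the surviving fraction."""
    return n_L * c_L + alpha_L * n_H * c_H

# Hypothetical costs: LF runs 100x cheaper than HF, and 30% of
# proposals survive the LF pre-filter (alpha_L = 0.3).
c_H, c_L = 1.0, 0.01
mf = mf_cost_per_particle(n_L=5, c_L=c_L, alpha_L=0.3, n_H=5, c_H=c_H)
sf = 5 * c_H  # single-fidelity baseline: every particle runs all 5 HF sims
print(mf, sf)  # → 1.55 5.0
```

Here the multifidelity scheme costs roughly a third of the baseline per particle; the savings grow as c_L/c_H or alpha_L shrinks.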

Continuation probabilities in MF-ABC-SMC are set as piecewise constants:

\alpha_t(\theta, \tilde{x}) = \eta_1 \cdot \mathbb{I}\{ d(\tilde{x}, y_\text{obs}) \leq \varepsilon_t \} + \eta_2 \cdot \mathbb{I}\{ d(\tilde{x}, y_\text{obs}) > \varepsilon_t \}

and optimized analytically to maximize efficiency given empirical MSE and computational cost models.
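The piecewise-constant rule translates directly into code: a particle whose LF output is within tolerance continues to HF simulation with probability \eta_1, and otherwise with the (smaller) probability \eta_2. A minimal sketch with hypothetical \eta values:

```python
import random

def continuation_probability(d_lf, eps_t, eta1, eta2):
    """Piecewise-constant continuation probability (MF-ABC-SMC form):
    eta1 if the LF run is within tolerance, eta2 otherwise."""
    return eta1 if d_lf <= eps_t else eta2

rng = random.Random(1)
# LF distance 0.4 is within the tolerance 0.5, so alpha = eta1 = 0.9.
alpha = continuation_probability(d_lf=0.4, eps_t=0.5, eta1=0.9, eta2=0.1)
go_hf = rng.random() < alpha  # advance to HF simulation with probability alpha
print(alpha, go_hf)
```

Keeping \eta_2 > 0 is what preserves unbiasedness: even LF-rejected particles occasionally reach the HF stage, and the importance weight compensates for the reduced frequency.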

5. Diagnostics and Practical Recommendations

Diagnostic strategies are crucial to the robust application of multifidelity ABC SMC:

  • The false-negative error metric E quantifies LF/HF alignment: a low E implies the LF filter does not discard many particles the HF model would accept. It is computed by comparing LF outputs on a pilot batch against confirmed HF acceptances at the desired ABC threshold (Cao et al., 2 Feb 2026).
  • Monitor ESS and E at each SMC iteration to detect degeneracy or excessive bias accumulation.
  • Hyperparameters such as \alpha_L \approx 0.7, a_L \approx 10^{-3}, and the budget setting n_L \cdot \text{cost}_{LF} \lesssim 0.4\, n_H \cdot \text{cost}_{HF} have been found effective in practice.
  • In case studies (toy, Ornstein–Uhlenbeck, Kuramoto network), MAPS reduces HF simulations by approximately 35–45% compared to standard SMC with equivalent posterior fidelity (Cao et al., 2 Feb 2026), while MF-ABC-SMC demonstrates multiple-fold acceleration, especially during early SMC generations (Prescott et al., 2020).
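One plausible way to estimate the false-negative metric on a pilot batch is to take paired LF/HF distances for the same particles and count how many HF-accepted particles the LF filter would have discarded. The exact estimator in the paper may differ; this sketch only illustrates the diagnostic idea:

```python
def false_negative_rate(lf_distances, hf_distances, eps_lf, eps):
    """Fraction of pilot particles that the HF model accepts
    (d_hf <= eps) but the LF pre-filter would discard (d_lf > eps_lf).
    A plausible estimator of the LF/HF misalignment metric E."""
    hf_accepted = [i for i, d in enumerate(hf_distances) if d <= eps]
    if not hf_accepted:
        return 0.0
    missed = sum(lf_distances[i] > eps_lf for i in hf_accepted)
    return missed / len(hf_accepted)

# Pilot batch: paired LF/HF distances for the same four particles.
lf = [0.3, 1.8, 0.9, 2.5]
hf = [0.2, 0.4, 1.6, 0.5]
E = false_negative_rate(lf, hf, eps_lf=2.0, eps=1.0)
print(E)  # → one of three HF-accepted particles is LF-rejected
```

A large E signals that the LF surrogate or threshold \tilde{\varepsilon} should be revised before trusting the accelerated posterior.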

6. Relation to Multilevel Monte Carlo ABC

Multifidelity ABC SMC is distinct from but related to multilevel Monte Carlo ABC (MLMC-ABC), which leverages a telescoping sum decomposition across ABC tolerance “levels,” running a coupled SMC sampler to efficiently exploit variance decay properties (Jasra et al., 2017). MLMC-ABC targets optimal cost-MSE scaling via allocation of computational effort across ABC approximations with varying error and cost. While both frameworks offer substantial computational gains over single-level ABC SMC, multifidelity ABC SMC focuses on model hierarchy (LF/HF), whereas MLMC typically varies simulation granularity or kernel accuracy. Both admit theoretical analysis of bias, variance, and computational complexity, linked to coupling properties and mixing assumptions.

The following table summarizes practical multifidelity ABC SMC methods:

| Approach | Early Rejection via LF | Adaptivity | Theoretical Bias Bound | Tested Speedup (Typical) |
|---|---|---|---|---|
| MAPS (Cao et al., 2 Feb 2026) | Deterministic pre-filter | Adaptive thresholds | Explicit L_1 control via a_L | ~35–45% reduction in HF calls |
| MF-ABC-SMC (Prescott et al., 2020) | Stochastic continuation/acceptance | Adaptive kernel & thresholds | Bias via coupling probabilities | 2–10× wall-time acceleration |

7. Implementation and Future Directions

Efficient implementations of multifidelity ABC SMC are publicly available, for example via the R package for MAPS at https://github.com/caofff/MAPS (Cao et al., 2 Feb 2026). Key considerations include the design of high-quality LF surrogates, tight coupling of random seeds to control false positive/negative probabilities, and dynamic adjustment of thresholds and continuation parameters.

A plausible implication is that continued advances in multifidelity surrogate construction and tighter posterior error analysis will further expand the range of scientific applications accessible to ABC methods, especially in domains with substantial model complexity or scarce computational budgets. Emerging hybridization with MLMC and automated surrogate design remains an active research frontier.
