
Neural-Guided Search in Formal Verification

Updated 26 January 2026
  • Neural-guided search is a hybrid technique that uses neural networks to generate candidate certificates and controllers while relying on formal methods to enforce correctness.
  • The process employs an iterative CEGIS loop where neural proposals are refined using counterexamples from formal engines like SMT solvers.
  • This approach achieves efficiency and formal or PAC-based guarantees in synthesizing certificates for high-dimensional and non-polynomial system verification.

Neural-guided search is an emerging paradigm in formal methods, control, and software analysis that integrates machine learning—typically neural networks—into the synthesis or verification search process. Neural models are used to generate candidates (such as certificates, invariants, or controllers), while an external formal engine, such as an SMT solver, rigorously enforces correctness or provides counterexamples to iteratively refine the neural model.

1. Definition and Scope

Neural-guided search refers to algorithms in which neural networks parameterize a search space for candidate objects—such as barrier certificates, Lyapunov functions, specification contracts, or programs—and guide the induction of these candidates toward satisfying formal correctness properties. Unlike pure data-driven training, the process is fundamentally hybrid: a neural network acts as a template or heuristic generator, while a symbolic engine (e.g., SMT solver, MILP solver, SDP engine) ensures that candidates meet strict requirements or else exposes counterexamples.

This paradigm is particularly prominent in the synthesis of certificates and controllers for dynamical and hybrid systems, where the synthesis problem is typically hard, non-convex, and often intractable with purely symbolic or purely data-driven methods (Edwards et al., 2023, Ravanbakhsh, 2018, Rickard et al., 8 Feb 2025, Rickard et al., 17 Mar 2025).

2. The Neural-Guided Search Workflow

A prototypical neural-guided search process for certificate synthesis involves the following iterative loop:

  1. Neural Proposal: The neural network generates a candidate object (certificate, controller, etc.) based on current parameter values.
  2. Formal Verification: An external checker (usually an SMT solver, MILP solver, or convex program) verifies whether the candidate meets the required formal conditions over the desired domain.
  3. Counterexample Extraction: If a violation is detected, the formal engine provides a concrete counterexample (state or trajectory) where the candidate fails.
  4. Data Augmentation: The training set is enriched with the counterexample, and the neural network is retrained or updated to eliminate the failure at that point (and others).
  5. Looping: Steps 1–4 are repeated until the candidate passes all checks or a resource limit is reached.

This forms a counterexample-guided inductive synthesis (CEGIS) loop augmented by neural templates (Edwards et al., 2023, Ravanbakhsh, 2018, Rickard et al., 8 Feb 2025).
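The loop above can be sketched end to end on a toy problem. In the minimal pure-Python illustration below, a two-parameter polynomial template stands in for the neural network, a dense-sampling falsifier stands in for the SMT solver, and subgradient steps on hinge losses stand in for retraining; the example system (x_{k+1} = 0.5 x_k) and all names are our own, not Fossil's API.

```python
def f(x):
    """Toy discrete-time dynamics x_{k+1} = 0.5 x_k (globally stable)."""
    return 0.5 * x

def V(p, x):
    """Candidate Lyapunov function: a two-parameter stand-in for a neural net."""
    a, b = p
    return a * x**2 + b * x**4

def verify(p, n=2001):
    """Stand-in for the SMT call: densely sample [-1, 1] and return a point
    where positivity or the decrease condition fails, or None if certified."""
    for i in range(n):
        x = -1.0 + 2.0 * i / (n - 1)
        if abs(x) < 1e-6:
            continue  # conditions are only required away from the equilibrium
        if V(p, x) <= 0.0 or V(p, f(x)) - V(p, x) >= 0.0:
            return x
    return None

def train_step(p, data, lr=0.1):
    """One subgradient step on hinge losses at the collected counterexamples."""
    a, b = p
    ga = gb = 0.0
    for x in data:
        if V(p, x) <= 0.0:                  # positivity violated: raise V(x)
            ga -= x**2
            gb -= x**4
        if V(p, f(x)) - V(p, x) >= 0.0:     # decrease violated: steepen V
            ga += f(x)**2 - x**2
            gb += f(x)**4 - x**4
    return (a - lr * ga, b - lr * gb)

# CEGIS loop: propose -> verify -> extract counterexample -> retrain
p, data = (-1.0, -1.0), []
for _ in range(100):
    cex = verify(p)
    if cex is None:
        break
    data.append(cex)                        # data augmentation
    for _ in range(25):
        p = train_step(p, data)

print("certified:", verify(p) is None)
```

The falsifier here is only a sampling heuristic; in an actual tool the verify step is a sound SMT query over the whole domain.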

Example: In Fossil 2.0, candidate certificates (e.g., Lyapunov or barrier functions) are produced by neural nets, checked by SMT solvers (Z3/dReal/CVC5), and refined via CEGIS. The neural network is updated using a loss function representing verification conditions; newly discovered counterexamples are used to augment the training data (Edwards et al., 2023).

3. Neural Parametrizations and Loss Functions

Neural networks act as universal function approximators and can efficiently parameterize complex candidate classes, including non-polynomial certificates not tractable with classical sum-of-squares (SOS) approaches. Common architectures include fully connected feedforward networks, often using activation functions tailored to the certificate type (e.g., squared or tanh activations for energy-like functions) (Edwards et al., 2023, Rickard et al., 17 Mar 2025, Rickard et al., 8 Feb 2025).
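One common construction takes the squared norm of a small tanh network's output, so the candidate is nonnegative by design and vanishes at the equilibrium. The sketch below is a plain-Python illustration under our own naming; a real implementation would use a deep-learning framework and trainable parameters.

```python
import math
import random

random.seed(0)

def mlp_features(params, x):
    """Two-layer feedforward features phi(x) with tanh hidden activations."""
    W1, W2 = params
    hidden = [math.tanh(sum(w * xi for w, xi in zip(row, x))) for row in W1]
    return [sum(w * h for w, h in zip(row, hidden)) for row in W2]

def lyapunov_candidate(params, x):
    """V(x) = |phi(x) - phi(0)|^2: nonnegative by construction with V(0) = 0,
    an energy-like shape suited to Lyapunov conditions."""
    phi_x = mlp_features(params, x)
    phi_0 = mlp_features(params, [0.0] * len(x))
    return sum((px - p0) ** 2 for px, p0 in zip(phi_x, phi_0))

# a randomly initialized 2-input, 8-hidden, 4-feature template
W1 = [[random.uniform(-1.0, 1.0) for _ in range(2)] for _ in range(8)]
W2 = [[random.uniform(-1.0, 1.0) for _ in range(8)] for _ in range(4)]
params = (W1, W2)

print(lyapunov_candidate(params, [0.0, 0.0]))          # exactly 0.0 at the origin
print(lyapunov_candidate(params, [0.5, -0.3]) >= 0.0)  # True: nonnegative by design
```

Because positivity holds by construction, training and verification only need to handle the remaining (e.g., decrease) conditions.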

The loss function used in neural-guided search encodes the formal conditions to be satisfied:

  • State constraints (safety, invariance): Penalize the violation of certificate requirements in sets of interest (e.g., initial, unsafe, goal).
  • Trajectory or derivative constraints: Penalize the violation of barrier certificate or Lyapunov decrease conditions along simulated or sampled trajectories.

For instance, in data-driven neural certificate synthesis, the loss is composed of a "state-loss" that penalizes initial/unsafe violations and a "trajectory-loss" that penalizes points where the decrease condition is not met (Rickard et al., 17 Mar 2025, Rickard et al., 8 Feb 2025).
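The two loss terms can be sketched as simple hinge penalties. This is a minimal pure-Python illustration with our own function names and a barrier-style level set; the margin and level values are arbitrary.

```python
def state_loss(V, init_pts, unsafe_pts, level=1.0, margin=0.1):
    """Penalize certificate values above the level set on initial states and
    below it on unsafe states (barrier-style conditions)."""
    loss = 0.0
    for x in init_pts:                 # want V(x) <= level - margin
        loss += max(0.0, V(x) - level + margin)
    for x in unsafe_pts:               # want V(x) >= level + margin
        loss += max(0.0, level + margin - V(x))
    return loss

def trajectory_loss(V, trajectories, margin=0.1):
    """Penalize violations of the decrease condition V(x_{k+1}) < V(x_k)
    along sampled trajectories."""
    loss = 0.0
    for traj in trajectories:
        for xk, xk1 in zip(traj, traj[1:]):
            loss += max(0.0, V(xk1) - V(xk) + margin)
    return loss

# toy check with V(x) = x^2 and a contracting trajectory
V = lambda x: x * x
print(state_loss(V, init_pts=[0.1, -0.2], unsafe_pts=[2.0], level=1.0))  # 0.0
print(trajectory_loss(V, [[1.0, 0.5, 0.25]]))                            # 0.0
```

A zero total loss on the samples is only a necessary condition; the formal or PAC verification step still decides whether the candidate is accepted.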

4. Integration with Formal Methods

The core strength of neural-guided search is its formal correctness guarantee, achieved through rigorous verification of the candidate produced by the neural network. Verification is generally performed by:

  • SMT solvers: Quantifier-free encodings of the negation of the certificate conditions are checked for satisfiability in Z3, dReal, or CVC5 (Edwards et al., 2023). If UNSAT, the candidate is formally certified correct over the domain.
  • Convex optimization/MILP/SDP: When templates are polynomial, traditional SOS or SDP relaxations can be used; with neural nets, this is less common, but possible if activations are polynomial.
  • Scenario/PAC analysis: If only data-driven sample-based verification is feasible (due to system complexity or black-box dynamics), generalization guarantees can be provided using scenario theory and PAC/compression-style bounds (Rickard et al., 17 Mar 2025, Rickard et al., 8 Feb 2025).
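To give the flavor of exhaustive checking (as opposed to sampling), the sketch below uses a simple interval branch-and-bound, loosely in the spirit of delta-complete solvers such as dReal, to prove the decrease condition for V(x) = x^2 under x_{k+1} = 0.5 x over [-1, 1]. This pure-Python stand-in is our own illustration, not any solver's actual algorithm.

```python
def square_bounds(lo, hi):
    """Sound interval extension of x -> x^2 on [lo, hi]."""
    if lo <= 0.0 <= hi:
        return 0.0, max(lo * lo, hi * hi)
    return min(lo * lo, hi * hi), max(lo * lo, hi * hi)

def decrease_bounds(lo, hi):
    """Bounds on g(x) = V(0.5 x) - V(x) = -0.75 x^2 for V(x) = x^2."""
    x2_lo, x2_hi = square_bounds(lo, hi)
    return -0.75 * x2_hi, -0.75 * x2_lo

def certify_decrease(lo, hi, tol=1e-3):
    """Branch and bound: prove g(x) < 0 on [lo, hi] outside a tol-ball
    around the equilibrium; return a suspect box if the proof fails."""
    stack = [(lo, hi)]
    while stack:
        a, b = stack.pop()
        if max(abs(a), abs(b)) < tol:
            continue                   # skip the excluded ball at the origin
        g_lo, g_hi = decrease_bounds(a, b)
        if g_hi < 0.0:
            continue                   # condition certified on the whole box
        if b - a < tol:
            return (a, b)              # cannot refine further: report the box
        m = 0.5 * (a + b)
        stack.append((a, m))
        stack.append((m, b))
    return None

print(certify_decrease(-1.0, 1.0))     # None: decrease condition certified
```

Unlike a sampled check, a returned None here covers every point of the domain (outside the excluded ball), which is the kind of guarantee the SMT-based engines provide.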

A key aspect is the CEGIS loop, where formal engines not only verify but also supply counterexamples for re-training.

5. Applications and Theoretical Guarantees

Certificate and Controller Synthesis

Neural-guided search is used for Lyapunov stability certificates, barrier certificates for safety, and compound certificates for properties such as reach-avoid, reach-while-stay, and region-of-attraction. In Fossil 2.0, this covers both continuous- and discrete-time systems, uncontrolled and controlled cases (Edwards et al., 2023).

Generalization Guarantees

In data-driven approaches, neural-guided search provides probably approximately correct (PAC) risk bounds using scenario theory. The bound on the probability of violation depends only on the size of the compression set rather than the dimension of the neural net, enabling generalization to unseen trajectories (Rickard et al., 8 Feb 2025, Rickard et al., 17 Mar 2025).
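The flavor of such a bound can be illustrated with one standard compression-style inequality: with confidence 1 - beta, the violation probability stays below the eps solving comb(N, k) * (1 - eps)^(N - k) = beta, where k is the compression-set size and N the sample count. The exact bounds in the cited papers may differ in form and constants, and the numbers below are illustrative, not taken from those papers.

```python
from math import comb

def violation_bound(n_samples, k_compression, beta):
    """Solve comb(N, k) * (1 - eps)**(N - k) = beta for eps: one standard
    compression-style bound relating sample count, compression-set size,
    and confidence 1 - beta to a violation-probability level eps."""
    n, k = n_samples, k_compression
    return 1.0 - (beta / comb(n, k)) ** (1.0 / (n - k))

# illustrative numbers: 20000 samples, compression set of 10, confidence 1 - 1e-5
eps = violation_bound(20000, 10, 1e-5)
print(f"violation probability <= {eps:.4f}")
print(violation_bound(40000, 10, 1e-5) < eps)  # True: more samples tighten the bound
```

Note that the neural network's parameter count never enters the formula: only N, k, and beta do, which is what makes the approach viable for large networks.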

Performance

Benchmark results indicate practical efficiency: in high-dimensional systems (e.g., 8D), synthesis times are on the order of tens of seconds to minutes, with risk bounds ε ≈ 0.02 at high confidence levels (1 − 10⁻⁵). Comparisons show that orders of magnitude fewer samples are required than for non-certificate direct-evaluation methods (Rickard et al., 17 Mar 2025, Rickard et al., 8 Feb 2025).

Table: Summary of Recent Neural-Guided Search Implementations

| Paper | Domain | Neural Role | Formal Engine | Guarantee Type |
|---|---|---|---|---|
| (Edwards et al., 2023) | ODE/difference systems | Certificate templates, controller | SMT (Z3/dReal/CVC5) | Formal (SMT-based) |
| (Rickard et al., 8 Feb 2025; 17 Mar 2025) | Discrete/continuous-time ODE | Barrier/reachability certificates | Scenario-based PAC | Probabilistic (compression) |
| (Ravanbakhsh, 2018) | Nonlinear control | Polynomial certificate templates | SMT/SDP | Formal (template exhaustivity) |
| (Ma et al., 2021) | RL/control | Safety index, policy functions | Lagrangian constraints | Local optimality + feasibility |

6. Limitations and Open Problems

Current limitations include:

  • Scalability: The verification phase (SMT or SDP solve) may become a bottleneck, particularly for high-dimensional, stiff, or non-polynomial systems.
  • Certification Scope: Neural-guided search typically certifies properties only over the sampled or specified domain, not globally.
  • Architecture Selection: Finding sufficiently expressive yet trainable neural templates is nontrivial, and convergence guarantees for the nonconvex training problem are lacking.
  • Generalization: PAC bounds may be conservative, especially if the compression set is large; refinement of bounds and algorithms to keep compression minimal is ongoing (Rickard et al., 8 Feb 2025).

Extensions to continuous-time controlled systems and systematic neural architecture design are open research directions (Rickard et al., 17 Mar 2025).

7. Significance and Outlook

Neural-guided search bridges the gap between expressive function classes enabled by neural parameterizations and the rigor of formal verification. In certificate and controller synthesis for dynamical systems, it achieves formal or statistical guarantees while scaling to high-dimensional and non-polynomial scenarios where purely symbolic or classical data-driven approaches fail. The paradigm is increasingly adopted in tools such as Fossil 2.0 and in scenario-based certificate learning, demonstrating significant impact on practical verification and correct-by-construction control (Edwards et al., 2023, Rickard et al., 17 Mar 2025, Rickard et al., 8 Feb 2025, Ravanbakhsh, 2018).

A plausible implication is that neural-guided search frameworks—combining data-driven candidate generation with symbolic or PAC-based formal checking—will become standard methodologies for scalable, certified synthesis and verification in nonlinear, hybrid, and learning-enabled systems.
