Published 16 Nov 2025 in stat.ML and cs.LG | (2511.12783v1)
Abstract: Bayesian optimization (BO) has been widely used to optimize expensive and gradient-free objective functions across various domains. However, existing BO methods have not addressed the objective where both inputs and outputs are functions, which increasingly arise in complex systems as advanced sensing technologies. To fill this gap, we propose a novel function-on-function Bayesian optimization (FFBO) framework. Specifically, we first introduce a function-on-function Gaussian process (FFGP) model with a separable operator-valued kernel to capture the correlations between function-valued inputs and outputs. Compared to existing Gaussian process models, FFGP is modeled directly in the function space. Based on FFGP, we define a scalar upper confidence bound (UCB) acquisition function using a weighted operator-based scalarization strategy. Then, a scalable functional gradient ascent algorithm (FGA) is developed to efficiently identify the optimal function-valued input. We further analyze the theoretical properties of the proposed method. Extensive experiments on synthetic and real-world data demonstrate the superior performance of FFBO over existing approaches.
The paper presents a novel framework that extends Bayesian optimization to infinite-dimensional function spaces by directly modeling both functional inputs and outputs.
It leverages separable operator-valued kernels and spectral decomposition to build accurate surrogate models and ensure computational efficiency.
Empirical evaluations demonstrate lower regret and superior robustness on synthetic benchmarks and real-world engineering tasks.
Function-on-Function Bayesian Optimization: A Formal Technical Review
Problem Setting and Motivation
This paper, "Function-on-Function Bayesian Optimization" (2511.12783), addresses Bayesian optimization (BO) of objectives where both the input and output are functions—formally, f: X^p → Y, with x ∈ X^p and f(x) ∈ Y belonging to infinite-dimensional spaces (typically L2 Hilbert spaces). This setting arises in advanced engineering systems with rich sensing data, such as metamaterial design in 3D-printed biovalves, where both the inputs and the output measurements are naturally represented as functions. Prior work on BO has largely focused on scalar, vector, or function-on-vector settings; the optimization of objectives with both functional inputs and outputs remains unaddressed.
Function-on-Function Gaussian Process (FFGP) Surrogate Model
The authors introduce a rigorous FFGP surrogate model, leveraging operator-valued kernels to directly encode correlation structures between functional variables without relying on discretization. Specifically, they employ a separable operator-valued kernel of the standard form

K(x, x′) = k(x, x′) · T,

where k: X × X → ℝ is a scalar kernel on the input function space and T: Y → Y is a trace-class, positive semi-definite operator on the output space.
Numerical stability and tractability are addressed by truncating the kernel's spectral expansion to the top m eigenpairs, ensuring favorable computational scaling and a controlled approximation error.
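The separable structure and the spectral truncation can be sketched on a grid discretization. This is a minimal illustration, not the paper's exact specification: the kernel choices and the helper names (`rbf_l2`, `output_operator`, `truncate_operator`) are assumptions for the sketch.

```python
import numpy as np

def rbf_l2(x, xp, t_grid, length=1.0):
    """Scalar kernel on function-valued inputs via a Riemann-sum L2 distance
    on a uniform grid (an illustrative choice of k)."""
    dt = t_grid[1] - t_grid[0]
    d2 = np.sum((x - xp) ** 2) * dt
    return np.exp(-d2 / (2 * length ** 2))

def output_operator(s_grid, length=0.3):
    """A smooth positive semi-definite operator T on the output grid
    (squared-exponential Gram matrix, again an illustrative choice)."""
    D = s_grid[:, None] - s_grid[None, :]
    return np.exp(-D ** 2 / (2 * length ** 2))

def truncate_operator(T, m):
    """Keep only the top-m eigenpairs of T, mirroring the spectral truncation."""
    lam, V = np.linalg.eigh(T)                  # eigenvalues in ascending order
    lam, V = lam[::-1][:m], V[:, ::-1][:, :m]   # take the m largest
    return (V * lam) @ V.T                      # V diag(lam) V^T

t_grid = np.linspace(0, 1, 50)   # domain of the input functions
s_grid = np.linspace(0, 1, 40)   # domain of the output functions
x1, x2 = np.sin(2 * np.pi * t_grid), np.cos(2 * np.pi * t_grid)

k12 = rbf_l2(x1, x2, t_grid)     # scalar input-side similarity
T_m = truncate_operator(output_operator(s_grid), m=10)

# Separable cross-covariance between the two functional observations:
# Cov(f(x1)(s), f(x2)(t)) = k(x1, x2) * T_m(s, t)
C12 = k12 * T_m
print(C12.shape)
```

The separable form means the full covariance over n functional observations factors as a Kronecker product of an n×n input Gram matrix with T, which is what makes the truncated spectral computations scale.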
Acquisition Function and Optimization Algorithm
To enable BO in function spaces, a scalarization strategy is required. The paper adopts a weighted operator-based linear functional

g(x) = ∫ ϕ(t) f(x)(t) dt,

with ϕ(t) a learnable or user-specified weight function. This yields a scalar GP for the acquisition function, allowing application of UCB- or EI-type objectives of the form

α_UCB(x) = μ(x) + √β · σ(x),

where μ and σ are the posterior mean and standard deviation of the scalarized process.
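A minimal sketch of the scalarized UCB score, assuming the posterior over the output function has been discretized on a grid into a mean vector `mu` and covariance matrix `Sigma`; the weight ϕ and the posterior values here are illustrative stand-ins, not the paper's learned quantities.

```python
import numpy as np

def scalarize_ucb(mu, Sigma, phi, s_grid, beta=2.0):
    """UCB of g(x) = \\int phi(s) f(x)(s) ds under a grid discretization.
    Linearity gives E[g] = w @ mu and Var[g] = w @ Sigma @ w,
    where w folds the quadrature weights into phi."""
    w = phi * np.gradient(s_grid)        # quadrature weights times phi(s)
    mean = w @ mu                        # posterior mean of g(x)
    var = w @ Sigma @ w                  # posterior variance of g(x)
    return mean + np.sqrt(beta) * np.sqrt(max(var, 0.0))

s_grid = np.linspace(0, 1, 40)
phi = np.ones_like(s_grid)               # uniform weight: g is the mean response
mu = np.sin(np.pi * s_grid)              # illustrative posterior mean function
Sigma = 0.1 * np.exp(-(s_grid[:, None] - s_grid[None, :]) ** 2 / 0.05)

score = scalarize_ucb(mu, Sigma, phi, s_grid, beta=2.0)
print(round(float(score), 3))
```

Because g is a linear functional of a GP, the scalarized process is itself a GP, so the usual UCB machinery applies unchanged.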
Rigorous theoretical analysis establishes that the FFGP model defines a proper Gaussian measure and is measurable, under trace-class and boundedness conditions on the operator. The truncated spectral expansion converges to the true posterior at rate O(m⁻¹), and the regret of the resulting FFBO algorithm is sub-linear, O*(√T), matching the best-known results for conventional BO in finite-dimensional settings.
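The functional gradient ascent (FGA) step can be sketched by parameterizing the candidate input function in a finite basis and ascending the acquisition gradient over the coefficients. The toy quadratic acquisition below is a stand-in for the paper's scalarized UCB, and the finite-difference gradient is an illustrative assumption; the basis and step size are likewise hypothetical choices.

```python
import numpy as np

t_grid = np.linspace(0, 1, 50)
# Sine basis for the input function x(t) = c @ basis (illustrative choice)
basis = np.array([np.sin((j + 1) * np.pi * t_grid) for j in range(5)])

def acq(c):
    """Toy acquisition: reward closeness of x(t) to sin(pi t).
    A stand-in for the scalarized UCB, chosen so the optimum is known."""
    x = c @ basis
    return -np.mean((x - np.sin(np.pi * t_grid)) ** 2)

def fga(c0, steps=200, lr=0.5, eps=1e-5):
    """Gradient ascent in coefficient space via central finite differences."""
    c = c0.copy()
    for _ in range(steps):
        g = np.zeros_like(c)
        for j in range(len(c)):
            e = np.zeros_like(c)
            e[j] = eps
            g[j] = (acq(c + e) - acq(c - e)) / (2 * eps)
        c += lr * g
    return c

c_star = fga(np.zeros(5))
print(np.round(c_star, 2))  # the first coefficient should approach 1
```

In practice the acquisition gradient would be taken with respect to the functional input itself (or its coefficients) using the FFGP posterior, but the coefficient-space ascent above captures how an infinite-dimensional search is reduced to a tractable finite-dimensional one.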
Empirical Results
On three synthetic benchmarks with known optimal function-valued inputs and outputs, FFBO demonstrates consistently lower regret and improved optima identification compared to FIBO, FOBO, and MTBO baselines, which represent leading techniques for function-valued or multi-task BO but lack the capacity for simultaneous function-on-function modeling.
Figure 1: The optimal input (Left) and each round’s simple regret (Right) across Settings 1–3, illustrating superior convergence dynamics for FFBO versus all baselines.
Additionally, on a metamaterial valve optimization task with stress–strain curve outputs and complex sinusoidal input parametrizations, FFBO efficiently explores the functional input space and rapidly converges to high-quality designs.
Figure 2: Each round’s simple regret in the 3D-printed valve case study, with FFBO demonstrating notably faster convergence and improved robustness.
Extended evaluations, including high-dimensional functional input scenarios and varying observational noise levels, demonstrate persistent robustness and accuracy of FFBO.
Figure 3: The optimal input and regret trajectories over 20 iterations for multi-dimensional functional input experiments (Setting 4), reinforcing FFBO’s scalability and resilience.
Under heightened noise conditions, FFBO’s performance remains stable and outpaces alternatives.
Figure 4: The optimal input (Left) and each round’s simple regret (Right) under increased noise for Settings 1–3, showing FFBO’s noise robustness.
Implications and Future Directions
The methodology introduced in this work extends the frontier of Gaussian process surrogate modeling and Bayesian optimization to truly infinite-dimensional domains, bridging the gap between functional data analysis and global optimization. It lays the foundation for BO in experimental design, control, and calibration tasks where both controls and measurements are natively functional.
Practically, FFBO enables sample-efficient, uncertainty-aware optimization of complex function-valued systems, with direct applicability in advanced manufacturing, computational physiology, and beyond. From a theoretical perspective, the operator-valued kernel techniques and function-to-function modeling strategies may motivate extensions to tensor-valued and manifold-valued domains.
Possible future extensions entail handling non-separable operator-valued kernels, non-Gaussian likelihoods for function-valued outputs, and scalable variants that exploit low-rank or randomized spectral computations for very large functional datasets. Coupling FFBO with deep functional representations (e.g., neural operators) may further broaden its reach into settings with high-dimensional unstructured input and output spaces.
Conclusion
This paper establishes a rigorous and practical Bayesian optimization framework for problems where both inputs and outputs are functions. By defining a function-on-function Gaussian process with separable operator-valued kernels and developing a scalable functional gradient ascent algorithm, the authors enable direct Bayesian optimization in infinite-dimensional spaces. Theoretical and empirical results collectively validate the effectiveness, efficiency, and robustness of FFBO. This work provides a substantial advance for BO methodologies and functional data-driven optimization tasks.