Geometric Generalization of Neural Operators from Kernel Integral Perspective
Abstract: Neural operators are neural network-based surrogate models for approximating solution operators of parametric partial differential equations, enabling efficient many-query computations in science and engineering. Many applications, including engineering design, involve variable and often nonparametric geometries, for which generalization to unseen geometries remains a central practical challenge. In this work, we adopt a kernel integral perspective motivated by classical boundary integral formulations and recast operator learning on variable geometries as the approximation of geometry-dependent kernel operators, potentially with singularities. This perspective clarifies a mechanism for geometric generalization and reveals a direct connection between operator learning and fast kernel summation methods. Leveraging this connection, we propose a multiscale neural operator inspired by Ewald summation for learning and efficiently evaluating unknown kernel integrals, and we provide theoretical accuracy guarantees for the resulting approximation. Numerical experiments demonstrate robust generalization across diverse geometries for several commonly used kernels and for a large-scale three-dimensional fluid dynamics example.
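The abstract invokes Ewald summation as the inspiration for the multiscale architecture. As a point of reference (not the paper's actual construction, whose details are not given here), the classical Ewald idea splits a singular kernel such as 1/r into a rapidly decaying short-range part and a smooth long-range part, each of which can be approximated efficiently at its own scale. A minimal sketch of that decomposition, with hypothetical function names:

```python
import math

def ewald_split(r, beta=1.0):
    """Illustrative Ewald-style splitting of the kernel 1/r.

    Returns a short-range component erfc(beta*r)/r, which decays
    quickly and can be handled by local summation, and a long-range
    component erf(beta*r)/r, which is smooth (bounded as r -> 0)
    and can be resolved cheaply on a coarse global scale.
    """
    short = math.erfc(beta * r) / r   # localized near-field part
    long_ = math.erf(beta * r) / r    # smooth far-field part
    return short, long_

# The two components recombine exactly to the original singular kernel.
for r in (0.1, 1.0, 5.0):
    s, l = ewald_split(r)
    assert abs((s + l) - 1.0 / r) < 1e-12
```

The splitting parameter `beta` trades off how much work falls on the local versus the global scale; a learned operator can exploit the same structure by dedicating separate components to the singular near-field and the smooth far-field of an unknown geometry-dependent kernel.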