PAS-Net: Physics-informed Adaptive Scale Deep Operator Network
Abstract: Nonlinear physical phenomena often show complex multiscale interactions; motivated by the principles of multiscale modeling in scientific computing, we propose PAS-Net, a physics-informed Adaptive-Scale Deep Operator Network for learning solution operators of nonlinear and singularly perturbed evolution PDEs with small parameters and localized features. Specifically, PAS-Net augments the trunk input of the physics-informed Deep Operator Network (PI-DeepONet) with a prescribed (or learnable) locally rescaled coordinate transformation centered at reference points. This addition introduces a multiscale feature embedding that acts as an architecture-independent preconditioner, improving the representation of localized, stiff, and multiscale dynamics. From an optimization perspective, the adaptive-scale embedding in PAS-Net modifies the geometry of the Neural Tangent Kernel (NTK) associated with the neural network by increasing its smallest eigenvalue, which in turn improves spectral conditioning and accelerates gradient-based convergence. We further show that this adaptive-scale mechanism explicitly accelerates neural network training when approximating functions with steep transitions and strong asymptotic behavior, and we provide a rigorous proof of this function-approximation result within the finite-dimensional NTK matrix framework. We test the proposed PAS-Net on three different problems: (i) the one-dimensional viscous Burgers equation, (ii) a nonlinear diffusion-reaction system with sharp spatial gradients, and (iii) a two-dimensional eikonal equation. The numerical results show that PAS-Net consistently achieves higher accuracy and faster convergence than the standard DeepONet and PI-DeepONet models at a similar training cost.
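To make the core idea concrete, the following is a minimal sketch (not the authors' implementation) of the adaptive-scale trunk-input augmentation described above: each spatial coordinate is supplemented with locally rescaled copies z_k = (x - x_ref_k) / eps_k centered at prescribed reference points, and the augmented vector is fed to an otherwise ordinary trunk network. The reference points `x_ref`, scales `eps`, and the tiny MLP trunk are illustrative assumptions.

```python
import numpy as np

def adaptive_scale_features(x, x_ref, eps):
    """Augment trunk coordinates with locally rescaled copies.

    x     : (N, d) raw query coordinates
    x_ref : (K, d) reference points where sharp features are expected
    eps   : (K,)   small scale parameters (could be learnable)

    Returns an (N, d*(1+K)) array [x, (x - x_ref_1)/eps_1, ...],
    a multiscale feature embedding prepended to the trunk net.
    """
    x = np.atleast_2d(x)
    scaled = [(x - r) / e for r, e in zip(x_ref, eps)]
    return np.concatenate([x] + scaled, axis=1)

def trunk(z, W1, b1, W2, b2):
    """A plain two-layer tanh MLP trunk acting on augmented inputs."""
    h = np.tanh(z @ W1 + b1)
    return h @ W2 + b2

if __name__ == "__main__":
    # 1D example: a boundary-layer-like region near x_ref = 0.5
    x = np.linspace(0.0, 1.0, 5).reshape(-1, 1)
    feats = adaptive_scale_features(x, x_ref=np.array([[0.5]]),
                                    eps=np.array([1e-2]))
    print(feats.shape)        # each point now carries a "zoomed" coordinate
```

The rescaled coordinate varies by O(1) across an O(eps) neighborhood of the reference point, which is how the embedding exposes steep transitions to the network at a trainable scale.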