Neural Networks are Integrable
Published 22 Oct 2023 in math.NA and cs.NA | (2310.14394v2)
Abstract: In this study, we explore the integration of Neural Networks, a powerful class of functions known for their exceptional approximation capabilities. Our primary emphasis is on integrating multi-layer Neural Networks, a challenging task within this domain. To tackle this challenge, we introduce a novel numerical method consisting of a forward algorithm and a corrective procedure. Our experimental results demonstrate the accuracy achieved by our integration approach.
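The abstract does not detail the paper's forward algorithm or corrective procedure, but a minimal sketch can illustrate why neural-network integration is tractable in the single-hidden-layer case (and hence why the multi-layer case is the interesting challenge): a one-dimensional ReLU network is piecewise linear, so its definite integral has a closed form that can be checked against dense quadrature. All function names below are illustrative, not from the paper.

```python
import numpy as np

def relu_term_integral(w, b, lo, hi):
    """Closed-form integral of max(w*x + b, 0) over [lo, hi]."""
    if w == 0.0:
        return max(b, 0.0) * (hi - lo)
    x0 = -b / w  # breakpoint where the ReLU switches on/off
    if w > 0:
        l, r = max(lo, x0), hi   # active region lies to the right of x0
    else:
        l, r = lo, min(hi, x0)   # active region lies to the left of x0
    if l >= r:
        return 0.0
    # Integral of the linear piece w*x + b over [l, r]
    return 0.5 * w * (r**2 - l**2) + b * (r - l)

def integrate_relu_net(w, b, v, c, lo, hi):
    """Integral over [lo, hi] of f(x) = sum_i v_i * relu(w_i*x + b_i) + c."""
    return sum(vi * relu_term_integral(wi, bi, lo, hi)
               for wi, bi, vi in zip(w, b, v)) + c * (hi - lo)

# Random small network; verify against dense trapezoidal quadrature.
rng = np.random.default_rng(0)
w, b, v = rng.normal(size=8), rng.normal(size=8), rng.normal(size=8)
c, lo, hi = 0.3, -2.0, 2.0
exact = integrate_relu_net(w, b, v, c, lo, hi)
xs = np.linspace(lo, hi, 200001)
f = (np.maximum(w * xs[:, None] + b, 0.0) @ v) + c
numeric = np.trapz(f, xs)
print(abs(exact - numeric))  # agrees to quadrature accuracy
```

For deeper networks the breakpoints of later layers depend nonlinearly on earlier ones, so no comparably simple closed form exists; the paper's numerical forward/corrective scheme targets exactly that regime.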