
Neural Networks are Integrable

Published 22 Oct 2023 in math.NA and cs.NA (arXiv:2310.14394v2)

Abstract: In this study, we explore the integration of neural networks, a powerful class of functions known for their exceptional approximation capabilities. Our primary emphasis is on integrating multi-layer neural networks, a challenging task within this domain. To tackle this challenge, we introduce a novel numerical method that consists of a forward algorithm and a corrective procedure. Our experimental results demonstrate the accuracy of this integration approach.
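The abstract names a forward algorithm with a corrective procedure but gives no details here, so the paper's method is not reproduced below. As a point of comparison only, a small one-dimensional ReLU network can be integrated with standard composite trapezoidal quadrature; the network shapes, weights, interval, and grid size in this sketch are all illustrative assumptions, not anything from the paper:

```python
import numpy as np

# Baseline illustration (NOT the paper's method): integrate a small
# two-layer ReLU network f: R -> R over [a, b] with the composite
# trapezoidal rule. All shapes and values below are arbitrary choices.

rng = np.random.default_rng(0)
W1 = rng.normal(size=(8, 1))   # hidden-layer weights, 8 units
b1 = rng.normal(size=8)        # hidden-layer biases
W2 = rng.normal(size=(1, 8))   # output-layer weights
b2 = rng.normal(size=1)        # output-layer bias

def mlp(x):
    """Evaluate the two-layer ReLU network at a 1-D array of inputs x."""
    h = np.maximum(W1 @ np.atleast_2d(x) + b1[:, None], 0.0)  # ReLU hidden layer
    return (W2 @ h + b2[:, None]).ravel()                     # linear output

a, b, n = -1.0, 1.0, 10_001
xs = np.linspace(a, b, n)
ys = mlp(xs)
# Composite trapezoidal rule: sum of trapezoid areas between grid points.
integral = np.sum((ys[1:] + ys[:-1]) / 2 * np.diff(xs))
print(integral)
```

Because a ReLU network is piecewise linear, trapezoidal quadrature on a fine grid is nearly exact for this baseline; the paper's contribution presumably targets the harder multi-layer setting where such simple rules do not suffice.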
