The Neural Process Family: Survey, Applications and Perspectives
Abstract: Standard neural networks offer powerful function approximation but are limited in their ability to learn meta-representations and to reason about the probabilistic uncertainty of their predictions. Gaussian processes, on the other hand, adopt a Bayesian learning scheme to estimate such uncertainties but are constrained in their efficiency and approximation capacity. The Neural Process Family (NPF) aims to offer the best of both worlds by leveraging neural networks to meta-learn predictive uncertainties. This potential has attracted substantial research activity to the family in recent years, making a comprehensive survey of NPF models necessary to organize and relate their motivations, methodologies, and experiments. This paper addresses that gap while digging deeper into the formulations, research themes, and applications of the family members. We shed light on the family's potential to bring several recent advances from other deep learning domains under one umbrella. We then provide a rigorous taxonomy of the family and empirically demonstrate its members' capabilities for modeling data-generating functions on 1-d, 2-d, and 3-d input domains. We conclude by discussing our perspectives on promising directions that can fuel further research in the field. Code for our experiments will be made available at https://github.com/srvCodes/neural-processes-survey.
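To make the family's core idea concrete, the sketch below shows a minimal conditional neural process in the spirit of Garnelo et al. (2018): a permutation-invariant encoder summarizes the context set into a single representation, and a decoder maps that summary plus each target input to a predictive Gaussian. This is an illustrative sketch, not the implementation from the survey's repository; all layer widths, the noise-scale parameterization, and the toy task batch are assumptions chosen for brevity.

```python
import torch
import torch.nn as nn

class ConditionalNeuralProcess(nn.Module):
    """Minimal CNP: DeepSets-style encoder over the context set,
    decoder predicting a Gaussian at each target input."""

    def __init__(self, x_dim=1, y_dim=1, r_dim=128):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(x_dim + y_dim, r_dim), nn.ReLU(),
            nn.Linear(r_dim, r_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(r_dim + x_dim, r_dim), nn.ReLU(),
            nn.Linear(r_dim, 2 * y_dim),  # predictive mean and raw scale
        )

    def forward(self, x_ctx, y_ctx, x_tgt):
        # Encode each (x, y) context pair, then mean-pool across the set:
        # the aggregation makes the summary invariant to context ordering.
        r = self.encoder(torch.cat([x_ctx, y_ctx], dim=-1)).mean(dim=1)
        # Broadcast the set summary to every target location and decode.
        r = r.unsqueeze(1).expand(-1, x_tgt.shape[1], -1)
        out = self.decoder(torch.cat([r, x_tgt], dim=-1))
        mean, raw_scale = out.chunk(2, dim=-1)
        # Bounded-below softplus keeps the predictive variance positive.
        sigma = 0.1 + 0.9 * nn.functional.softplus(raw_scale)
        return torch.distributions.Normal(mean, sigma)

# One meta-training step on a toy batch of 16 tasks (10 context and
# 20 target points each): maximize target log-likelihood given contexts.
model = ConditionalNeuralProcess()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x_ctx, y_ctx = torch.randn(16, 10, 1), torch.randn(16, 10, 1)
x_tgt, y_tgt = torch.randn(16, 20, 1), torch.randn(16, 20, 1)
opt.zero_grad()
loss = -model(x_ctx, y_ctx, x_tgt).log_prob(y_tgt).mean()
loss.backward()
opt.step()
```

Because the predictive uncertainty comes out of a single forward pass rather than posterior inference over kernel hyperparameters, prediction scales linearly in the context size, which is the efficiency advantage over Gaussian processes the abstract alludes to.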