
Input Snapshots Fusion for Scalable Discrete-Time Dynamic Graph Neural Networks

Published 11 May 2024 in cs.LG (arXiv:2405.06975v2)

Abstract: In recent years, research on dynamic graph representation learning has surged, primarily focusing on modeling the evolution of temporal-spatial patterns in real-world applications. Within the domain of discrete-time dynamic graphs, however, temporal edges remain underexplored. Existing approaches often rely on additional sequential models to capture dynamics, incurring high computational and memory costs, particularly on large-scale graphs. To address this limitation, we propose the Input Snapshots Fusion based Dynamic Graph Neural Network (SFDyG), which combines Hawkes processes with graph neural networks to effectively capture temporal and structural patterns in dynamic graphs. By fusing multiple snapshots into a single temporal graph, SFDyG decouples computational complexity from the number of snapshots, enabling efficient full-batch and mini-batch training. Experimental evaluations on eight diverse dynamic graph datasets for future link prediction demonstrate that SFDyG consistently outperforms existing methods.
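The snapshot-fusion idea in the abstract can be sketched in a few lines: merge the edges of k discrete snapshots into one temporal multigraph, then weight each temporal edge with a Hawkes-style exponential time decay so that recent interactions contribute more during aggregation. This is a minimal illustration of the concept, not the paper's actual architecture; the function names (`fuse_snapshots`, `hawkes_weight`, `aggregate`) and the decay parameter `delta` are hypothetical.

```python
import math
from collections import defaultdict

def fuse_snapshots(snapshots):
    """Merge snapshots (lists of (src, dst) edges) into one list of
    (src, dst, t) temporal edges, where t is the snapshot index."""
    return [(u, v, t) for t, edges in enumerate(snapshots) for (u, v) in edges]

def hawkes_weight(t_edge, t_now, delta=1.0):
    """Exponential decay kernel: more recent edges excite more strongly."""
    return math.exp(-delta * (t_now - t_edge))

def aggregate(temporal_edges, features, t_now, delta=1.0):
    """One decay-weighted mean aggregation step over the fused graph
    (a scalar stand-in for a GNN message-passing layer)."""
    num = defaultdict(float)
    den = defaultdict(float)
    for u, v, t in temporal_edges:
        w = hawkes_weight(t, t_now, delta)
        num[v] += w * features[u]
        den[v] += w
    return {v: num[v] / den[v] for v in num}

# Three snapshots over nodes {0, 1, 2}; node 2 receives edges in each snapshot.
snaps = [[(0, 2)], [(1, 2)], [(0, 2)]]
feats = {0: 1.0, 1: 3.0, 2: 0.0}
fused = fuse_snapshots(snaps)  # one pass over 3 temporal edges, regardless of k
out = aggregate(fused, feats, t_now=3, delta=0.5)
```

Because the fused graph is processed in a single pass, the cost of aggregation depends on the total number of temporal edges rather than on running a separate model per snapshot, which is the decoupling the abstract claims.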

