Building a Graph-based Deep Learning network model from captured traffic traces
Abstract: Current state-of-the-art network models are based on Discrete Event Simulation (DES). While DES is highly accurate, it is also computationally costly and cumbersome to parallelize, making it impractical for simulating high-performance networks. Additionally, simulated scenarios fail to capture all of the complexities present in real network scenarios. While there exist network models based on Machine Learning (ML) techniques that mitigate these issues, they are also trained with simulated data and hence vulnerable to the same pitfalls. Consequently, the Graph Neural Networking Challenge 2023 introduces a dataset of captured traffic traces that can be used to build an ML-based network model without these limitations. In this paper we propose a Graph Neural Network (GNN)-based solution specifically designed to better capture the complexities of real network scenarios. This is achieved through a novel encoding method that captures information from the sequence of captured packets, and an improved message-passing algorithm that better represents the dependencies present in physical networks. We show that the proposed solution is able to learn and generalize to unseen captured network scenarios.
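To illustrate the kind of message-passing scheme the abstract refers to, the following is a minimal sketch of a RouteNet-style iteration in which path states are built from the links they traverse and link states are updated from the paths that cross them. All function names, the `tanh` update, and the sum aggregation are illustrative assumptions, not the architecture proposed in the paper.

```python
import numpy as np

def message_passing(link_state, paths, num_iters=3):
    """Alternate between path updates (aggregate traversed links)
    and link updates (aggregate crossing paths).

    link_state: array of shape (num_links, dim)
    paths: list of paths, each a list of link indices it traverses
    """
    path_state = np.zeros((len(paths), link_state.shape[1]))
    for _ in range(num_iters):
        # Path update: combine the states of the links each path traverses.
        for p, links in enumerate(paths):
            path_state[p] = np.tanh(link_state[links].sum(axis=0))
        # Link update: combine the states of the paths crossing each link.
        new_link = np.zeros_like(link_state)
        for p, links in enumerate(paths):
            for l in links:
                new_link[l] += path_state[p]
        link_state = np.tanh(link_state + new_link)
    return path_state, link_state

# Toy example: 3 links, 2 paths (path 0 uses links 0 and 1; path 1 uses links 1 and 2).
rng = np.random.default_rng(0)
link_state = rng.normal(size=(3, 4))
path_state, link_state = message_passing(link_state, [[0, 1], [1, 2]])
```

In a trained model the `tanh`/sum updates would be replaced by learned functions (e.g. recurrent cells), and a readout network would map the final states to per-path performance metrics.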