Improving Graph Machine Learning Performance Through Feature Augmentation Based on Network Control Theory
Abstract: Network control theory (NCT) offers a robust analytical framework for understanding how network topology shapes dynamic behavior, enabling researchers to decipher how patterns of external control inputs can steer system dynamics toward desired states. Distinguished from other structure-function methodologies, NCT's predictive capabilities can be coupled with graph neural networks (GNNs), which have demonstrated exceptional utility in a variety of network-based learning tasks. However, GNN performance relies heavily on the expressiveness of node features, and a lack of node features can greatly degrade it; many real-world systems provide no node-level information at all, posing a challenge for GNNs. To tackle this challenge, we introduce NCT-based Enhanced Feature Augmentation (NCT-EFA), a novel approach that assimilates average controllability, along with other centrality indices, into the feature augmentation pipeline to enhance GNN performance. Our evaluation of NCT-EFA on six benchmark GNN models, under two experimental settings (employing average controllability alone, and in combination with additional centrality metrics), shows performance improvements of up to 11%. Our results demonstrate that incorporating NCT into feature enrichment can substantively extend the applicability, and heighten the performance, of GNNs in scenarios where node-level information is unavailable.
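The kind of feature augmentation the abstract describes can be sketched as follows. This is an illustrative reconstruction, not the paper's exact pipeline: average controllability is computed here as the trace of the infinite-horizon controllability Gramian of a stabilized linear system driven from each node (a standard NCT formulation), and it is stacked with common centrality indices to form a node feature matrix for a GNN. The function name `nct_features` and the particular choice of centralities are assumptions.

```python
import numpy as np
import networkx as nx
from scipy.linalg import solve_discrete_lyapunov

def nct_features(G):
    """Build an NCT-augmented node feature matrix (illustrative sketch).

    Columns: average controllability, degree, eigenvector centrality,
    closeness centrality.
    """
    n = G.number_of_nodes()
    A = nx.to_numpy_array(G)
    # Scale A so the linear dynamics x(t+1) = A x(t) + e_i u(t) are stable
    # (spectral radius < 1) -- a common normalization in the NCT literature.
    A = A / (1.0 + np.abs(np.linalg.eigvals(A)).max())
    # With single-node input B = e_i, the trace of the infinite-horizon
    # Gramian is sum_k ||A^k e_i||^2 = M_ii, where M solves M = A^T M A + I.
    # scipy's solve_discrete_lyapunov(a, q) returns X with X = a X a^H + q.
    M = solve_discrete_lyapunov(A.T, np.eye(n))
    avg_ctrb = np.diag(M)
    deg = np.array([d for _, d in G.degree()], dtype=float)
    eig = np.array(list(nx.eigenvector_centrality_numpy(G).values()))
    close = np.array(list(nx.closeness_centrality(G).values()))
    return np.column_stack([avg_ctrb, deg, eig, close])

# Usage: features for a graph with no native node attributes.
G = nx.karate_club_graph()
X = nct_features(G)  # shape (34, 4), ready to use as GNN input features
```

Solving one discrete Lyapunov equation yields all per-node average controllability values at once, avoiding a separate Gramian computation for each candidate driver node.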