Node-Level Adaptive Graph Convolutional Neural Network for Node Classification Tasks
WANG Xinlong1, HU Rui1, GUO Yaliang1, DU Hangyuan1, ZHANG Binqi3, WANG Wenjian2,3
1. School of Computer and Information Technology, Shanxi University, Taiyuan 030006; 2. Key Laboratory of Computational Intelligence and Chinese Information Processing of Ministry of Education, Shanxi University, Taiyuan 030006; 3. Department of Network Security, Shanxi Police College, Taiyuan 030401
|
|
Abstract Graph neural networks learn node embeddings by recursively sampling and aggregating information from nodes in a graph. However, existing methods follow relatively fixed patterns for node sampling and aggregation, which inadequately capture the diversity of local structural patterns and thereby degrade model performance. To solve this problem, a node-level adaptive graph convolutional neural network (NA-GCN) is proposed. A sampling strategy based on node importance is designed to adaptively determine the neighborhood size of each node, and an aggregation strategy based on the self-attention mechanism is presented to adaptively fuse the node information within a given neighborhood. Experimental results on multiple benchmark graph datasets show the superiority of NA-GCN in node classification tasks.
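The two ideas in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the importance measure (node degree), the neighborhood-sizing rule, and the dot-product attention scores are all illustrative assumptions standing in for the unspecified details of NA-GCN.

```python
# Hypothetical sketch: (1) choose each node's neighborhood size from an
# importance signal (here, degree), (2) fuse the selected neighbors'
# features with a softmax self-attention over dot-product scores.
import math

def adaptive_neighborhood(adj, node, max_k=3):
    """Importance-based sampling: rank neighbors by degree, keep a
    node-dependent number of them (sizing rule is an assumption)."""
    ranked = sorted(adj[node], key=lambda v: len(adj[v]), reverse=True)
    k = min(max_k, max(1, len(adj[node]) // 2 + 1))
    return ranked[:k]

def attention_aggregate(x, node, neighborhood):
    """Softmax-weighted sum of neighbor features, weighted by the
    dot product between the center node and each neighbor."""
    scores = [sum(a * b for a, b in zip(x[node], x[v])) for v in neighborhood]
    m = max(scores)                       # subtract max for stability
    weights = [math.exp(s - m) for s in scores]
    z = sum(weights)
    weights = [w / z for w in weights]
    dim = len(x[node])
    return [sum(w * x[v][d] for w, v in zip(weights, neighborhood))
            for d in range(dim)]

# Toy graph: edges 0-1, 0-2, 1-2, 2-3, with 2-D node features.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
x = {0: [1.0, 0.0], 1: [0.9, 0.1], 2: [0.0, 1.0], 3: [0.5, 0.5]}
nbrs = adaptive_neighborhood(adj, 2)      # node 2 keeps its 2 highest-degree neighbors
h2 = attention_aggregate(x, 2, nbrs)      # its new embedding
```

In a full model this per-node sampling and attention step would replace the fixed-size neighborhood and uniform mean used by methods such as GraphSAGE, and would be applied layer by layer with learnable projections.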
|
Received: 09 January 2024
|
|
Fund: National Natural Science Foundation of China (No. U21A20513, 62076154), Key Research and Development Program of Shanxi Province (No. 202202020101003, 202302010101007), Fundamental Research Program of Shanxi Province (No. 202303021221055)
Corresponding Author:
WANG Wenjian, Ph.D., professor. Her research interests include machine learning, data mining and computational intelligence.
|
About the authors: WANG Xinlong, Master student. His research interests include machine learning and graph neural networks. HU Rui, Ph.D. candidate. His research interests include graph representation learning and point cloud data analysis. GUO Yaliang, Master student. His research interests include machine learning and graph neural networks. DU Hangyuan, Ph.D., associate professor. His research interests include graph representation learning. ZHANG Binqi, Master, lecturer. Her research interests include machine learning and data mining.
|
|
|
|
|
|