Graph Neural Network Classifier Based on Decoupled Label Propagation and Multi-node Mixup Regularization
HE Wenwu1,2, LIU Xiaoyu1, MAO Guojun1,2
1. School of Computer Science and Mathematics, Fujian University of Technology, Fuzhou 350118; 2. Fujian Provincial Key Laboratory of Big Data Mining and Applications, Fujian University of Technology, Fuzhou 350118
Abstract Graph neural network-distilled multilayer perceptrons (MLPs) balance inference performance and efficiency in graph-related tasks to some extent. However, MLPs treat graph nodes independently and struggle to explicitly capture the neighborhood information of target nodes, which limits their inference performance. To solve this problem, a graph neural network classifier based on decoupled label propagation and multi-node mixup regularization (DLPMMR) is proposed. DLPMMR trains the MLP classifier under a knowledge distillation framework to ensure basic inference performance at high inference efficiency. During the training phase, a naive and hyperparameter-free double combination strategy is employed for multi-node mixup to enhance node diversity, and a mixup regularization term is constructed to explicitly control the complexity of the MLP, thereby improving its generalization ability and robustness. During the inference phase, label propagation is introduced to incorporate the missing neighborhood information into the MLP predictions. By decoupling target nodes from their neighboring nodes, the influence of neighbor information on the classification decision of the target node is effectively regulated, further enhancing the inference accuracy of the MLP. Experiments on five benchmark graph node classification datasets demonstrate that DLPMMR achieves strong robustness and superior performance.
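The two mechanisms summarized above can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the exact double combination scheme and the propagation weights are not specified in the abstract, so the fixed 0.5 averaging weights, the blending coefficient `beta`, and the number of propagation steps are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def double_mixup(X, Y):
    """Multi-node mixup via a naive double combination (illustrative sketch).

    A first pass averages random node pairs; a second pass averages random
    pairs of the mixed nodes, so each virtual node blends up to four
    original nodes without introducing a Beta mixing hyperparameter.
    """
    def mix(Xa, Ya):
        perm = rng.permutation(Xa.shape[0])
        # Fixed 0.5/0.5 averaging is an assumption standing in for the
        # paper's hyperparameter-free combination rule.
        return 0.5 * (Xa + Xa[perm]), 0.5 * (Ya + Ya[perm])
    X1, Y1 = mix(X, Y)      # first combination: pairs of nodes
    X2, Y2 = mix(X1, Y1)    # second combination: pairs of mixed nodes
    return X2, Y2

def decoupled_label_propagation(A, mlp_probs, beta=0.3, steps=2):
    """Blend MLP predictions with propagated neighbor predictions.

    A is the adjacency matrix WITHOUT self-loops, so the propagated term
    carries only neighborhood information and the target node stays
    decoupled from it; beta (hypothetical) regulates how much neighbor
    information enters the final decision.
    """
    deg = A.sum(axis=1, keepdims=True)
    A_norm = A / np.maximum(deg, 1)   # row-normalized adjacency
    neigh = mlp_probs
    for _ in range(steps):
        neigh = A_norm @ neigh        # propagate neighbor predictions
    return (1 - beta) * mlp_probs + beta * neigh
```

With row-stochastic `mlp_probs` and no isolated nodes, the blended output remains a valid class distribution per node; setting `beta = 0` recovers the plain MLP prediction, making the neighbor influence explicitly controllable.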
Received: 07 April 2025
Fund: National Natural Science Foundation of China (No. 41971340), Natural Science Foundation of Fujian Province (No. 2024J01158)
Corresponding Author:
HE Wenwu, Ph.D., professor. His research interests include trustworthy artificial intelligence and graph neural networks.
About authors: LIU Xiaoyu, Master student. Her research interests include artificial intelligence and graph neural networks. MAO Guojun, Ph.D., professor. His research interests include artificial intelligence, data mining and distributed computing.