Pattern Recognition and Artificial Intelligence
Pattern Recognition and Artificial Intelligence  2024, Vol. 37 Issue (4): 328-338    DOI: 10.16451/j.cnki.issn1003-6059.202404004
Researches and Applications
Enhanced Residual Networks via Mixed Knowledge Fraction
TANG Shengji1, YE Peng2, LIN Weihao3, CHEN Tao3
1. School of Information Science and Technology, Fudan University, Shanghai 200433

Abstract  

Methods such as stimulative training and group knowledge based training collect group knowledge from the shallow subnets of residual networks for self-distillation, thereby enhancing network performance. However, the group knowledge acquired by these methods updates slowly and is difficult to combine with DataMix techniques. To address these issues, enhanced residual networks via mixed knowledge fraction (MKF) are proposed. The decomposition of mixed knowledge is modeled as a quadratic program that minimizes the fraction loss, so that high-quality group knowledge is recovered from the mixed knowledge. To improve the robustness and diversity of the knowledge, a compound DataMix technique is proposed to construct a composite data augmentation method. In contrast to high-precision but inefficient optimization algorithms, a simple and efficient linear knowledge fraction technique is designed: the previous group knowledge serves as knowledge bases, and the mixed knowledge is decomposed on these bases. The enhanced group knowledge is then adopted to distill the sampled subnetworks. Experiments on mainstream residual networks and classification datasets verify the effectiveness of MKF.
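
To make the linear knowledge fraction step above concrete, the short Python sketch below decomposes mixed logits onto a small set of stored knowledge bases by unconstrained least squares with a pseudo-inverse. It is an illustrative assumption rather than the authors' implementation: the function names, tensor shapes and the unconstrained solver are invented for the example, and the paper's constrained quadratic program and compound DataMix pipeline are omitted.

# Minimal sketch (assumed names and shapes, not the authors' code) of linear knowledge
# fraction: mixed knowledge produced on DataMix-augmented inputs is decomposed onto
# previous group-knowledge logits ("knowledge bases") by least squares, i.e. the
# unconstrained form of the quadratic program obtained from the fraction loss.
import torch

def linear_knowledge_fraction(mixed_logits: torch.Tensor,
                              knowledge_bases: torch.Tensor) -> torch.Tensor:
    # mixed_logits:    (B, C) logits of the network on mixed (e.g., Mixup-style) inputs.
    # knowledge_bases: (K, C) previous group-knowledge logits used as bases.
    # Returns (B, K) coefficients minimizing ||coeffs @ bases - mixed_logits||^2.
    gram = knowledge_bases @ knowledge_bases.T                           # (K, K)
    coeffs = mixed_logits @ knowledge_bases.T @ torch.linalg.pinv(gram)  # (B, K)
    return coeffs

def reconstruct_group_knowledge(coeffs: torch.Tensor,
                                knowledge_bases: torch.Tensor) -> torch.Tensor:
    # Enhanced group knowledge reassembled from the decomposition; it would then serve
    # as the soft target when distilling the sampled subnetworks.
    return coeffs @ knowledge_bases                                      # (B, C)

if __name__ == "__main__":
    torch.manual_seed(0)
    bases = torch.randn(4, 10)                      # 4 stored bases over 10 classes
    mixed = 0.7 * bases[0] + 0.3 * bases[2] + 0.05 * torch.randn(10)
    coeffs = linear_knowledge_fraction(mixed.unsqueeze(0), bases)
    print(coeffs)                                   # roughly [0.7, 0, 0.3, 0] up to noise

A closed-form solve of this kind needs only one small matrix inversion per batch, which is consistent with the abstract's emphasis on avoiding high-precision but inefficient optimization algorithms.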

Key words: Deep Learning; Neural Network; Knowledge Distillation; Network Enhancement; Residual Network
Received: 25 March 2024     
Chinese Library Classification (ZTFLH): TP 37
Fund: National Key Research and Development Program of China (No.2022ZD0160100), National Natural Science Foundation of China (No.62071127, 62101137), Shanghai Natural Science Foundation (No.23ZR1402900), Shanghai Municipal Science and Technology Major Project (No.2021SHZDZX0103)

Corresponding Author: CHEN Tao, Ph.D., professor. His research interests include computer vision and machine learning.
About the authors: TANG Shengji, Master's student. His research interests include deep learning, model efficiency, and model design and enhancement. YE Peng, Ph.D. candidate. His research interests include computer science, model design and optimization, and artificial intelligence for science. LIN Weihao, Ph.D. candidate. His research interests include computer vision, image recognition, video processing and model compression.
Cite this article:   
TANG Shengji, YE Peng, LIN Weihao, et al. Enhanced Residual Networks via Mixed Knowledge Fraction[J]. Pattern Recognition and Artificial Intelligence, 2024, 37(4): 328-338.
URL: http://manu46.magtech.com.cn/Jweb_prai/EN/10.16451/j.cnki.issn1003-6059.202404004 or http://manu46.magtech.com.cn/Jweb_prai/EN/Y2024/V37/I4/328