Deep Incremental Image Classification Method Based on Double-Branch Iteration
HE Li, HAN Keping, ZHU Hongxi, LIU Ying
School of Science and Technology, Tianjin University of Finance and Economics, Tianjin 300222
|
|
Abstract: To address the catastrophic forgetting problem in incremental learning, a deep incremental image classification method based on double-branch iteration is proposed. The primary network stores the acquired old-class knowledge, while the branch network learns the new-class knowledge, with the parameters of the branch network optimized from the weights of the primary network during each incremental iteration. Density peak clustering is employed to select typical samples from the iterative dataset and construct a retention set, which is added into the incremental training to mitigate catastrophic forgetting. Experiments demonstrate the superior performance of the proposed method.
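The exemplar-selection step of the abstract — picking typical samples via density peak clustering to build the retention set — can be sketched as follows. This is a minimal illustration of Rodriguez and Laio's density-peak scoring applied to feature vectors, not the authors' implementation; the function name `select_exemplars` and the parameters `d_c` (density cutoff) and `m` (retention-set size) are assumptions for the sketch.

```python
import numpy as np

def select_exemplars(features, d_c=0.5, m=4):
    """Select m typical samples by density-peak scoring.

    rho_i  : number of other samples within cutoff distance d_c (local density).
    delta_i: distance to the nearest sample of higher density
             (for the global density peak, the maximum distance instead).
    Samples with large rho_i * delta_i sit at cluster centres and
    serve as typical exemplars for the retention set.
    """
    n = len(features)
    # Pairwise Euclidean distances, shape (n, n).
    dist = np.linalg.norm(features[:, None, :] - features[None, :, :], axis=-1)
    rho = (dist < d_c).sum(axis=1) - 1  # subtract 1 to exclude the sample itself
    delta = np.empty(n)
    for i in range(n):
        higher = np.where(rho > rho[i])[0]
        delta[i] = dist[i, higher].min() if len(higher) else dist[i].max()
    score = rho * delta
    return np.argsort(score)[::-1][:m]  # indices of the m highest-scoring samples
```

On two well-separated clusters, the top-scoring samples are the cluster centres, so the retention set covers each old class with its most representative examples rather than random ones.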
|
Received: 23 October 2019
|
|
Fund: Supported by National Natural Science Foundation of China (No. 1170011574, 61502331), Natural Science Foundation of Tianjin (No. 16JCYBJC42000, 18JCYBJC85100), Tianjin Enterprise Technology Commissioner Project (No. 19JCTPJC56300), and Scientific Research Plan Project of Tianjin Municipal Education Commission (No. 2017KJ237)
Corresponding Author: HE Li, Ph.D., professor. Her research interests include data mining and machine learning.
|
About Authors: HAN Keping, master student. Her research interests include machine learning and incremental learning. ZHU Hongxi, master student. Her research interests include machine learning and image processing. LIU Ying, master student. Her research interests include data mining and machine learning.
|
|
|
|
|
|