Abstract: The diversity among base classifiers is crucial for ensemble learning, and intuitively, resampling pairwise constraints yields better diversity than resampling instances. Supervision information in the form of pairwise constraints is introduced into the feature extraction of samples to generate new training data based on canonical correlation analysis (CCA). In the proposed algorithm, the spirit of ensemble learning is embodied in the way the constraints are selected: the constraints are randomly resampled to obtain diverse base classifiers on multi-view data. Experiments are carried out on the Multiple Features database and on the Yale and AR face databases, and the results show that the proposed ensemble method achieves better performance than conventional ensemble learning methods.
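To make the pipeline concrete, the following minimal sketch illustrates the general idea described above: randomly resample must-link constraints, fit a CCA on cross-view constraint pairs, train a base classifier on the projected features, and combine the base classifiers by majority vote. It is an approximation under stated assumptions, not the authors' implementation: standard CCA from scikit-learn stands in for the paper's constraint-based CCA, a 1-NN is an arbitrary choice of base classifier, and all function names (sample_must_link_constraints, fit_base_learner, ensemble_predict) and the synthetic two-view data are hypothetical placeholders for the Multiple Features, Yale, and AR experiments.

```python
# Illustrative sketch only: resampling pairwise (must-link) constraints to
# build diverse CCA-based base classifiers on two-view data, combined by
# majority vote. Standard CCA replaces the paper's constraint-based CCA.
import numpy as np
from sklearn.cross_decomposition import CCA
from sklearn.neighbors import KNeighborsClassifier


def sample_must_link_constraints(y, n_constraints, rng):
    """Randomly draw index pairs (i, j) with the same label (must-link)."""
    pairs = []
    while len(pairs) < n_constraints:
        i, j = rng.choice(len(y), size=2, replace=False)
        if y[i] == y[j]:
            pairs.append((i, j))
    return pairs


def fit_base_learner(X1, X2, y, pairs, n_components=5):
    """Fit CCA on cross-view constraint pairs, then a 1-NN on projected data."""
    idx1 = [i for i, _ in pairs]
    idx2 = [j for _, j in pairs]
    cca = CCA(n_components=n_components)
    cca.fit(X1[idx1], X2[idx2])        # correlate view 1 of i with view 2 of j
    Z1, Z2 = cca.transform(X1, X2)     # project both full views
    Z = np.hstack([Z1, Z2])            # combined feature representation
    clf = KNeighborsClassifier(n_neighbors=1).fit(Z, y)
    return cca, clf


def ensemble_predict(models, X1, X2):
    """Majority vote over the base classifiers."""
    votes = []
    for cca, clf in models:
        Z1, Z2 = cca.transform(X1, X2)
        votes.append(clf.predict(np.hstack([Z1, Z2])))
    votes = np.stack(votes)            # shape: (n_models, n_samples)
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)


# Usage on synthetic two-view data (placeholder for the real databases):
rng = np.random.default_rng(0)
y = rng.integers(0, 3, size=120)
X1 = rng.normal(size=(120, 20)) + y[:, None]   # view 1
X2 = rng.normal(size=(120, 15)) + y[:, None]   # view 2
models = []
for _ in range(10):                            # 10 base classifiers
    pairs = sample_must_link_constraints(y, n_constraints=60, rng=rng)
    models.append(fit_base_learner(X1, X2, y, pairs))
pred = ensemble_predict(models, X1, X2)
print("training accuracy:", (pred == y).mean())
```

Because each base learner sees a different random subset of constraints, the CCA projections (and hence the classifiers) differ, which is the source of diversity the abstract attributes to constraint resampling.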
GUO Yun, ZHANG Dao-Qiang, SONG Tong. An Ensemble Learning Method Based on CCA with Pairwise Constraints. Pattern Recognition and Artificial Intelligence, 2012, 25(5): 851-858.