Parallel Extreme Learning Machine Based on Improved Particle Swarm Optimization
LI Wanhua1,2,3, CHEN Yuzhong1,2,3, GUO Kun1,2,3, GUO Songrong1,2,3, LIU Zhanghui1,2
1. College of Mathematics and Computer Science, Fuzhou University, Fuzhou 350116; 2. Fujian Provincial Key Laboratory of Network Computing and Intelligent Information Processing, Fuzhou University, Fuzhou 350116; 3. Fujian Collaborative Innovation Center for Big Data Applications in Governments, Fuzhou 350003
Abstract: To improve the stability of the extreme learning machine (ELM), an extreme learning machine based on improved particle swarm optimization (IPSO-ELM) is proposed. By combining improved particle swarm optimization with ELM, IPSO-ELM finds the optimal number of hidden-layer nodes as well as the optimal input weights and hidden biases. Furthermore, a mutation operator is introduced into IPSO-ELM to enhance the diversity of the swarm and improve the convergence speed of the random search process. Then, to handle large-scale electrical load data, a parallel version of IPSO-ELM named PIPSO-ELM is implemented on the popular parallel computing framework Spark. Experimental results on real-life electrical load data show that PIPSO-ELM achieves better stability and scalability with higher efficiency in large-scale electrical load prediction.
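The following is a minimal, illustrative sketch of the core idea the abstract describes: an ELM whose input weights and hidden biases are searched by a PSO loop with a simple mutation step, scored by validation error. It is not the paper's PIPSO-ELM implementation; the fitness definition, PSO constants, mutation scheme, and all function names here are assumptions for illustration, and the Spark parallelization is omitted.

```python
# Sketch of PSO-optimized ELM training (illustrative only; constants and
# fitness definition are assumed, not taken from the paper).
import numpy as np

def elm_output_weights(X, y, W, b):
    """Solve ELM output weights by least squares for fixed hidden parameters."""
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))   # sigmoid hidden-layer activations
    return np.linalg.pinv(H) @ y             # Moore-Penrose pseudoinverse solution

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

def fitness(params, X_tr, y_tr, X_val, y_val, n_in, n_hidden):
    """Validation RMSE of the ELM encoded by one particle (input weights + biases)."""
    W = params[: n_in * n_hidden].reshape(n_in, n_hidden)
    b = params[n_in * n_hidden:]
    beta = elm_output_weights(X_tr, y_tr, W, b)
    err = elm_predict(X_val, W, b, beta) - y_val
    return np.sqrt(np.mean(err ** 2))

def pso_elm(X_tr, y_tr, X_val, y_val, n_hidden=20, n_particles=30, n_iter=50,
            w=0.7, c1=1.5, c2=1.5, mutation_rate=0.05, seed=0):
    rng = np.random.default_rng(seed)
    n_in = X_tr.shape[1]
    dim = n_in * n_hidden + n_hidden
    pos = rng.uniform(-1, 1, (n_particles, dim))
    vel = np.zeros((n_particles, dim))
    pbest = pos.copy()
    pbest_fit = np.array([fitness(p, X_tr, y_tr, X_val, y_val, n_in, n_hidden)
                          for p in pos])
    gbest = pbest[np.argmin(pbest_fit)].copy()

    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, -1, 1)
        # Mutation step: reset a small fraction of dimensions to keep the
        # swarm diverse, in the spirit of the mutation operator in the abstract.
        mask = rng.random(pos.shape) < mutation_rate
        pos[mask] = rng.uniform(-1, 1, mask.sum())
        fit = np.array([fitness(p, X_tr, y_tr, X_val, y_val, n_in, n_hidden)
                        for p in pos])
        improved = fit < pbest_fit
        pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
        gbest = pbest[np.argmin(pbest_fit)].copy()
    return gbest, pbest_fit.min()
```

In a Spark-based variant such as the PIPSO-ELM described above, the per-particle fitness evaluations, which dominate the cost on large load datasets, would be the natural unit to distribute across the cluster, with only the best positions collected back to the driver at each iteration.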