Daily Behavior Recognition with Single Sensor Based on Functional Time Series Data Modeling
SU Benyue 1,2, ZHENG Dandan 1,2, SHENG Min 2,3
1. School of Computer and Information, Anqing Normal University, Anqing 246133
2. The University Key Laboratory of Intelligent Perception and Computing of Anhui Province, Anqing Normal University, Anqing 246133
3. School of Mathematics and Computational Science, Anqing Normal University, Anqing 246133
Abstract: In inertial sensor based human activity recognition, traditional algorithms often ignore the periodic and temporal characteristics of the data and place corresponding requirements on the size of the sliding window used for feature extraction. In this paper, a recognition algorithm for periodic daily behaviors based on functional data analysis and hidden Markov models is proposed, using a single wearable sensor placed on the waist. Firstly, the functional data analysis method is used to fit the motion capture data of periodic daily activities, and single-cycle data are extracted from the fitted curves. Secondly, based on the single-cycle behavior data, a hidden Markov model describing each daily behavior process is established. Finally, human activities are classified by maximum likelihood. Compared with multi-sensor human activity recognition methods, the proposed method effectively classifies 8 daily activities with a single sensor and achieves high recognition rates in both user-dependent and user-independent modes.
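To make the pipeline described in the abstract concrete, the sketch below outlines the main steps in Python: fitting each sensor channel of a periodic recording with a small Fourier basis (a functional data fit), cutting out a single cycle, training one Gaussian HMM per daily activity, and classifying a new cycle by maximum likelihood. The basis size, the minima-based cycle cut, the number of hidden states, and the use of the hmmlearn library are illustrative assumptions for this sketch, not the authors' implementation.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM  # assumed third-party HMM library, not the authors' code


def fit_fourier_basis(data, n_harmonics=4):
    """Smooth every sensor channel with a truncated Fourier basis (functional data fit).

    data: array of shape (n_frames, n_channels); the basis size is an illustrative choice.
    """
    t = np.linspace(0.0, 1.0, data.shape[0])
    cols = [np.ones_like(t)]
    for k in range(1, n_harmonics + 1):
        cols.append(np.sin(2 * np.pi * k * t))
        cols.append(np.cos(2 * np.pi * k * t))
    basis = np.column_stack(cols)                      # (n_frames, 2*n_harmonics + 1)
    coef, *_ = np.linalg.lstsq(basis, data, rcond=None)
    return basis @ coef                                # smoothed curves, same shape as data


def extract_single_cycle(smooth, ref_channel=0):
    """Cut one period out of the fitted curves, using successive minima of a reference channel."""
    s = smooth[:, ref_channel]
    minima = [i for i in range(1, len(s) - 1) if s[i] < s[i - 1] and s[i] < s[i + 1]]
    if len(minima) < 2:
        return smooth                                  # fallback: no clear period found
    return smooth[minima[0]:minima[1]]


def train_activity_hmms(cycles_per_activity, n_states=4):
    """Train one Gaussian HMM per daily activity from its single-cycle samples."""
    models = {}
    for label, cycles in cycles_per_activity.items():
        X = np.vstack(cycles)                          # stack frames of all cycles row-wise
        lengths = [len(c) for c in cycles]             # frame count of each cycle
        models[label] = GaussianHMM(n_components=n_states, covariance_type="diag",
                                    n_iter=50).fit(X, lengths=lengths)
    return models


def classify(cycle, models):
    """Assign the activity whose HMM yields the highest log-likelihood for the cycle."""
    return max(models, key=lambda label: models[label].score(cycle))
```

In use, each labeled recording would be passed through fit_fourier_basis and extract_single_cycle before training, and a new recording is processed the same way before being handed to classify.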
SU Benyue, ZHENG Dandan, SHENG Min. Daily Behavior Recognition with Single Sensor Based on Functional Time Series Data Modeling. Pattern Recognition and Artificial Intelligence, 2018, 31(7): 653-661.