Personalizing Human Physical Life Routines Recognition over Cloud-Based Sensor Data Via Machine Learning

Authors: Kaushik Sathupadi, Sandesh Achar

Abstract:

Pervasive computing is a growing research field that aims to recognize human physical life routines (HPLR) from body-worn sensors such as MEMS (Micro-Electro-Mechanical Systems)-based technologies. The use of these technologies for human activity recognition is steadily increasing, and personalizing human life routines with machine-learning techniques remains an active research topic. Although various methods have demonstrated the ability to recognize basic movement patterns, they remain limited in anticipating the dynamics of human living patterns. This study presents state-of-the-art techniques for recognizing static and dynamic patterns and for forecasting such challenging activities from multi-fused sensors. MEMS signals are extracted from one self-annotated dataset (IM-WSHA) and two benchmark datasets (HARTH and KU-HAR). First, the raw data are processed with z-normalization and denoising methods. Then, features are extracted from different domains using statistical measures, local binary patterns, auto-regressive model coefficients, and intrinsic time-scale decomposition. Next, the acquired features are optimized using maximum relevance and minimum redundancy (mRMR) selection. Finally, an artificial neural network is applied to evaluate the whole system's performance. As a result, we attained a 90.27% recognition rate on the self-annotated dataset, 83% on nine living activities in HARTH, and 90.94% on 18 static and dynamic routines in KU-HAR. Thus, the proposed HPLR system outperformed other state-of-the-art systems when evaluated against methods in the literature.

Keywords: Artificial intelligence, machine learning, gait analysis, local binary pattern, statistical features, micro-electro-mechanical systems, maximum relevance and minimum redundancy.
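To make the pipeline described in the abstract concrete, the following is a minimal, hypothetical Python sketch, not the authors' implementation: it z-normalizes windowed signals, computes a few statistical features plus a lag-1 autocorrelation (auto-regressive-flavoured) term, approximates mRMR with a relevance-only mutual-information ranking (full mRMR additionally penalizes redundancy among already-selected features), and trains a small scikit-learn MLP as the artificial neural network. The window length, feature set, network sizes, and the synthetic stand-in data are all illustrative assumptions.

import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

def z_normalize(signal):
    """Zero-mean, unit-variance scaling of a 1-D sensor signal."""
    return (signal - signal.mean()) / (signal.std() + 1e-8)

def window_features(window):
    """A few statistical features plus a lag-1 autocorrelation term."""
    diffs = np.diff(window)
    ar1 = np.corrcoef(window[:-1], window[1:])[0, 1]  # lag-1 autocorrelation
    return np.array([window.mean(), window.std(), window.min(),
                     window.max(), np.abs(diffs).mean(), ar1])

def extract_features(signals, win=128):
    """Slide a fixed-length, non-overlapping window over each normalized signal."""
    feats = []
    for sig in signals:
        sig = z_normalize(sig)
        wins = [sig[i:i + win] for i in range(0, len(sig) - win + 1, win)]
        feats.append(np.concatenate([window_features(w) for w in wins]))
    return np.array(feats)

# Synthetic stand-in data: 200 recordings, 512 samples each, 6 activity classes.
rng = np.random.default_rng(0)
X_raw = [rng.normal(size=512) + (i % 6) * 0.3 for i in range(200)]
y = np.array([i % 6 for i in range(200)])

X = extract_features(X_raw)

# Relevance-only approximation of mRMR: keep the k features with the highest
# mutual information with the labels (true mRMR also minimizes redundancy).
mi = mutual_info_classif(X, y, random_state=0)
top_k = np.argsort(mi)[::-1][:12]
X_sel = X[:, top_k]

# A small feed-forward ANN evaluates the selected feature set.
X_tr, X_te, y_tr, y_te = train_test_split(X_sel, y, test_size=0.3, random_state=0)
ann = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
ann.fit(X_tr, y_tr)
print(f"Recognition rate on held-out samples: {ann.score(X_te, y_te):.2%}")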


References:


[1] Feng, Z.; Mo, L.; Li, M. A Random Forest-based ensemble method for activity recognition. In Proceedings of the 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015; pp. 5074–5077.
[2] Yang, D.; Huang, J.; Tu, X.; Ding, G.; Shen, T.; Xiao, X. A wearable activity recognition device using air-pressure and IMU sensors. IEEE Access 2018, 7, 6611–6621.
[3] Uddin, M.T.; Uddin, M.A. A Guided Random Forest based Feature Selection Approach for Activity Recognition. In Proceedings of the 2nd International Conference on Electrical Engineering and Information & Communication Technology (ICEEICT), Dhaka, Bangladesh, 21–23 May 2015.
[4] Sabatini, A.M. Estimating Three-Dimensional Orientation of Human Body Parts by Inertial/Magnetic Sensing. Sensors 2011, 11, 1489–1525.
[5] Rodriguez, C.; Fernando, B.; Li, H. Action Anticipation by Predicting Future Dynamic Images. In Proceedings of the European Conference on Computer Vision (ECCV), Munich, Germany, 8–14 September 2018.
[6] Gholami, S.; Noori, M. You Don’t Need Labeled Data for Open-Book Question Answering. Appl. Sci. 2022, 12, 111.
[7] Casson, A.J.; Galvez, A.V.; Jarchi, D. Gyroscope vs. accelerometer measurements of motion from wrist PPG during physical exercise. ICT Express 2016, 2, 175–179.
[8] Mustafa, Z.; Nsour, H.; Tahir, S.B.U.D. Hand gesture recognition via deep data optimization and 3D reconstruction. PeerJ Comput. Sci. 2023, 9, e1619.
[9] Wang, J.; Xu, Z. Spatio-temporal texture modelling for real-time crowd anomaly detection. Comput. Vis. Image Underst. 2016, 144, 177–187.
[10] Wang, K.; He, J.; Zhang, L. Attention-based Convolutional Neural Network for Weakly Labeled Human Activities Recognition with Wearable Sensors. IEEE Sens. J. 2019, 19, 7598–7604.
[11] Xiao, Q.; Song, R. Human motion retrieval based on statistical learning and bayesian fusion. PLoS ONE 2016, 11, e0164610.
[12] Bhargavi, D.; Coyotl, E.P.; Gholami, S. Knock, knock. Who's there? Identifying football player jersey numbers with synthetic data. arXiv 2022, arXiv:2203.00734.
[13] Lara, O.D.; Labrador, M.A. A survey on human activity recognition using wearable sensors. IEEE Commun. Surveys Tutor. 2013, 15, 1192–1209.
[14] Chen, L.; Hoey, J.; Nugent, C.D.; Cook, D.J.; Yu, Z. Sensor-based activity recognition. IEEE Trans. Syst. Man Cybern. Part C Appl. Rev. 2012, 42, 790–808.
[15] Ni, B.; Wang, G.; Moulin, P. RGBD-HuDaAct: A Color-Depth Video Database for Human Daily Activity Recognition. In Consumer Depth Cameras for Computer Vision; Springer: London, UK, 2013; pp. 193–208.
[16] Crispim-Junior, C.F.; Gómez Uría, A.; Strumia, C.; Koperski, M.; König, A.; Negin, F.; Cosar, S.; Nghiem, A.T.; Chau, D.P.; Charpiat, G.; Bremond, F. Online Recognition of Daily Activities by Color-Depth Sensing and Knowledge Models. Sensors 2017, 17, 1528.
[17] Wu, H.; Ma, X.; Zhang, Z.; Wang, H.; Li, Y. Collecting public RGB-D datasets for human daily activity recognition. Int. J. Adv. Robot. Syst. 2017, 14.
[18] Gu, Y.; Ye, X.; Sheng, W. Depth MHI Based Deep Learning Model for Human Action Recognition. In Proceedings of the 2018 13th World Congress on Intelligent Control and Automation (WCICA), Changsha, China, 4–8 July 2018; pp. 395–400.
[19] Sharif, M.; Khan, M.A.; Akram, T.; Younus, M.J.; Saba, T.; Rehman, A. A framework of human detection and action recognition based on uniform segmentation and combination of Euclidean distance and joint entropy-based features selection. EURASIP J. Image Video Process. 2017, 2017, 89.
[20] Wu, H.; Pan, W.; Xiong, X.; Xu, S. Human activity recognition based on the combined SVM and HMM. In Proceedings of the 2014 IEEE International Conference on Information and Automation (ICIA), Hailar, China, 28–30 July 2014; pp. 219–224.
[21] Ahmed, N.; Rafiq, J.I.; Islam, M.R. Enhanced Human Activity Recognition Based on Smartphone Sensor Data Using Hybrid Feature Selection Model. Sensors 2020, 20, 317.
[22] Elkerdawi, S.M.; Sayed, R.; ElHelw, M. Real-time vehicle detection and tracking using Haar-like features and compressive tracking. In Proceedings of the ROBOT2013: First Iberian Robotics Conference; 2014; pp. 381–390.
[23] Miller, N.; Thomas, M.A.; Eichel, J.A.; Mishra, A. A hidden Markov model for vehicle detection and counting. In Proceedings of the 2015 12th Conference on Computer and Robot Vision; 2015; pp. 269–276.
[24] Sun, D.; Watada, J. Detecting pedestrians and vehicles in traffic scene based on boosted HOG features and SVM. In Proceedings of the 2015 IEEE 9th International Symposium on Intelligent Signal Processing (WISP); 2015; pp. 1–4.
[25] Laopracha, N.; Sunat, K.; Chiewchanwattana, S. A novel feature selection in vehicle detection through the selection of dominant patterns of histograms of oriented gradients (DPHOG). IEEE Access 2019, 7, 20894–20919.
[26] Girshick, R.; Donahue, J.; Darrell, T.; Malik, J. Rich feature hierarchies for accurate object detection and semantic segmentation. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR); 2014; pp. 580–587.
[27] Ramasamy Ramamurthy, S.; Roy, N. Recent trends in machine learning for human activity recognition—A survey. WIREs Data Min. Knowl. Discov. 2018, 8, e1254.
[28] Peng, H.; Long, F.; Ding, C. Feature selection based on mutual information: Criteria of max-dependency, max-relevance, and min-redundancy. IEEE Trans. Pattern Anal. Mach. Intell. 2005, 27, 1226–1238.
[29] Abdolrasol, M.G.M.; Hussain, S.M.S.; Ustun, T.S.; Sarker, M.R.; Hannan, M.A.; Mohamed, R.; Ali, J.A.; Mekhilef, S.; Milad, A. Artificial Neural Networks Based Optimization Techniques: A Review. Electronics 2021, 10, 2689.
[30] Tahir, S.B.U.D. Intelligent Media-Wearable Smart Home Activities (IM-WSHA). 2020. Available online: http://portals.au.edu.pk/imc/Pages/Datasets.aspx (accessed on 10 February 2022).
[31] Logacjov, A.; Bach, K.; Kongsvold, A.; Bårdstu, H.B.; Mork, P.J. HARTH: A Human Activity Recognition Dataset for Machine Learning. Sensors 2021, 21, 7853.
[32] Sikder, N.; Nahid, A.-A. KU-HAR: An open dataset for heterogeneous human activity recognition. Pattern Recognit. Lett. 2021, 146, 46–54.
[33] Shloul, T.; Javeed, M.; Gochoo, M.; Alsuhibany, S.; Ghadi, Y.; Jalal, A.; Park, J. Student’s Health Exercise Recognition Tool for E-Learning Education. Intell. Autom. Soft Comput. 2022, 35, 149–161.
[34] Ghadi, Y.Y.; et al. MS-DLD: Multi-Sensors Based Daily Locomotion Detection via Kinematic-Static Energy and Body-Specific HMMs. IEEE Access 2022, 10, 23964–23979.
[35] Webber, M.; Rojas, R.F. Human Activity Recognition with Accelerometer and Gyroscope: A Data Fusion Approach. IEEE Sens. J. 2021, 21, 16979–16989.
[36] Abid, M.H.; Nahid, A.-A. Two Unorthodox Aspects in Handcrafted-Feature Extraction for Human Activity Recognition Datasets. In Proceedings of the 2021 International Conference on Electronics, Communications and Information Technology (ICECIT), Khulna, Bangladesh, 14–16 September 2021; pp. 1–4.
[37] Subasi, A.; Radhwan, M.; Kurdi, R.; Khateeb, K. IoT based mobile healthcare system for human activity recognition. In Proceedings of the 2018 15th Learning and Technology Conference (L&T), February 2018; pp. 29–34.