Classification Algorithms in Human Activity Recognition using Smartphones

Authors: Mohd Fikri Azli bin Abdullah, Ali Fahmi Perwira Negara, Md. Shohel Sayeed, Deok-Jai Choi, Kalaiarasi Sonai Muthu

Abstract:

Rapid advancement in computing technology is bringing computers and humans toward seamless integration. The emergence of the smartphone has driven the computing era towards ubiquitous and pervasive computing. Recognizing human activity has garnered considerable interest and has raised significant research concerns in identifying contextual information useful for human activity recognition. Besides being unobtrusive to users in daily life, the smartphone has embedded built-in sensors capable of sensing contextual information about its users, supported by a wide range of network connections. In this paper, we discuss the classification algorithms used in smartphone-based human activity recognition. Existing technologies pertaining to smartphone-based research in human activity recognition are highlighted and discussed. Our paper also presents our findings and opinions to formulate improvement ideas for current research trends. Understanding research trends will enable researchers to have a clearer research direction and a common vision of the latest smartphone-based human activity recognition area.
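
The paper surveys classification algorithms rather than prescribing one. As a rough illustration of the kind of pipeline such algorithms operate on, the sketch below (not taken from the paper; the window length, feature set, activity labels, and nearest-centroid rule are assumptions made for illustration) extracts simple statistical features from tri-axial accelerometer windows and assigns an activity label by distance to per-class feature centroids.

```python
# Minimal sketch (illustrative only, not the authors' method): windowed feature
# extraction from tri-axial accelerometer samples, followed by a simple
# nearest-centroid classifier. Window size, features, and labels are assumptions.
import numpy as np

def extract_features(window):
    """window: (n_samples, 3) array of x, y, z accelerometer readings."""
    magnitude = np.linalg.norm(window, axis=1)           # per-sample magnitude
    return np.concatenate([
        window.mean(axis=0),                             # mean per axis
        window.std(axis=0),                              # standard deviation per axis
        [magnitude.mean(), magnitude.std()],             # magnitude statistics
    ])

def train_centroids(windows, labels):
    """Average the feature vectors belonging to each activity label."""
    feats = np.array([extract_features(w) for w in windows])
    labels = np.array(labels)
    return {lab: feats[labels == lab].mean(axis=0) for lab in set(labels)}

def classify(window, centroids):
    """Assign the activity whose centroid is closest in feature space."""
    f = extract_features(window)
    return min(centroids, key=lambda lab: np.linalg.norm(f - centroids[lab]))

# Toy usage with synthetic 50-sample windows (e.g., about 1 s at 50 Hz).
rng = np.random.default_rng(0)
still = rng.normal([0, 0, 9.8], 0.05, (50, 3))           # low-variance "standing"
walk = rng.normal([0, 0, 9.8], 2.0, (50, 3))             # high-variance "walking"
centroids = train_centroids([still, walk], ["standing", "walking"])
print(classify(rng.normal([0, 0, 9.8], 1.8, (50, 3)), centroids))  # likely "walking"
```

In practice, surveyed systems swap in richer features (frequency-domain statistics, autoregressive coefficients) and stronger classifiers (decision trees, k-NN, SVMs, hidden Markov models, neural networks); the sketch only fixes the shared structure of windowing, feature extraction, and label assignment.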

Keywords: Classification algorithms, Human Activity Recognition (HAR), Smartphones

Digital Object Identifier (DOI): doi.org/10.5281/zenodo.1071041

