Improving Activity Recognition Classification of Repetitious Beginner Swimming Using a 2-Step Peak/Valley Segmentation Method with Smoothing and Resampling for Machine Learning

Authors: Larry Powell, Seth Polsley, Drew Casey, Tracy Hammond

Abstract:

Human activity recognition (HAR) systems have shown strong performance when recognizing repetitive activities such as walking, running, and sleeping. Water-based activities are a relatively new area for activity recognition. However, water-based activity recognition has largely focused on supporting elite and competitive swimmers, who already have excellent coordination and proper form. Beginner swimmers do not, and activity recognition needs to capture their individual motions in order to help them. Activity recognition algorithms are traditionally built around short, fixed-length segments of timed sensor data. Using a time window as input can cause performance issues in the machine learning model: the window can be too small or too large, requiring careful tuning and precise data segmentation. In this work, we present a method that uses a time window as the initial segmentation and then further separates the data based on changes in the sensor values. Our system uses a multi-phase segmentation method that extracts all peaks and valleys for each axis of an accelerometer placed on the swimmer's lower back. This results in high recognition performance under leave-one-subject-out validation in our study of 20 beginner swimmers, with the model optimized on our final dataset achieving an F-score of 0.95.
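The two-step pipeline described in the abstract can be summarized in a short sketch. The Python snippet below is a minimal illustration, assuming a 50 Hz lower-back accelerometer, a 3-second initial time window, a simple moving-average smoother, and a fixed resample length of 32 samples; these parameter values and function names are hypothetical stand-ins rather than the paper's exact implementation. It uses SciPy's find_peaks to cut each window at every peak and valley of a single axis, and NumPy interpolation to resample each resulting segment to a fixed length for a classifier.

# Sketch of the described pipeline (hypothetical parameters): an initial fixed
# time window, a second peak/valley split per accelerometer axis, moving-average
# smoothing, and resampling each segment to a fixed length for machine learning.
import numpy as np
from scipy.signal import find_peaks

def smooth(signal, width=5):
    """Moving-average smoothing (assumed width; the paper's value may differ)."""
    kernel = np.ones(width) / width
    return np.convolve(signal, kernel, mode="same")

def window_segments(data, rate_hz, window_s=3.0):
    """Step 1: coarse segmentation into fixed-length time windows."""
    step = int(rate_hz * window_s)
    return [data[i:i + step] for i in range(0, len(data) - step + 1, step)]

def peak_valley_segments(axis_signal):
    """Step 2: split a window at every peak and valley of one axis."""
    smoothed = smooth(axis_signal)
    peaks, _ = find_peaks(smoothed)
    valleys, _ = find_peaks(-smoothed)
    cuts = np.sort(np.concatenate(([0], peaks, valleys, [len(smoothed)])))
    return [smoothed[a:b] for a, b in zip(cuts[:-1], cuts[1:]) if b - a > 1]

def resample(segment, length=32):
    """Resample a variable-length segment to a fixed length for feature extraction."""
    old = np.linspace(0.0, 1.0, num=len(segment))
    new = np.linspace(0.0, 1.0, num=length)
    return np.interp(new, old, segment)

# Usage on synthetic data standing in for one accelerometer axis (lower back).
if __name__ == "__main__":
    rate_hz = 50
    t = np.arange(0, 30, 1.0 / rate_hz)
    axis_x = np.sin(2 * np.pi * 0.8 * t) + 0.1 * np.random.randn(t.size)
    features = [
        resample(seg)
        for window in window_segments(axis_x, rate_hz)
        for seg in peak_valley_segments(window)
    ]
    print(len(features), "fixed-length segments ready for a classifier")

Resampling each peak-to-valley segment to a fixed length is what allows variable-duration stroke motions to feed a conventional fixed-dimension feature vector or classifier.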

Keywords: Time window, peak/valley segmentation, feature extraction, beginner swimming, activity recognition.

