Proprioceptive Neuromuscular Facilitation Exercises of Upper Extremities Assessment Using Microsoft Kinect Sensor and Color Marker in a Virtual Reality Environment

Authors: M. Owlia, M. H. Azarsa, M. Khabbazan, A. Mirbagheri

Abstract:

Proprioceptive neuromuscular facilitation (PNF) exercises are a series of stretching techniques commonly used in rehabilitation and exercise therapy. Assessing whether these exercises are performed with correct maneuvers requires extensive experience in the field and cannot be done by patients themselves. In this paper, we developed software that uses the Microsoft Kinect sensor, a spherical color marker, and real-time image processing methods to evaluate a patient's performance in generating correct movement patterns. The software also provides the patient with visual feedback by showing his/her avatar in a virtual reality environment along with the correct movement path of the hand, wrist, and marker. Preliminary results from PNF exercise therapy of a patient in a room environment show that the system can identify any deviation of the hand's path and direction from the reference maneuver performed by an expert physician.
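The deviation assessment described above can be illustrated with a minimal sketch: given a recorded hand/marker trajectory and an expert reference path, a per-sample Euclidean distance flags points that stray beyond a tolerance. The function names, the time-aligned-sampling assumption, and the 5 cm tolerance below are illustrative assumptions, not the authors' implementation.

```python
import math

def path_deviation(patient_path, reference_path):
    """Per-sample Euclidean distance between a recorded hand/marker
    trajectory and an expert reference path. Both are lists of
    (x, y, z) points, assumed time-aligned and equally sampled."""
    return [math.dist(p, r) for p, r in zip(patient_path, reference_path)]

def flag_deviations(patient_path, reference_path, tolerance_m=0.05):
    """Indices where the hand strays more than tolerance_m metres
    from the reference path (tolerance is an illustrative value)."""
    distances = path_deviation(patient_path, reference_path)
    return [i for i, d in enumerate(distances) if d > tolerance_m]

# Example: a diagonal PNF-like reach vs. an attempt that drifts
# at the middle sample.
reference = [(0.0, 0.0, 0.0), (0.1, 0.10, 0.1), (0.2, 0.2, 0.2)]
attempt   = [(0.0, 0.0, 0.0), (0.1, 0.18, 0.1), (0.2, 0.2, 0.2)]
print(flag_deviations(attempt, reference))  # → [1]
```

A real system would first extract the marker position per frame (e.g. by color segmentation of the Kinect's RGB stream) and align trajectories in time before comparing them; this sketch only covers the final comparison step.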

Keywords: Image processing, Microsoft Kinect, proprioceptive neuromuscular facilitation, upper extremities assessment, virtual reality.

Digital Object Identifier (DOI): https://doi.org/10.5281/zenodo.1127567

