Worker Behavior Interpretation for Flexible Production
Authors: Bastian Hartmann, Christoph Schauer, Norbert Link
Abstract:
This paper addresses the problem of recognizing and interpreting the behavior of human workers in industrial environments, with the goal of integrating humans into software-controlled manufacturing environments. We propose a generic concept from which solutions for task-related manual production applications can be derived; its flexible components make it largely independent of any particular problem or application. We instantiate the concept in a spot-welding scenario in which the behavior of a human worker is interpreted while the worker performs a welding task with a hand welding gun. Signals are acquired from inertial sensors, video cameras, and triggers, and atomic actions are recognized using pose data from a marker-based video tracking system and movement data from the inertial sensors. The recognized atomic actions are then analyzed at a higher evaluation level by a finite state machine.

Keywords: activity recognition, task modeling, marker-based video tracking, inertial sensors.
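The paper itself contains no source code; the following minimal Python sketch merely illustrates how a finite state machine could evaluate a stream of recognized atomic actions against a task model, in the spirit of the higher-level analysis described in the abstract. All states, action labels, and transitions below are hypothetical assumptions, not the authors' implementation.

# Illustrative finite state machine for task-level evaluation of
# recognized atomic actions. States, action labels, and transitions
# are hypothetical examples, not taken from the paper.

from enum import Enum, auto


class State(Enum):
    IDLE = auto()        # worker has not picked up the welding gun
    GUN_HELD = auto()    # gun picked up, not yet positioned
    POSITIONED = auto()  # gun positioned at a welding spot
    WELDING = auto()     # weld in progress
    DONE = auto()        # task completed


# Transition table: (current state, atomic action) -> next state.
# Any (state, action) pair missing from the table is treated as a
# deviation from the modeled task.
TRANSITIONS = {
    (State.IDLE, "PICK_GUN"): State.GUN_HELD,
    (State.GUN_HELD, "POSITION"): State.POSITIONED,
    (State.POSITIONED, "TRIGGER_WELD"): State.WELDING,
    (State.WELDING, "RELEASE_TRIGGER"): State.GUN_HELD,
    (State.GUN_HELD, "PUT_DOWN_GUN"): State.DONE,
}


def evaluate(actions):
    """Run recognized atomic actions through the FSM, reporting each
    state change and any action that violates the task model."""
    state = State.IDLE
    for action in actions:
        nxt = TRANSITIONS.get((state, action))
        if nxt is None:
            print(f"deviation: {action!r} not allowed in {state.name}")
            continue
        print(f"{state.name} --{action}--> {nxt.name}")
        state = nxt
    return state


if __name__ == "__main__":
    # One welding spot performed correctly, then the gun is put down.
    final = evaluate(["PICK_GUN", "POSITION", "TRIGGER_WELD",
                      "RELEASE_TRIGGER", "PUT_DOWN_GUN"])
    print("final state:", final.name)

Keeping the task model in a declarative transition table means a different welding sequence can be modeled by swapping the table rather than changing the evaluation logic.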
Digital Object Identifier (DOI): https://doi.org/10.5281/zenodo.1076380
References:
[1] J. K. Aggarwal, and Q. Cai, "Human Motion Analysis: A Review," Comput. Vis. Image Underst., 73, no. 3, 1999, 428-440.
[2] D. Ayers, and M. Shah, "Monitoring human behavior from video taken in an office environment," Image and Vision Comput., 19, no. 12, 2001, 833-846.
[3] C. Colombo, D. Comanducci, and A. Del Bimbo, "Behavior monitoring through automatic analysis of video sequences," Proc. ACM International Conference on Image and Video Retrieval (CIVR), 2007, 288-293.
[4] D. M. Gavrila, "The visual analysis of human movement: a survey," Comput. Vis. Image Underst., 73, no. 1, 1999, 82-98.
[5] D. Harel, "Statecharts: A visual formalism for complex systems," Sci. Comput. Program. 8, no. 3, 1987, 231-274.
[6] H. Kato, and M. Billinghurst, "Marker Tracking and HMD Calibration for a Video-based Augmented Reality Conferencing System," Proc. of 2nd IEEE and ACM International Workshop on Augmented Reality (IWAR), 1999.
[7] R. C. Luo, C.-C. Yih, and K. L. Su, "Multisensor fusion and integration: approaches, applications, and future research directions," Sensors Journal, IEEE 2, no. 2, 2002, 107-119.
[8] T. B. Moeslund, A. Hilton, and V. Krüger, "A survey of advances in vision-based human motion capture and analysis," Comput. Vis. Image Underst., 104, no. 2, 2006, 90-126.
[9] N. T. Nguyen, H. H. Bui, S. Venkatesh, and G. West, "Recognizing and monitoring high-level behaviors in complex spatial environments," Proc. of the IEEE Comp. Society Conference on Computer Vision and Pattern Recognition, 2003, vol. 2, II-620-II-625.
[10] L. Ojeda, and J. Borenstein, "Non-GPS Navigation for Emergency Responders," Int. Joint Topical Meeting: Sharing Solutions for Emergencies and Hazardous Environments, 2006, 12-15.
[11] N. Parnian, and M. F. Golnaraghi, "Integration of vision and inertial sensors for industrial tools tracking," Sensor Review 27, 2007, 132-141.
[12] N. Robertson, and I. Reid, "Behaviour understanding in video: a combined method," Computer Vision, ICCV 2005. Tenth IEEE International Conference on 1, 2005, 808-815.
[13] C. Schauer, "Verfolgung von Freiheitsgraden eines Werkzeugs aus Videodaten" [Tracking a tool's degrees of freedom from video data], Diploma Thesis, University of Applied Sciences, 2007.
[14] D. Shin, R. A. Wysk, and L. Rothrock, "An investigation of a human material handler on part flow in automated manufacturing systems," Systems, Man and Cybernetics, Part A, IEEE Transactions on 36, no. 1, 2006, 123-135.
[15] T. Stiefmeier, G. Ogris, H. Junker, P. Lukowicz, and G. Tröster, "Combining Motion Sensors and Ultrasonic Hands Tracking for Continuous Activity Recognition in a Maintenance Scenario," Wearable Computers, 2006 10th IEEE International Symposium on, 2006, 97-104.
[16] T. Stiefmeier, D. Roggen, G. Tröster, G. Ogris, and P. Lukowicz, "Wearable Activity Tracking in Car Manufacturing," Pervasive Computing, IEEE 7, no. 2, 2008, 42-50.
[17] D. Vlasic, R. Adelsberger, G. Vannucci, J. Barnwell, M. Gross, W. Matusik, and J. Popović, "Practical motion capture in everyday surroundings," ACM Trans. Graph., 26, no. 3, 2007, 35:1-35:9.
[18] J. A. Ward, P. Lukowicz, G. Tröster, and T. E. Starner, "Activity Recognition of Assembly Tasks Using Body-Worn Microphones and Accelerometers," Pattern Analysis and Machine Intelligence, IEEE Transactions on 28, no. 10, 2006, 1553-1567.
[19] L. Bao, and S. S. Intille, "Activity Recognition from User-Annotated Acceleration Data," Proceedings of the 2nd International Conference on Pervasive Computing, 2004, 1-17.
[20] M. H. Ko, G. West, S. Venkatesh, and M. Kumar, "Using dynamic time warping for online temporal fusion in multisensor systems," Information Fusion, 9, no. 3, 2008, 370-388.
[21] P. Zappi, T. Stiefmeier, E. Farella, D. Roggen, L. Benini, and G. Tröster, "Activity recognition from on-body sensors by classifier fusion: sensor scalability and robustness," Intelligent Sensors, Sensor Networks and Information Processing (ISSNIP 2007), 3rd International Conference on, 2007, 281-286.