The Visual Inspection of Surgical Tasks Using Machine Vision: Applications to Robotic Surgery

Authors: M. Ovinis, D. Kerr, K. Bouazza-Marouf, M. Vloeberghs

Abstract:

In this paper, the feasibility of using machine vision to assess task completion in a surgical intervention is investigated, with the aim of incorporating vision-based inspection into robotic surgery systems. The visually rich operative field provides a favourable environment for developing automated visual inspection techniques for these systems, enabling a more comprehensive assessment of each surgical task. As a proof of concept, machine vision techniques were used to distinguish the two possible outcomes, satisfactory or unsatisfactory, of three primary surgical tasks involved in creating a burr hole in the skull, namely incision, retraction, and drilling. Encouraging results were obtained for all three tasks in experiments on cadaveric pig heads. These findings suggest that machine vision could be used to verify successful task completion in robotic surgery systems. Finally, the potential of using machine vision in the operating theatre, and the challenges that must be addressed, are identified and discussed.
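As an illustration of the kind of binary assessment described above, the sketch below classifies an image of the operative site as satisfactory or unsatisfactory from simple colour and texture features. It is a minimal, hypothetical example rather than the authors' implementation: the CIELAB colour statistics, the gradient-based texture cue, the logistic-regression classifier, and all function names are illustrative assumptions.

```python
# Hypothetical sketch: classify a surgical-task image as satisfactory or
# unsatisfactory from simple colour and texture features.  Not the
# authors' method; feature choices and names are illustrative only.
import cv2
import numpy as np
from sklearn.linear_model import LogisticRegression

def extract_features(bgr_image):
    """Colour and texture statistics for one operative-site image."""
    lab = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2LAB)
    pixels = lab.reshape(-1, 3).astype(np.float32)
    # Mean and spread of the L*, a*, b* channels (colour cue)
    colour_stats = np.concatenate([pixels.mean(axis=0), pixels.std(axis=0)])
    # Crude texture cue: average gradient magnitude of the lightness channel
    gx = cv2.Sobel(lab[:, :, 0], cv2.CV_32F, 1, 0)
    gy = cv2.Sobel(lab[:, :, 0], cv2.CV_32F, 0, 1)
    texture = np.mean(np.hypot(gx, gy))
    return np.append(colour_stats, texture)

def train_outcome_classifier(images, labels):
    """Fit a two-class model; labels: 1 = satisfactory, 0 = unsatisfactory."""
    X = np.vstack([extract_features(img) for img in images])
    clf = LogisticRegression(max_iter=1000)
    clf.fit(X, labels)
    return clf

def assess_task(clf, image):
    """Return the predicted outcome for a new image of the operative site."""
    outcome = clf.predict([extract_features(image)])[0]
    return "satisfactory" if outcome == 1 else "unsatisfactory"
```

In practice, a separate model with task-specific features would likely be needed for each of the three tasks (incision, retraction, and drilling), since a satisfactory outcome looks different in each case.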

Keywords: Machine vision, robotic surgery, visual inspection.

Digital Object Identifier (DOI): https://doi.org/10.5281/zenodo.1336080

