Assistant robots are widely used in laparoscopic surgery to facilitate the camera holding and manipulation task. A variety of hands-free operator interfaces have been implemented for user control of these robots, including voice commands, foot pedals, and eye and head motion tracking systems. This paper proposes a novel user control interface, based on processing of the laparoscopic images, that enables the robot to adjust the view of the laparoscopic camera automatically without disturbing the surgeon’s concentration. An effective marker-free detection method was investigated to track the instrument position in the laparoscopic images in real time so that the robot could center the instrument tip in the camera view. Among the methods considered, a color space analysis based on quantitative comparison of the color contexts of the background and instrument pixels provided the best results. The color contexts were represented by covariance matrices and mean values and analyzed using the Mahalanobis distance measure in RGB color space. Tests on laparoscopic images under controlled conditions, e.g., sufficient light and low noise, revealed 86 percent correct detection at a processing rate of 3.7 frames per second on a conventional PC. Further work is ongoing to improve the algorithm.
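The core of the color space analysis described above can be sketched as a Mahalanobis-distance pixel classifier: a mean vector and covariance matrix are estimated from sample instrument pixels in RGB space, and each image pixel is labeled as instrument or background by thresholding its distance to that model. The function name, sample values, and threshold below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def mahalanobis_mask(image, mean, cov, threshold=3.0):
    """Label a pixel as instrument when its Mahalanobis distance to the
    instrument color model (mean, cov) in RGB space is below threshold."""
    inv_cov = np.linalg.inv(cov)
    diff = image.reshape(-1, 3).astype(np.float64) - mean  # (N, 3)
    # Squared Mahalanobis distance for every pixel at once
    d2 = np.einsum('ij,jk,ik->i', diff, inv_cov, diff)
    return (d2 < threshold ** 2).reshape(image.shape[:2])

# Fit the color model from hypothetical sample instrument pixels
samples = np.array([[60, 60, 65], [55, 58, 62], [70, 72, 75]], dtype=np.float64)
mean = samples.mean(axis=0)
cov = np.cov(samples, rowvar=False) + 1e-3 * np.eye(3)  # regularize

# Classify a small synthetic image whose pixels match an instrument sample
img = np.full((4, 4, 3), [60, 60, 65], dtype=np.uint8)
mask = mahalanobis_mask(img, mean, cov)
```

In practice the same distance test would run per frame on the laparoscopic video, with the resulting mask used to locate the instrument tip for the camera-centering control loop.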
Marker-Free Detection of Instruments in Laparoscopic Images to Control a Cameraman Robot
Amini Khoiy, K, Mirbagheri, A, Farahmand, F, & Bagheri, S. "Marker-Free Detection of Instruments in Laparoscopic Images to Control a Cameraman Robot." Proceedings of the ASME 2010 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. Volume 3: 30th Computers and Information in Engineering Conference, Parts A and B. Montreal, Quebec, Canada. August 15–18, 2010. pp. 477-482. ASME. https://doi.org/10.1115/DETC2010-28452