Assistant robots are widely used in laparoscopic surgery to facilitate camera holding and manipulation. A variety of hands-free operator interfaces have been implemented for user control of these robots, including voice commands, foot pedals, and eye and head motion tracking systems. This paper proposes a novel user control interface, based on processing of the laparoscopic images, that enables the robot to automatically adjust the view of the laparoscopic camera without disturbing the surgeon's concentration. An effective marker-free detection method was investigated to track the instrument position in the laparoscopic images in real time so that the robot can center the instrument tip in the camera view. Of the several available methods considered, a color space analysis, based on a quantitative comparison of the color contexts of the background and instrument pixels, provided the best results. The color contexts were represented by covariance matrices and mean values and analyzed using the Mahalanobis distance measure in RGB color space. Tests on laparoscopic images under controlled conditions, i.e., sufficient light and low noise, yielded 86% correct detection at a processing rate of 3.7 frames per second on a conventional PC. Further work is underway to improve the algorithm.
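The classification step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes each class (instrument, background) is summarized by the mean and covariance of its training pixels in RGB space, and a pixel is assigned to the class with the smaller Mahalanobis distance. All function and variable names are hypothetical.

```python
import numpy as np

def mahalanobis_classifier(instrument_pixels, background_pixels):
    """Build a per-pixel classifier from (N, 3) arrays of RGB training pixels.

    Each class's color context is captured by its mean vector and the
    inverse of its 3x3 covariance matrix, as the abstract describes.
    """
    stats = []
    for pixels in (instrument_pixels, background_pixels):
        mean = pixels.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(pixels, rowvar=False))
        stats.append((mean, cov_inv))

    def classify(pixel):
        # Squared Mahalanobis distance d^2 = (x - mu)^T Sigma^{-1} (x - mu)
        # to each class; return 0 for instrument, 1 for background.
        d2 = [(pixel - m) @ ci @ (pixel - m) for m, ci in stats]
        return 0 if d2[0] < d2[1] else 1

    return classify
```

In practice such a classifier would be applied to every pixel (or a subsampled grid) of each frame, with the instrument tip estimated from the resulting binary mask; using the covariance, rather than a plain Euclidean distance, lets the decision account for correlated color variation within each class.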
