COMPASS

Publications

Presentation at the IEEE International Conference on Robotics and Automation (ICRA) 2022

  • R. Hartwig, D. Ostler, J.-C. Rosenthal, H. Feussner, D. Wilhelm, and D. Wollherr, "Constrained Visual-Inertial Localization With Application and Benchmark in Laparoscopic Surgery," IEEE International Conference on Robotics and Automation (ICRA), 2022. ArXiv preprint available (pdf).
  • R. Hartwig, D. Ostler, J.-C. Rosenthal, H. Feussner, D. Wilhelm, and D. Wollherr, "MITI: SLAM Benchmark for Laparoscopic Surgery," TUM, 2021. [Online]. Available: https://mediatum.ub.tum.de/1621941 (pdf).

The BMBF-funded COMPASS project is developing new assistance functions for minimally invasive surgery. Stereoscopic video data will be converted into depth maps and spatially localized. For this purpose, we implement a real-time video processing chain and a real-time spatial localization algorithm. We use different sensor modalities, such as an inertial measurement unit (IMU) and optical tracking, to guarantee localization, but also aim to use visual odometry and SLAM. In addition, we collect datasets for machine learning applications and train networks for semantic understanding of the environment. In the project, we work closely with surgeons in a use-case-focused manner to enhance the surgical procedure and its documentation. One specific goal is to develop assistance functions such as the recognition and classification of peritoneal carcinomatosis by means of machine learning and its mapping in the abdominal cavity. The project involves four companies, four research institutes, and two university hospitals.
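
To illustrate the stereo-to-depth step, the following sketch uses OpenCV's semi-global block matching to turn a rectified stereo pair into a metric depth map. The calibration values and matcher settings are placeholder assumptions for illustration, not parameters from the project pipeline.

    import numpy as np
    import cv2

    # Hypothetical calibration values; the real pipeline would use the
    # laparoscope's stereo calibration (focal length in pixels, baseline in metres).
    FX_PIXELS = 800.0
    BASELINE_M = 0.004

    def depth_from_stereo(left_gray: np.ndarray, right_gray: np.ndarray) -> np.ndarray:
        """Compute a depth map (in metres) from a rectified grayscale stereo pair."""
        matcher = cv2.StereoSGBM_create(
            minDisparity=0,
            numDisparities=64,   # must be divisible by 16
            blockSize=5,
        )
        # StereoSGBM returns fixed-point disparities scaled by 16.
        disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
        depth = np.full_like(disparity, np.nan)
        valid = disparity > 0
        depth[valid] = FX_PIXELS * BASELINE_M / disparity[valid]  # z = f * b / d
        return depth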

At MITI, a modified SLAM algorithm was developed that incorporates movement constraints of the camera as well as IMU measurements, enabling localization of the camera entirely without optical tracking.
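
A typical movement constraint in laparoscopy is that the instrument shaft pivots around the trocar insertion point. The sketch below shows one way such a constraint could be expressed as a residual for an optimizer; the pose convention, the trocar point, and the function name are illustrative assumptions, not the project's actual formulation.

    import numpy as np

    def trocar_residual(R_wc: np.ndarray, t_wc: np.ndarray,
                        trocar_w: np.ndarray) -> np.ndarray:
        """Lateral offset of an (assumed) trocar point from the camera's optical axis.

        R_wc, t_wc: camera-to-world rotation (3x3) and camera position (3,).
        trocar_w:   hypothetical trocar position in world coordinates (3,).
        A constraint-aware optimizer would drive this residual towards zero.
        """
        # Transform the trocar point into the camera frame.
        p_c = R_wc.T @ (trocar_w - t_wc)
        # Taking the shaft direction as the camera z-axis, only the x/y
        # components of p_c measure deviation from the pivot constraint.
        return p_c[:2]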

In the subsequent video sequence, the camera motion is reconstructed from stereo-matched and tracked 2D feature points. In addition, an optimization algorithm iteratively minimizes the residuals arising from the camera measurements, the movement constraints, and the IMU measurements.
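
Schematically, such a tightly coupled optimization minimizes a sum of squared, covariance-weighted residual terms over the estimated states. The expression below is a generic factor-graph-style cost written for illustration, not the exact objective from the paper:

    \min_{\mathcal{X}} \;
      \sum_{i} \big\| r_{\text{cam},i}(\mathcal{X}) \big\|^{2}_{\Sigma_{\text{cam}}}
    + \sum_{j} \big\| r_{\text{IMU},j}(\mathcal{X}) \big\|^{2}_{\Sigma_{\text{IMU}}}
    + \sum_{k} \big\| r_{\text{constr},k}(\mathcal{X}) \big\|^{2}_{\Sigma_{\text{constr}}}

Here \mathcal{X} collects the camera poses (and, depending on the formulation, velocities, IMU biases, and landmarks), r_cam are the reprojection residuals of the stereo-matched and tracked 2D features, r_IMU the residuals of the integrated inertial measurements, and r_constr the residuals of the movement constraints.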

The green lines show the laparoscope as localized by the optical tracking system, serving as ground-truth reference. In contrast, the red lines show the SLAM localization, and the light blue lines are the poses obtained by pure angular-velocity integration combined with the movement constraints.
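
For comparison, pure angular-velocity integration propagates the orientation from the gyroscope readings alone. A minimal quaternion-based sketch of that step (variable names are illustrative, and the movement constraints are omitted here) looks like this:

    import numpy as np

    def integrate_gyro(q_wb: np.ndarray, omega_b: np.ndarray, dt: float) -> np.ndarray:
        """Propagate a world-from-body quaternion (w, x, y, z) by one gyro sample.

        omega_b: angular velocity in the body frame (rad/s); dt: sample period (s).
        """
        angle = np.linalg.norm(omega_b) * dt
        if angle < 1e-12:
            return q_wb
        axis = omega_b / np.linalg.norm(omega_b)
        dq = np.concatenate(([np.cos(angle / 2.0)], np.sin(angle / 2.0) * axis))
        # Hamilton product q_wb * dq (rotation increment applied in the body frame).
        w1, x1, y1, z1 = q_wb
        w2, x2, y2, z2 = dq
        q = np.array([
            w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2,
        ])
        return q / np.linalg.norm(q)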


Contact

Regine Hartwig, M.Sc.