Augmented Reality: A New Tool To Improve Surgical Accuracy during Laparoscopic Partial Nephrectomy? Preliminary In Vitro and In Vivo Results


Teber D., Guven S., Simpfendörfer T., Baumhauer M., Güven E. O., Yencilek F., et al.

European Urology, vol.56, no.2, pp.332-338, 2009 (SCI-Expanded, Scopus)

  • Publication Type: Article / Full Article
  • Volume: 56 Issue: 2
  • Publication Date: 2009
  • DOI: 10.1016/j.eururo.2009.05.017
  • Journal Name: European Urology
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus
  • Page Numbers: pp.332-338
  • Keywords: Augmented Reality, Computer-Assisted Surgery, Laparoscopic Partial Nephrectomy, Soft Tissue Navigation
  • Hatay Mustafa Kemal University Affiliated: Yes

Abstract

Background: Use of an augmented reality (AR)-based soft tissue navigation system in urologic laparoscopic surgery is an evolving technique.

Objective: To evaluate a novel soft tissue navigation system developed to enhance the surgeon's perception and to provide decision-making guidance directly before initiation of kidney resection for laparoscopic partial nephrectomy (LPN).

Design, setting, and participants: Custom-designed navigation aids, a mobile C-arm capable of cone-beam imaging, and a standard personal computer were used. The feasibility and reproducibility of the inside-out tracking principle were evaluated in a porcine model with an artificially created intraparenchymal tumor in vitro. The same algorithm was then incorporated into clinical practice during LPN.

Interventions: Evaluation of a fully automated inside-out tracking system was repeated in exactly the same way for 10 different porcine renal units. Additionally, 10 patients underwent retroperitoneal LPN under manual AR guidance by one surgeon.

Measurements: Navigation errors and image-acquisition times were determined in vitro. Mean operative time, time to locate the tumor, and positive surgical margin status were assessed in vivo.

Results and limitations: The system navigated and superimposed the virtually created images and the real-time images with an error margin of only 0.5 mm, and fully automated initial image acquisition took 40 ms. Mean operative time was 165 min (range: 135-195 min), and mean time to locate the tumor was 20 min (range: 13-27 min). No case required conversion to open surgery. Definitive histology revealed tumor-free margins in all 10 cases.

Conclusions: This novel AR tracking system proved functional, with a reasonable margin of error and image-to-image registration time. Overlaying pre- or intraoperative imaging data on real-time videoendoscopic images will simplify laparoscopic procedures and increase their precision.
© 2009 European Association of Urology.
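The abstract reports that virtual and real-time images were superimposed with a 0.5 mm error margin, which implies a rigid registration step between imaging and camera coordinate frames. The paper does not describe its registration algorithm; as a purely illustrative sketch (not the authors' method), the snippet below uses the standard Kabsch algorithm to align two hypothetical marker point sets and compute the residual root-mean-square registration error in millimeters. All names and coordinates here are invented for illustration.

```python
import numpy as np

def kabsch_register(src, dst):
    """Estimate the rigid transform (R, t) mapping src points onto dst points
    via the Kabsch algorithm (SVD of the cross-covariance matrix)."""
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection solution
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

def registration_error_mm(src, dst, R, t):
    """Root-mean-square distance between transformed src and dst (in mm)."""
    aligned = src @ R.T + t
    return float(np.sqrt(np.mean(np.sum((aligned - dst) ** 2, axis=1))))

# Hypothetical fiducial coordinates (mm): marker positions in the image volume,
# and the same markers in the camera frame after a known rigid motion.
rng = np.random.default_rng(0)
image_pts = rng.uniform(0, 100, size=(6, 3))
angle = np.deg2rad(30)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([5.0, -3.0, 12.0])
camera_pts = image_pts @ R_true.T + t_true

R, t = kabsch_register(image_pts, camera_pts)
err = registration_error_mm(image_pts, camera_pts, R, t)
print(f"registration error: {err:.3f} mm")
```

With noise-free points the recovered transform matches the simulated one and the residual error is essentially zero; in practice, marker localization noise is what produces sub-millimeter figures like the 0.5 mm reported here.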