Publication
ARAS: LLM-Supported Augmented Reality Assistance System for Pancreatic Surgery
Hamraz Javaheri; Omid Ghamarnejad; Paul Lukowicz; Gregor Alexander Stavrou; Jakob Karolus
In: Companion of the 2024 ACM International Joint Conference on Pervasive and Ubiquitous Computing. International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp-2024), Pages 176-180, ACM, 2024.
Abstract
The integration of Augmented Reality (AR) technology into surgical procedures offers significant potential to enhance clinical outcomes. Despite numerous lab-proven prototypes, deploying such systems in actual clinical settings demands specialized design and rigorous clinical evaluation to meet the high standards of complex medical fields. Our research highlights the intricate requirements that emerge from clinical environments, particularly operating theaters. To address these challenges, we introduce ARAS, an operational AR assistance system for open pancreatic surgery. Following a user-centric design methodology, ARAS was iteratively developed and refined during clinical trials, ensuring its practical applicability and effectiveness in real-world surgical settings. ARAS comprises two modes, one for preoperative and one for intraoperative use. The preoperative mode enables surgeons to visualize patient data and perform virtual resections on 3D reconstructions of the patient's vessels and tumors, supporting surgical planning. The intraoperative mode provides precise in-situ visualization of these reconstructed 3D models during the procedure and supports surgeons in decision-making. ARAS also includes a novel interaction design for surgical AR systems: it uses large language models (LLMs) to enable context-aware, intuitive, and natural communication between surgeons and the AR system. This demo showcases the capabilities of ARAS, emphasizing its potential to transform surgical practices through advanced AR and LLM-supported interactions.
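
As a purely illustrative sketch (not the authors' implementation), the snippet below shows one way a spoken surgeon request could be mapped to an AR visualization command via an LLM: the prompt constrains the model to a fixed command vocabulary and the reply is parsed as JSON. The `query_llm` callable, the command names, and the prompt wording are all hypothetical assumptions introduced here for illustration.

```python
# Hypothetical sketch of LLM-based command interpretation for a surgical AR system.
# None of these names come from the ARAS paper; query_llm stands in for any chat-style LLM API.
import json
from typing import Callable

AR_COMMANDS = ["show_vessels", "hide_vessels", "show_tumor", "hide_tumor", "align_model"]

PROMPT_TEMPLATE = (
    "You control an AR overlay during pancreatic surgery. "
    "Map the surgeon's utterance to exactly one command from {commands}. "
    'Reply only with JSON like {{"command": "<name>"}}.\n'
    "Utterance: {utterance}"
)

def interpret(utterance: str, query_llm: Callable[[str], str]) -> str:
    """Ask the LLM to translate a spoken request into a known AR command."""
    prompt = PROMPT_TEMPLATE.format(commands=AR_COMMANDS, utterance=utterance)
    reply = query_llm(prompt)
    command = json.loads(reply).get("command", "")
    if command not in AR_COMMANDS:
        raise ValueError(f"LLM returned an unknown command: {command!r}")
    return command

if __name__ == "__main__":
    # Stub LLM for demonstration; a real system would call an actual model here.
    def fake_llm(prompt: str) -> str:
        return '{"command": "show_tumor"}'

    print(interpret("Can you display the tumor overlay?", fake_llm))
```

Constraining the model's output to a small, validated command set (rather than free-form text) is one plausible way such a system could keep LLM-driven interaction predictable enough for an operating-theater setting.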