Augmented reality remote-guided navigation in ENT surgery

work in progress

Truppe M. J., Pongracz F., Mayrbaeurl J., ARTMA Medizintechnik GmbH
Freysinger W., Gunkel A.R., Thumfart W. F., University Innsbruck, Dept. ENT

The ARTMA Virtual Patient® System was the first system worldwide to introduce augmented reality into medicine for the visualization of virtual anatomical structures in endoscopic surgery (Lit.1). Interventional Video Tomography (IVT) is a proprietary imaging modality invented and developed by ARTMA (Lit.2, PCT patent). Virtual, computer-generated structures are fused with the endoscopic video image in real time.


We have now extended this concept (Lit.3) to augmented reality remote-guided surgery. Recent developments in videoconferencing technology make it possible to broadcast video data over a network. Current medical concepts for remote stereotactic navigation have in common that a certain degree of technical knowledge is required at the local operating theater to correlate the patient's anatomical structures with the operating field and the 3D digitizer.

Our technology overcomes these limitations: the patient-to-image coordinate transformation is based on the IVT data set, without the use of a 3D digitizing probe.

The virtual representation of any surgical instrument tracked with 3D sensors is defined in the video overlay and is independent of the physical sensor attachment. The only input the system needs to visualize the stereotactic navigation data is live video with synchronously recorded 3D sensor data.
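The independence from the physical sensor attachment can be illustrated with a fixed calibration offset between the sensor's own coordinate frame and the part of the instrument that matters (e.g. the tip). This is a minimal sketch, not the ARTMA implementation; the 4x4 pose and the 150 mm tip offset are illustrative values only:

```python
import numpy as np

def instrument_tip(T_sensor, tip_in_sensor):
    """Tip position in tracker space, given the sensor's 4x4 pose and the
    calibrated tip offset expressed in the sensor's own coordinate frame.
    Wherever the sensor is mounted, only this offset changes."""
    return (T_sensor @ np.append(tip_in_sensor, 1.0))[:3]

# Example: sensor translated to (10, 20, 30) in tracker space, no rotation.
T_sensor = np.eye(4)
T_sensor[:3, 3] = [10.0, 20.0, 30.0]
tip_offset = np.array([0.0, 0.0, 150.0])  # tip 150 mm along the shaft axis

print(instrument_tip(T_sensor, tip_offset))  # tip at (10, 20, 180)
```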

This data is acquired at the local operating theater and processed by our standard surgical navigation system (Lit.4). The IVT data set is simultaneously transmitted over the network.

Therefore the steps needed to correlate the CT coordinate system with the IVT data set, which represents the real patient, can be performed at any remote location where this data is accessible on the network.


Prior to surgery a short IVT video sequence is captured with all 3D sensors already securely attached. This IVT sequence contains video views of the patient's anatomical region from many different perspectives, with synchronously recorded 3D digitizer data. This is the only information needed to calibrate the system and initialize the video overlay of anatomical structures.

An anatomical marker identified with a cursor in the CT is also identified in any of the endoscope video images. After this step has been repeated at least six times, a direct linear transformation recovers the original camera parameters (Lit.5). The backprojection is determined by the position of the endoscope camera's imaging plane relative to the anatomical structure tracked by the 3D sensor.
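The calibration step can be sketched as a generic direct linear transformation in the sense of Lit.5. This is not the ARTMA implementation; it is a textbook DLT solver, and the choice of an SVD-based least-squares solution is ours:

```python
import numpy as np

def dlt_calibrate(points_3d, points_2d):
    """Estimate the 11 DLT camera parameters from >= 6 point
    correspondences: 3D marker positions in CT coordinates and
    their 2D positions marked in the endoscope video image.
    Returns the 3x4 projection matrix, normalized so P[2,3] = 1."""
    A = []
    for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    # Solve A p = 0; the right singular vector with the smallest
    # singular value is the least-squares null-space solution.
    _, _, Vt = np.linalg.svd(np.asarray(A))
    P = Vt[-1].reshape(3, 4)
    return P / P[2, 3]

def project(P, point_3d):
    """Backproject a CT-space point into video image coordinates."""
    x = P @ np.append(point_3d, 1.0)
    return x[:2] / x[2]
```

With exact, non-coplanar correspondences the recovered matrix reproduces the original projection; with noisy clinical data the SVD gives the least-squares fit over all marked points.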

Because the relative change of the projection parameters is stored for every view in the IVT data set by means of the 3D sensors, the backprojection is valid for the complete IVT sequence.
Assuming the sensor remains fixed on the endoscope for the duration of the surgical procedure, the backprojection computed for the IVT data set is also valid for the live video image.
These spatial relations are stored in a study document. As long as the position of the imaging plane relative to the anatomical structure can be determined, the backprojection is independent of the imaging modality. Additional intraoperative imaging devices (ultrasound, C-arm, microscope) equipped with a 3D sensor can therefore be integrated to visualize and define structures in volume imaging data, projective imaging data, and live video images simultaneously.
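Why one calibration carries over to every view can be shown with homogeneous transforms. The sketch below assumes the digitizer reports 4x4 sensor-to-tracker poses for both the camera sensor and the patient sensor; the function names and pose values are illustrative, not taken from the ARTMA system:

```python
import numpy as np

def patient_to_camera(T_cam, T_pat):
    """Relative patient-to-camera transform from two 4x4 tracker poses
    (camera sensor and patient sensor, both sensor-to-tracker)."""
    return np.linalg.inv(T_cam) @ T_pat

def update_projection(P0, M0, M1):
    """Re-express a 3x4 projection calibrated at relative pose M0 for a
    new relative pose M1. Since P0 = K @ M0[:3], the intrinsic part K is
    preserved and P1 = P0 @ inv(M0) @ M1 = K @ M1[:3]."""
    return P0 @ np.linalg.inv(M0) @ M1

def rigid(angle_z, t):
    """Helper: 4x4 rigid transform, rotation about z plus translation."""
    c, s = np.cos(angle_z), np.sin(angle_z)
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
    T[:3, 3] = t
    return T
```

Only the relative pose change between the two sensors enters the update, which is why the backprojection survives camera motion and even patient motion, as long as both sensors stay attached.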


The IVT data (video and digitizer data) is broadcast over the network in real time, together with the study document that contains the calibration and projection parameters valid for the imaging data. Any computer on the network therefore has the same data for processing as the system at the operating theater.

A surgeon at a remote network location, acting as telementor, simulates in the CT data set prior to surgery, for example, a surgical access path for the endoscope. During surgery the live video data from the endoscope is fused in real time with these virtual data structures at the remote location.

This graphic overlay must then be made available to the surgeon at the operating theater. The backprojection parameters for the virtual data structures are identical at the remote site and at the operating theater. It is therefore only necessary to transmit a few drawing instructions over the network; the actual computer graphics and the video image fusion are computed on the local machine.
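The bandwidth argument can be made concrete with a toy message format. The ARTMA wire format is not described here, so the field names, the study identifier, and the use of JSON are our illustrative assumptions; any compact encoding of the drawing primitives would serve the same purpose:

```python
import json

def encode_overlay(study_id, primitives):
    """Serialize overlay drawing instructions for network transmission.
    Hypothetical message layout: a study identifier plus a list of
    drawing primitives in CT coordinates (mm)."""
    return json.dumps({"study": study_id, "draw": primitives}).encode("utf-8")

# A revised access path as a 3D polyline; the receiving workstation
# renders it into its own video overlay using the shared backprojection.
message = encode_overlay("case-042", [
    {"type": "polyline",
     "label": "access path",
     "points_ct_mm": [[12.1, 4.0, 33.2], [14.8, 5.5, 31.0], [17.0, 6.1, 28.4]]},
])

# A few hundred bytes travel instead of rendered video: compare with one
# uncompressed PAL frame (768 x 576 pixels, 3 bytes per pixel).
print(len(message), "bytes vs", 768 * 576 * 3, "bytes per raw frame")
```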

During the surgical procedure an unexpected complication might necessitate a change of the planned surgical access path. The telementor at the remote network location receives the live video from the operating theater and therefore has the same visual information available as the operating surgeon. The telementor can immediately adapt the simulation relative to the CT and other medical imaging data of the patient. The change becomes visible as a spatial shift of the overlay of virtual structures on the live endoscope video at the operating theater and at all other connected workstations.


(1) The Artma Virtual Patient: 3D monitoring of endoscopic surgery
Truppe, M., Medicine Meets Virtual Reality II, p. 221, Jan 1994, San Diego

(2) Interventional Video Tomography
Michael J. Truppe, Ferenc Pongracz, Oliver Ploder, Arne Wagner, Rolf Ewers
SPIE Paper #: 2395-34, pp.150-152
Proceedings of Lasers in Surgery, 4-6 February 1995, San Jose

(3) Virtual image guided navigation in tumor surgery - technical innovation
Wagner, A., Ploder, O., Enislidis, G., Truppe, M., Ewers, R.
Journal of Cranio Maxillo-Facial Surgery (1995) 23, 271-273

(4) Application of the ARTMA image-guided navigation system to endonasal sinus surgery
Gunkel A.R., Freysinger W., Thumfart W. F., Truppe M. J.
Proceedings CAR95, pp. 1146-1151, June 21-24, 1995, Berlin

(5) Direct linear transformation into object space coordinates in close-range photogrammetry
Abdel-Aziz YI, Karara HM
Proceedings of the Symposium on Close-Range Photogrammetry 1971, University of Illinois 1: 1-18.

©1990,1996 Michael Truppe, MD. All rights reserved. Patents issued and pending. Last modified May 6, 1996.