In computer-aided maxillofacial surgery we have shown the feasibility of computer navigation assistance for a wide variety of indications. Data from various imaging sources, superimposed intraoperatively on live video of the operating field, provide useful information even for the experienced surgeon. Rather than simply identifying target structures in a CT scan, our goal is the intraoperative visualization of complex presurgical simulations of midface osteotomies. Because of the interdisciplinary nature of such planning, the required expert is not necessarily the operating surgeon. We present a system (Artma Virtual Patient) that enables a remote expert to observe the surgical procedure via the Internet and to modify the intraoperative visualization interactively from a remote location. First results are presented, and a live transmission of stereotactic and video data to the conference site is demonstrated.
The fundamental concept of the Virtual Patient system (Artma, Austria) is the real-time fusion of radiological, video, photographic and solid 3-D model data into one coordinate system, combined with visualization of the intraoperative situation in real time [1, 2]. The patient's diagnostic data (CT, MR, radiographs and photographs, as well as solid models such as dental casts and stereolithographic skull models) are used to assess the individual anatomy and pathology and are related to one another by image fusion for interactive on-screen planning. Anatomical landmarks, target structures and surgical approaches are visualized as overlay graphics. All of this information is at hand during the intervention, either on the computer monitor or on the screens of a head-up display.
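As a minimal illustration of registering heterogeneous data sets into one coordinate system, the sketch below applies a rigid transform (rotation plus translation, in homogeneous form) to map points from one modality's frame, such as CT voxel coordinates in millimeters, into a shared patient frame. This is a generic sketch of rigid registration, not the Artma implementation; the function names and the NumPy-based approach are our own assumptions.

```python
import numpy as np

def rigid_transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def to_patient_frame(T_modality_to_patient, points):
    """Map an Nx3 array of modality points (e.g. CT coordinates in mm)
    into the shared patient coordinate frame."""
    pts_h = np.hstack([points, np.ones((len(points), 1))])  # homogeneous coordinates
    return (T_modality_to_patient @ pts_h.T).T[:, :3]
```

Each modality (CT, MR, photographs, solid models) would contribute its own transform into the common frame, so that overlay graphics defined in one data set can be displayed consistently in all the others.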
Although we prefer a magnetic-field digitizer (Polhemus), the system can also be used with digitizers from Ascension or with optical digitizers from Northern Digital and Image Guided Technologies. Most authors place great emphasis on the digitizer technology used. Because the measurement errors induced by metal objects in the field of a magnetic digitizer are dynamic, we cannot rely on laboratory benchmarks. For this reason Artma has developed a method, Interventional Video Tomography [3, 4], for intraoperative control of digitizer accuracy. Instead of comparing digitizer readings against a calibration object, they are compared against an optical coordinate reference system, thus separating digitizer error from other error sources. Further technical developments will allow automatic adjustments.
The digitizer source is rigidly mounted on an operating microscope with integrated video output. First, the position and orientation of the digitizer coordinate system relative to the focal plane of the microscope must be determined. The representation of a stylus tip in optical space, as an (x, y) coordinate on the microscope video image, is correlated with the (x, y, z) coordinates of the tip measured in digitizer space, and a virtual camera model is generated [6, 7]. After this step, any sensor position in digitizer space can be projected into the focal plane of the microscope, so a registration error induced by dynamic metal influence becomes visible immediately during surgery, in real time.
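The correlation between stylus-tip image coordinates and digitizer coordinates can be modeled with the direct linear transformation (DLT) cited above [6]. The sketch below is a hypothetical Python implementation (the function names are ours, not from the cited work): it estimates the 11 DLT parameters from corresponding 3-D digitizer points and 2-D image points, and then projects any 3-D point into the image plane.

```python
import numpy as np

def dlt_calibrate(object_pts, image_pts):
    """Estimate the 11 DLT parameters from at least 6 correspondences between
    3-D digitizer coordinates (x, y, z) and 2-D video image coordinates (u, v).
    Each correspondence contributes two linear equations in the parameters L."""
    A, b = [], []
    for (x, y, z), (u, v) in zip(object_pts, image_pts):
        A.append([x, y, z, 1, 0, 0, 0, 0, -u * x, -u * y, -u * z])
        A.append([0, 0, 0, 0, x, y, z, 1, -v * x, -v * y, -v * z])
        b.extend([u, v])
    L, *_ = np.linalg.lstsq(np.asarray(A, float), np.asarray(b, float), rcond=None)
    return L

def dlt_project(L, point):
    """Project a 3-D digitizer point into the video image plane
    using the estimated DLT parameters."""
    x, y, z = point
    d = L[8] * x + L[9] * y + L[10] * z + 1.0
    u = (L[0] * x + L[1] * y + L[2] * z + L[3]) / d
    v = (L[4] * x + L[5] * y + L[6] * z + L[7]) / d
    return u, v
```

With such a model in place, the residual between a sensor's projected position and its observed position in the video image gives a direct, visible check on registration accuracy during surgery.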
Sensors are rigidly attached to the structures to be manipulated intraoperatively and to surgical instruments. Their position and movement relative to the skull base are continuously displayed in the various radiographic data sets. In addition, the virtual structures representing bone segments and instruments are visible as overlays in the live video from the microscope, providing an augmented-reality "see-through" visualization.
The expert at a remote location receives this video data almost in real time over standard transmission channels (LAN, WAN, ISDN, Internet). In addition, stereotactic navigation data are sent over the network as rigid-body coordinates. The graphic overlay structures themselves are computed on the remote computer, dramatically reducing the bandwidth required for transmission (Fig. 1). Through teleconsultation, the composite images and overlay graphics (instrument, target structure, landmark, contour) can be viewed in connected clinics, with the possibility of interactive graphical assistance.
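To illustrate why transmitting rigid-body coordinates needs far less bandwidth than video, the following sketch packs one tracked pose (timestamp, position, orientation quaternion) into a 36-byte binary record. The record layout is purely hypothetical, not the actual Artma wire format; it only makes the order-of-magnitude argument concrete.

```python
import struct

# One pose sample: double timestamp + 7 floats
# (position x, y, z in mm; orientation quaternion qw, qx, qy, qz)
POSE_FMT = "<d7f"  # little-endian: 8 + 7*4 = 36 bytes

def pack_pose(timestamp, position, quaternion):
    """Serialize one rigid-body pose sample into a compact binary record."""
    return struct.pack(POSE_FMT, timestamp, *position, *quaternion)

def unpack_pose(data):
    """Recover (timestamp, position, quaternion) from a packed record."""
    values = struct.unpack(POSE_FMT, data)
    return values[0], values[1:4], values[4:8]
```

At a 30-Hz update rate this amounts to roughly 1 kB/s per tracked body, orders of magnitude below even compressed video, which is what makes interactive overlay computation on the remote machine practical over the network links of the time.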
To our knowledge, we performed the first real-time surgical teleconsultation in maxillofacial surgery via telecommunication of stereotactic data, in August 1996 in Austria. A patient suffering from a posttraumatic deformity was operated on with the help of image-guided surgery and augmented reality. A unilateral zygomatico-orbital fracture, a central midface fracture and a comminuted maxillary fracture had led to marked facial asymmetry and functional impairment following insufficient reduction in the primary repair. To restore facial symmetry and occlusal balance, a re-osteotomy following the fracture lines in the zygoma and the orbit, as well as a Le Fort I osteotomy, was planned. The actual and desired positions of the bony structures, tracked by 3-D sensors, were superimposed as overlay graphics on radiological and video images in real time during surgery. Reduction could be performed according to contour graphics displaying the intraoperative movements on CT scans. In teleconsultation, the achieved position could be discussed with respect to symmetry, hard/soft-tissue relation and occlusal details, with the possibility of on-screen planning interaction and real-time evaluation of the results over a distance of 500 km.
Another patient was operated on in September 1996, with data transfer to Brussels, Belgium.
Image-guided surgery by computer navigation has proven useful in two ways:
With intraoperative teleconsultation, not only the preoperative planning but the actual intraoperative situation, including the result achieved, can be visualized in real time. Teleconsulting surgeons can offer considerations, advice and remarks through interactive graphical planning on video, CT or other radiographic images displaying the intraoperative situs. They may adapt the plan according to their personal experience with corresponding cases. Additional benefits of intraoperative teleconsultation for patient care, as well as its impact on economy and on the education and training of surgeons, will have to be the subject of further evaluation.
[1] A. Wagner, O. Ploder, G. Enislidis, M. Truppe, and R. Ewers, "Virtual image guided navigation in tumor surgery--technical innovation," J Craniomaxillofac Surg, vol. 23, pp. 217-3, 1995.
[2] A. Wagner, O. Ploder, G. Enislidis, M. Truppe, and R. Ewers, "Image-guided surgery," Int J Oral Maxillofac Surg, vol. 25, pp. 147-51, 1996.
[3] M. Truppe, "Method for representing moving bodies," European Patent Office 1991, patent EP0488987B1.
[4] M. Truppe, F. Pongracz, O. Ploder, A. Wagner, and R. Ewers, "Interventional Video Tomography," in Proceedings of Lasers in Surgery, vol. 2395. San Jose, CA: SPIE, 1995, pp. 150-152.
[5] M. Truppe, "Process for imaging the interior of bodies," European Patent Office 1993, patent WO94/03100.
[6] Y. Abdel-Aziz and H. M. Karara, "Direct linear transformation into object space coordinates in close-range photogrammetry," presented at Proceedings of the Symposium on Close-Range Photogrammetry, 1971.
[7] R. Tsai, "A Versatile Camera Calibration Technique for High Accuracy 3D Machine Vision Metrology Using Off-The-Shelf TV Cameras and Lenses," IEEE Journal of Robotics and Automation, pp. 323-344, 1987.
[8] G. Bettega, V. Dessenne, B. Raphael, and P. Cinquin, "Computer-assisted mandibular condyle positioning in orthognathic surgery," J Oral Maxillofac Surg, vol. 54, pp. 553-8, 1996.