Research Interests
- Navigation and Computer Aided Surgery
- Evaluation of Navigation Systems in Medicine
- Augmented Reality in Medicine
- Hybrid User Interfaces for Surgery and Interventions
Publications
Teaching
Winter Term 2014/2015
Summer Term 2014
Winter Term 2013/2014
Summer Term 2013
Winter Term 2007/2008
Summer Term 2007
Winter Term 2006/2007
Summer Term 2006
Winter Term 2005/2006
Summer Term 2005
Winter Term 2004/2005
Summer Term 2004
Theses, SEPs, and IDPs under my (Co-)Supervision
Available:
Current:
Finished:
Awards
- April 18, 2005: "Werner von Siemens Excellence Award" for outstanding diploma thesis.
Design of an Intra-Operative Augmented Reality Navigation Tool for Robotically Assisted Minimally Invasive Cardiovascular Surgery. Supervised by Martin Bauer, Hesam Najafi, and Prof. Gudrun Klinker. In collaboration with the Deutsches Herzzentrum München (Eva Schirmbeck and PD Dr. Robert Bauernschmitt). Follow this link to see images of the ceremony at the Siemens Forum.
Other Activities
Research Projects
A common task during bronchoscopy procedures is the biopsy of peripheral lung tumors. The video bronchoscope cannot reach the peripheral lung nodules; only the biopsy needle can. There is therefore no video feedback, only feedback on the current location of the biopsy tool via fluoroscopic imaging during the intervention, which exposes both patient and surgical staff to additional radiation. Another drawback is that tumors cannot be visualized on fluoroscopic images, and since these images are only projections, they do not convey the three-dimensional position of the biopsy tool. Electromagnetic tracking is capable of tracking the tip of a flexible instrument: a field generator with three orthogonal coils is driven with current and thus generates a magnetic field, and a sensor, also composed of three orthogonal coils, can estimate its position and orientation with respect to a coordinate system defined by the field generator. We currently investigate the combination of all available information for navigation and ways to present it in one unified user interface. This includes the measurements of the electromagnetic tracking system, the C-arm, techniques of virtual bronchoscopy, and other data. Furthermore, clinical evaluation is conducted: we define the clinical endpoint and show through studies that the procedure benefits from the use of the navigation system.
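The pose chain described here can be sketched in a few lines: the sensor pose is reported in the field generator's frame, and a separately obtained registration transform maps it into the CT frame. This is a minimal sketch; the function names, the calibrated tip offset, and the origin of the registration transform are illustrative assumptions, not the project's actual code.

```python
import numpy as np

def to_homogeneous(rotation, translation):
    """Build a 4x4 rigid transform from a 3x3 rotation and a translation vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def sensor_tip_in_ct(T_ct_from_generator, rotation_sensor, position_sensor, tip_offset):
    """Map the needle tip, measured by the EM sensor in the field generator's
    frame, into CT coordinates.

    tip_offset is the (calibrated) vector from the sensor coil to the needle
    tip, expressed in the sensor's local frame.
    """
    T_gen_from_sensor = to_homogeneous(rotation_sensor, position_sensor)
    tip_local = np.append(tip_offset, 1.0)  # homogeneous point in sensor frame
    tip_ct = T_ct_from_generator @ T_gen_from_sensor @ tip_local
    return tip_ct[:3]
```

With an identity registration and sensor rotation, a sensor at (1, 2, 3) and a 1 mm tip offset along the sensor's z-axis yield a tip position of (1, 2, 4) in CT coordinates.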
Intra-operative localization of non-superficial cancerous lesions in non-hollow organs such as the liver and kidney is currently facilitated by intra-operative ultrasound (IOUS) and palpation. This yields a high rate of false positives due to benign abnormal regions, and thus unnecessary resections with increased complications and morbidity. In this project we integrate functional nuclear information from gamma probes with IOUS to provide a synchronized, real-time visualization that facilitates the intra-operative detection of active tumors and metastases. The premise of this project is that the inclusion of an advanced, augmented visualization provides more reliability and confidence in classifying lesions prior to resection.
Nuclear medicine imaging modalities commonly assist in surgical guidance owing to their functional nature. However, they present limitations when used in the operating room. Pre-operative tomographic 3D imaging can only serve as a vague guide intra-operatively, due to movement, deformation, and changes in anatomy since the time of imaging, while standard intra-operative nuclear measurements are limited to 1D readings or (in some cases) 2D images with no depth information. To resolve this problem, we propose the synchronized acquisition of the position, orientation, and readings of gamma probes intra-operatively to reconstruct a 3D activity volume. In contrast to conventional emission tomography, in a first proof of concept the reconstruction succeeds without requiring symmetry in the acquisition positions and angles, which allows greater flexibility and thus opens the door to 3D intra-operative nuclear imaging.
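Because the probe poses are tracked, each reading can be modeled as a weighted sum of voxel activities and the volume recovered iteratively, with no symmetry constraint on where the readings were taken. The following is a minimal MLEM-style sketch under that assumption; the system matrix, which encodes the probe's sensitivity to each voxel at each tracked pose, is taken as given, and nothing here is the project's actual reconstruction code.

```python
import numpy as np

def mlem_reconstruct(system_matrix, readings, n_iter=500):
    """Recover voxel activities x from probe readings y with the MLEM update
    x <- x * A^T(y / Ax) / A^T 1, starting from a uniform volume.

    system_matrix (A) has one row per tracked probe reading and one column
    per voxel; its entries model the probe's sensitivity to each voxel at
    that pose. No symmetry in the acquisition geometry is required.
    """
    n_voxels = system_matrix.shape[1]
    x = np.ones(n_voxels)
    sensitivity = system_matrix.sum(axis=0)  # A^T 1
    for _ in range(n_iter):
        expected = system_matrix @ x                     # forward projection
        ratio = readings / np.maximum(expected, 1e-12)   # measured / expected
        x *= (system_matrix.T @ ratio) / np.maximum(sensitivity, 1e-12)
    return x
```

On a tiny consistent system (four readings, two voxels), the iteration converges to the true activities, illustrating that arbitrarily posed readings suffice as long as they constrain every voxel.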
This work group aims at practical user interfaces for 3D imaging data in surgery and medical interventions. The usual monitor-based visualization and mouse-based interaction with 3D data do not present acceptable solutions in this setting. Here we study the use of head-mounted displays and advanced interaction techniques as alternatives. Issues such as depth perception in augmented reality environments and optimal data representation for a smooth and efficient integration into the surgical workflow are the focus of our research activities. Furthermore, appropriate ways of interacting within the surgical environment are investigated.
In abdominal surgery, a laparoscopic ultrasound transducer is commonly used to detect lesions such as metastases. Determining and visualizing the position and orientation of its flexible tip in relation to the patient or other surgical instruments can be of great help to (novice) surgeons using the transducer intra-operatively. This difficult problem has recently received attention from the scientific community. Electromagnetic tracking systems can be applied to track the flexible tip; however, the magnetic field can be distorted by ferromagnetic material. We present a new method based on optical tracking of the laparoscope and magneto-optic tracking of the transducer, which is able to automatically detect and correct field distortions. This enables a smooth augmentation of the transducer's B-scan images directly onto the camera images in real time.
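A minimal sketch of the detection idea, assuming a point on the rigid part of the transducer can be measured redundantly by both the optical and the electromagnetic system: when the two estimates disagree beyond a tolerance, the field is likely distorted. The function name and the threshold value are illustrative, not the actual method's parameters.

```python
import numpy as np

def detect_field_distortion(p_optical, p_electromagnetic, threshold_mm=2.0):
    """Flag electromagnetic field distortion by comparing redundant
    measurements of the same rigid point from the optical and EM systems.

    Returns (distorted, error_mm): distorted is True when the discrepancy
    exceeds the threshold.
    """
    error = np.linalg.norm(np.asarray(p_optical, dtype=float)
                           - np.asarray(p_electromagnetic, dtype=float))
    return error > threshold_mm, error
```

In practice the flagged discrepancy could trigger a correction step or a warning before the B-scan augmentation is trusted; here the redundancy check alone is shown.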
Optimal port placement is a delicate issue in minimally invasive endoscopic surgery. A good choice of the instrument and endoscope ports can avoid time-consuming successive port re-placement. We present a novel method to plan port placement intuitively and precisely. The patient is registered to the pre-operative CT simply by moving the endoscope around fiducials that are attached to the patient's thorax and visible in the CT; their 3D positions are automatically reconstructed. Without prior time-consuming segmentation, the pre-operative CT volume is directly rendered with respect to the endoscope or instruments. This enables the simulation of a camera flight through the patient's interior along the instrument axes to easily validate possible ports.
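Once the fiducials' 3D positions are reconstructed in the endoscope's tracking frame and identified in the CT, the registration reduces to a least-squares rigid alignment of two point sets. A standard Kabsch/Horn-style sketch of that step (not necessarily the project's actual implementation):

```python
import numpy as np

def rigid_register(points_src, points_dst):
    """Least-squares rigid transform (R, t) such that dst ≈ R @ src + t.

    points_src: fiducial positions in the tracking frame, shape (n, 3).
    points_dst: the corresponding fiducial positions in CT, shape (n, 3).
    """
    src = np.asarray(points_src, dtype=float)
    dst = np.asarray(points_dst, dtype=float)
    mu_src, mu_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_src).T @ (dst - mu_dst)        # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_dst - R @ mu_src
    return R, t
```

At least three non-collinear fiducials are needed; with exact correspondences the known rotation and translation are recovered exactly.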
In recent years, an increasing number of liver tumor indications have been treated by minimally invasive laparoscopic resection. Besides the restricted view, a major issue in laparoscopic liver resection is the precise localization of the vessels to be divided. To navigate the surgeon to these vessels, pre-operative imaging data can hardly be used, due to intra-operative organ deformations caused by the application of carbon dioxide pneumoperitoneum and by respiratory motion.
Therefore, we propose to use an optically tracked mobile C-arm providing cone-beam computed tomography imaging capability intra-operatively. After patient positioning, port placement, and carbon dioxide insufflation, the liver vessels are contrasted and a 3D volume is reconstructed during patient exhalation. Without any further need for patient registration, the volume can be directly augmented on the live laparoscope video. This augmentation provides the surgeon with essential aid in the localization of veins, arteries, and bile ducts to be divided or sealed.
Current research focuses on the intra-operative use and tracking of mobile C-arms as well as laparoscopic ultrasound, augmented visualization on the laparoscope's view, and methods to synchronize respiratory motion.
In minimally invasive tumor resection, the goal is a minimal but complete removal of cancerous cells. In recent decades, interventional beta probes have supported the detection of remaining tumor cells. However, scanning the patient with an intra-operative probe and applying the treatment are not done simultaneously. The main contribution of this work is to extend the one-dimensional signal of a nuclear probe to a four-dimensional signal that includes the spatial information of the probe's distal end. This signal can then be used to guide the surgeon in the resection of residual tissue and thus increase the spatial accuracy of the resection while keeping the impact on the patient minimal.
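A minimal sketch of the 4D idea: each 1D probe reading is stamped with the tracked 3D position of the probe's distal end, giving (x, y, z, count) samples, and positions with high counts then point at residual activity. The names and the simple thresholding strategy are illustrative assumptions, not the actual guidance method.

```python
import numpy as np

def localize_residual_activity(positions, counts, count_threshold):
    """Combine 1D probe readings with tracked 3D tip positions into 4D
    samples and return the positions whose counts exceed a threshold
    (suspected residual tumor tissue).

    positions: tracked tip positions, shape (n, 3).
    counts:    the probe reading at each position, shape (n,).
    """
    samples = np.column_stack([positions, counts])  # shape (n, 4): x, y, z, count
    hot = samples[samples[:, 3] > count_threshold]
    return hot[:, :3]
```

The returned hot spots could then be visualized relative to the resection site to direct the surgeon back to remaining tissue.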
In this project, we aim at automatically discovering the workflow of percutaneous vertebroplasty. The medical framework is quite different from that of a parallel project in which we analyze laparoscopic surgeries. Contrary to cholecystectomies, where much information is provided by the surgical tools and the endoscopic video, in vertebroplasties and kyphoplasties we believe that the body and hand movements of the surgeon give a key insight into the surgical activity. Surgical movements such as hammering the trocar into the vertebra or stirring cement compounds are indicative of the current workflow phase. The objectives of this project are to acquire workflow-related signals using accelerometers, to process the raw signals, and to detect recurrent patterns in order to objectively identify the low-level and high-level workflow of the procedure.
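As a simple illustration of the signal-processing step, a sliding-window energy measure over a single accelerometer channel already separates vigorous actions such as hammering from quiet phases. This is a generic sketch under that assumption, not the project's actual pattern-detection method, and all names are illustrative.

```python
import numpy as np

def sliding_window_energy(signal, window, step):
    """RMS energy of an accelerometer channel per sliding window."""
    return np.array([
        np.sqrt(np.mean(signal[i:i + window] ** 2))
        for i in range(0, len(signal) - window + 1, step)
    ])

def detect_active_phases(signal, window, step, threshold):
    """Mark windows whose energy exceeds a threshold as active, e.g. as
    candidates for hammering or stirring movements."""
    return sliding_window_energy(signal, window, step) > threshold
```

Recurrent-pattern detection for the full workflow would build on such per-window features rather than on the raw samples.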
Coffee Consumption Histogram
The quantity of coffee I drink between 8 am and 5 pm (17:00).