Interdisciplinary project (IDP): Semiautomatic segmentation of tumors in Positron Emission Tomography, supervised by Ralph Bundschuh and Prof. Dr. Nassir Navab. The goal of this work is the semiautomatic segmentation of tumors in Positron Emission Tomography (PET) data. The core part of the project consists of implementing segmentation algorithms that have already been selected, finding further algorithms suited to this specific problem through a literature survey, and extending the system to the analysis of dynamic (4D) PET data.
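As an illustration of the kind of semiautomatic method meant here, a minimal sketch of one widely used baseline - relative thresholding at a fraction of the lesion's maximum uptake (often 40% of SUVmax), seeded by a single user-placed voxel - might look as follows. The function name, the 40% cut-off, and the 6-connected region growing are illustrative assumptions, not the project's actual algorithm:

```python
from collections import deque

import numpy as np


def segment_tumor(volume, seed, fraction=0.4):
    """Semiautomatic PET lesion segmentation by relative thresholding.

    The user places one seed voxel inside the lesion; all voxels
    connected to the seed whose uptake exceeds ``fraction`` times the
    seed voxel's value are labelled tumor. A 40% SUVmax cut-off is a
    common heuristic baseline (an assumption for this sketch - the seed
    is taken to sit at or near the hottest voxel of the lesion).
    """
    seed = tuple(seed)
    thresh = fraction * volume[seed]
    mask = np.zeros(volume.shape, dtype=bool)
    mask[seed] = True
    queue = deque([seed])
    # 6-connected region growing (flood fill) from the seed voxel
    while queue:
        z, y, x = queue.popleft()
        for dz, dy, dx in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                           (0, -1, 0), (0, 0, 1), (0, 0, -1)):
            nz, ny, nx = z + dz, y + dy, x + dx
            if (0 <= nz < volume.shape[0]
                    and 0 <= ny < volume.shape[1]
                    and 0 <= nx < volume.shape[2]
                    and not mask[nz, ny, nx]
                    and volume[nz, ny, nx] >= thresh):
                mask[nz, ny, nx] = True
                queue.append((nz, ny, nx))
    return mask


# Usage on a synthetic volume: a hot 3x3x3 lesion on cold background,
# plus one disconnected hot voxel that must stay outside the mask.
vol = np.zeros((10, 10, 10))
vol[4:7, 4:7, 4:7] = 10.0
vol[0, 0, 0] = 10.0
mask = segment_tumor(vol, seed=(5, 5, 5))
```

Extending such a method to dynamic (4D) PET data would mean applying it per time frame, or replacing the static threshold with criteria derived from the voxels' time-activity curves.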
In highly dynamic and complex high-risk domains - such as surgery - systematic training in the relevant skills is the basis for safe and high-quality performance. Traditionally, assessment and training in surgery have concentrated on proficiency in and acquisition of surgeons' technical skills. Although the fundamental impact of non-technical skills - such as communication and coordination - on the safe delivery of surgery is increasingly acknowledged, comprehensive training approaches are still missing. The overall aim of the project is to investigate a novel learning environment for the assessment and training of both technical and non-technical skills of entire multidisciplinary operating room (OR) teams.
The project Augmented Reality Supported Patient Education and Consultation (Augmented Reality unterstützte Operationsaufklärung) aims at developing Augmented Reality (AR) supported communication tools for patient education. The development of the targeted systems involves disciplines ranging from image registration, human-computer interaction, and in-situ visualization to instructional design and perceptual psychology. As the primary clinical application, we selected breast reconstruction in plastic surgery.
A tabletop system in medical environments can be used for interactive and collaborative analysis of patient data, but also as a multimedia user interface within the sterile space. For preoperative planning, the physicians in charge of a particular patient meet to discuss the medical case and plan further steps of the therapy. With the tabletop system, they could collaboratively view and browse through all kinds of available medical imaging data. Alternatively, such a system could serve as a central interaction device for all equipment within the OR that requires user input but cannot be operated directly by the sterile surgeon. We believe that projecting user and information interfaces onto a sterile glass plane would facilitate the clinical workflow. This project is strongly related to the Tangible Interaction Surface for Collaboration between Humans project.
This work group aims at practical user interfaces for 3D imaging data in surgery and medical interventions. The usual monitor-based visualization and mouse-based interaction with 3D data do not present acceptable solutions in this setting. Here we study the use of head-mounted displays and advanced interaction techniques as alternatives. Issues such as depth perception in augmented reality environments and optimal data representation for a smooth and efficient integration into the surgical workflow are the focus of our research activities. Furthermore, appropriate ways of interacting within the surgical environment are investigated.