ChristophBichlmeier

Chair for Computer Aided Medical Procedures & Augmented Reality

THIS WEBPAGE IS DEPRECATED - please visit our new website

Christoph Bichlmeier (Dr. rer. nat., Dipl.-Inf. Univ.)

Theses and Projects under my (Co-)Supervision


Publications


Active Research Projects

NARVIS - navigated augmented reality visualization system

Advanced visualization is becoming increasingly important for the operating room of the future. The growing number of available medical images must be presented to the surgery team in new ways that support them rather than overload them with information. In the NARVIS project we integrate an HMD-based (head mounted display) AR system into the operating room for 3D in-situ visualization of computed tomography (CT) images. The final system targets spinal surgery. The work is carried out in close collaboration with our project partners, the "Klinikum für Unfallchirurgie" at LMU, A.R.T. Weilheim, and Siemens Corporate Research in Princeton. The project is funded by the Bayerische Forschungsstiftung.
ARAV Augmented Reality Aided Vertebroplasty

In today's ORs, more and more operations are performed as minimally invasive procedures. Surgical instruments are inserted through a tiny incision in the patient's skin, the port to the inside of the patient. In some cases an endoscope camera records video images of the operation site, which are presented on a monitor. As a consequence of this technique, the surgeon's field of view is divided into several workspaces: the monitor, the patient, and a third station presenting medical imaging data. The missing direct view on the workspace complicates intuitive control of surgical tools. In contrast to open surgery, the physician has to collect information from several fields of view at the same time and fuse it mentally to create a complete model of the working space, the operation site.

The minimally invasive intervention vertebroplasty was chosen as a suitable medical application for bringing a head mounted display (HMD) into the OR for augmentation of surgical instruments and medical imaging data. In-situ visualization with an HMD presents all available imaging data and navigational information in one field of view. The objective of vertebroplasty is the insertion of cement into weak and brittle vertebrae through a trocar for stabilization. In this case the view into the patient is not provided by an endoscope camera. However, since the operation is performed under a CT scanner, imaging data is continuously updated to check the position of the trocar and the amount of inserted cement. This imaging data is presented on a monitor and has to be mapped mentally by the surgeon onto the real operation site.
Improving Depth Perception and Perception of Layout for In-Situ Visualization in Medical Augmented Reality

In-situ visualization in medical augmented reality (AR), using for instance a video see-through head mounted display (HMD) and an optical tracking system, enables a stereoscopic view of visualized CT data registered with the real anatomy of a patient. The data can be aligned with the required accuracy, and the surgeons do not have to analyze data on an external monitor or on images attached to a wall somewhere in the operating room. Thanks to such a medical AR system, surgeons get a direct view onto and also "into" the patient. Mental registration of medical imagery with the operation site is no longer necessary. In addition, surgical instruments can be augmented inside the human body. Bringing medical imagery and surgical instruments into the same field of action provides the most intuitive way to understand the patient's anatomy within the region of interest and allows for the development of completely new generations of surgical navigation systems.
Unfortunately, this method of presenting medical data suffers from a serious drawback. Virtual imagery, such as a volume-rendered spinal column, can only be displayed superimposed on real objects. If virtual entities of the scene are expected behind real ones, like the virtual spinal column beneath the real skin surface, this superimposition leads to incorrect perception of the viewed objects with respect to their distance to the observer. The strong visual depth cue of interposition is responsible for this misleading depth perception. This project aims at the development and evaluation of methods to improve depth perception for in-situ visualization in medical AR. Its intention is to provide an extended view onto the human body that allows an intuitive localization of visualized bones and tissue.
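The registration behind such in-situ visualization amounts to composing a chain of rigid transforms from the CT volume through the tracked patient into the tracked HMD camera frame. The following minimal sketch illustrates the idea; all frame names and numeric poses are made up for illustration and are not the actual NARVIS interfaces:

```python
# Sketch: composing rigid 4x4 transforms to place CT data in the HMD camera frame.
# In a real system, cam_T_tracker and tracker_T_patient come from the optical
# tracking system, and patient_T_ct from a patient-to-CT registration step.

def mat_mul(a, b):
    """Multiply two 4x4 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def apply(t, p):
    """Apply a 4x4 homogeneous transform to a 3D point."""
    x, y, z = p
    v = [x, y, z, 1.0]
    r = [sum(t[i][k] * v[k] for k in range(4)) for i in range(4)]
    return (r[0], r[1], r[2])

def translation(tx, ty, tz):
    """Pure translation as a 4x4 homogeneous matrix."""
    return [[1, 0, 0, tx], [0, 1, 0, ty], [0, 0, 1, tz], [0, 0, 0, 1]]

# Example chain: CT voxel frame -> patient marker -> tracker -> HMD camera.
patient_T_ct = translation(0.0, 0.0, -0.1)       # from CT-to-patient registration
tracker_T_patient = translation(0.5, 0.0, 0.0)   # patient marker pose (tracked)
cam_T_tracker = translation(0.0, -0.5, 1.0)      # HMD camera pose (tracked)

cam_T_ct = mat_mul(cam_T_tracker, mat_mul(tracker_T_patient, patient_T_ct))
print(apply(cam_T_ct, (0.0, 0.0, 0.0)))  # CT origin in camera coordinates
# → (0.5, -0.5, 0.9)
```

Rendering the CT volume with `cam_T_ct` as the model-view transform is what keeps the virtual anatomy aligned with the real patient as the HMD moves.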
Virtual Mirror: Interaction Paradigm for Augmented Reality Applications

Augmented reality offers the programmer a higher degree of freedom than classical visualization of volume data on a screen. The existing paradigms for interaction with 3D objects are not satisfactory for particular applications, since the majority of them rotate and move the object of interest. Such classic manipulation of virtual objects cannot be used while keeping real and virtual spaces in alignment within an AR environment. This project introduces a simple and efficient interaction paradigm that allows users to interact with 3D objects and visualize them from arbitrary viewpoints without disturbing the in-situ visualization or requiring the user to change the viewpoint. We present a virtual, tangible mirror as a new paradigm for interaction with 3D models. The concept borrows its visualization paradigm in some sense from the methodology used by dentists to examine the oral cavity without constantly changing their own viewpoint or moving the patient's head. The virtual mirror improves the understanding of complex structures, enables completely new concepts of navigational aid for different tasks, and provides the user with intuitive views on physically restricted areas.
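Geometrically, a virtual mirror reflects every point of the virtual scene across the tracked mirror plane. A minimal sketch of that reflection, assuming the plane is represented by a point on it and a unit normal (an illustrative representation, not the project's actual implementation):

```python
# Sketch: reflecting a virtual 3D point across a tracked mirror plane.
# plane_point and plane_normal would come from the tracked pose of the
# tangible mirror; plane_normal must be a unit vector.

def reflect(point, plane_point, plane_normal):
    """Reflect `point` across the plane through `plane_point` with unit normal `plane_normal`."""
    # Signed distance from the point to the plane along the normal.
    d = sum((point[i] - plane_point[i]) * plane_normal[i] for i in range(3))
    # Move the point twice that distance back through the plane.
    return tuple(point[i] - 2.0 * d * plane_normal[i] for i in range(3))

# A point 3 units above the mirror plane z = 0 is mirrored to 3 units below:
print(reflect((1.0, 2.0, 3.0), (0.0, 0.0, 0.0), (0.0, 0.0, 1.0)))
# → (1.0, 2.0, -3.0)
```

Rendering the reflected scene into the mirror quad from the user's viewpoint then yields the second perspective without moving the in-situ visualization itself.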
Augmented Reality Supported Patient Education and Consultation

The project Augmented Reality Supported Patient Education and Consultation (Augmented Reality unterstützte Operationsaufklärung, i.e. AR-supported surgical patient education) aims at developing augmented reality (AR) supported communication tools for patient education. The development of the targeted systems involves disciplines ranging from image registration, human-computer interaction, and in-situ visualization to instructional design and perceptual psychology. As the primary clinical application, we chose breast reconstruction in plastic surgery.
MeTaTop A Multi Sensory Table Top System for Medical Procedures

A tabletop system in medical environments can be used for interactive and collaborative analysis of patient data, but also as a multimedia user interface within the sterile space. For preoperative planning, the physicians in charge of a particular patient meet to discuss the medical case and plan further steps of therapy. With the tabletop system, they could collaboratively view and browse through all kinds of available medical imaging data. Alternatively, such a system could serve as a central interaction device for all equipment within the OR that requires user input but cannot be operated by the sterile surgeon. We believe that projecting all user and information interfaces onto a sterile glass plane would facilitate the clinical workflow.
This project is strongly related to the Tangible Interaction Surface for Collaboration between Humans project.
3D user interfaces for medical interventions

This work group aims at practical user interfaces for 3D imaging data in surgery and medical interventions. The usual monitor-based visualization and mouse-based interaction with 3D data do not present acceptable solutions here. We study the use of head mounted displays and advanced interaction techniques as alternatives. Different issues, such as depth perception in augmented reality environments and optimal data representation for a smooth and efficient integration into the surgical workflow, are the focus of our research activities. Furthermore, appropriate ways of interaction within the surgical environment are investigated.
Laparoscope Augmentation for Minimally Invasive Liver Resection

In recent years, an increasing number of liver tumor indications have been treated by minimally invasive laparoscopic resection. Besides the restricted view, a major issue in laparoscopic liver resection is the precise localization of the vessels to be divided. To navigate the surgeon to these vessels, pre-operative imaging data can hardly be used, due to intra-operative organ deformations caused by the application of carbon dioxide pneumoperitoneum and by respiratory motion.

Therefore, we propose to use an optically tracked mobile C-arm providing cone-beam computed tomography imaging capability intra-operatively. After patient positioning, port placement, and carbon dioxide insufflation, the liver vessels are contrasted and a 3D volume is reconstructed during patient exhalation. Without any further need for patient registration, the volume can be directly augmented on the live laparoscope video. This augmentation provides the surgeon with essential aid in the localization of veins, arteries, and bile ducts to be divided or sealed.
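Because the reconstructed volume and the laparoscope are tracked in the same coordinate frame, overlaying a reconstructed vessel point on the live video reduces to a standard pinhole camera projection. A minimal sketch, with made-up intrinsics rather than a calibrated laparoscope:

```python
# Sketch: projecting a 3D point (already expressed in laparoscope camera
# coordinates) onto the image plane with a pinhole model. fx, fy, cx, cy are
# illustrative intrinsics; a real system would use the calibrated laparoscope
# parameters and also correct for lens distortion.

def project(point, fx, fy, cx, cy):
    """Project a 3D camera-frame point to pixel coordinates (pinhole model)."""
    x, y, z = point
    if z <= 0:
        raise ValueError("point is behind the camera")
    return (fx * x / z + cx, fy * y / z + cy)

# A contrasted vessel point in front of the camera, off-axis:
print(project((0.5, -0.25, 1.0), fx=800.0, fy=800.0, cx=320.0, cy=240.0))
# → (720.0, 40.0)
```

Drawing the projected vessel positions on the laparoscope frame at each tracked pose is what produces the augmented view described above.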

Current research focuses on the intra-operative use and tracking of mobile C-arms as well as laparoscopic ultrasound, augmented visualization on the laparoscope's view, and methods to synchronize respiratory motion.

Teaching

Conference Activities

AMI-ARCS 2009: 5th Workshop on Augmented environments for Medical Imaging and Computer-aided Surgery (http://campwww.informatik.tu-muenchen.de/AMIARCS09/doku.php)

Award

Werner von Siemens Excellence Award 2006
for my diploma thesis Advanced 3D Visualization for Intra Operative Augmented Reality,
supervised by Tobias Sielhorst & advised by Prof. Nassir Navab

Munich Business Plan Competition 2009 (Developer Stage)
for the business plan on A Simulator System for Team-Oriented Surgical Education and Training of Complete Intraoperative Procedures, together with Philipp Stefan, MD Sandro Heining, and Prof. Nassir Navab


Going Abroad for Research Visits


Beginning in Oct. 2008, I spent three months in Orlando, Florida, at Christopher Stapleton's lab Simiosys, studying the application of instructional design to medical augmented reality technology.

Address in Orlando:
3280 Progress Drive
Orlando Florida 32826
Phone: 407 965 1356
Fax: 407 658 5059

UsersForm
Title: Dr.
Firstname: Christoph
Middlename: Paul
Lastname: Bichlmeier
Picture: bichlmeierchristophicon.png
Birthday:  
Nationality: Bavaria
Languages: English, German, Spanish, Bavarian, Portuguese
Groups: Medical Imaging, Computer-Aided Surgery, Medical Augmented Reality, 3D Information Presentation, 3D Interaction
Expertise: Computer-Aided Surgery, Medical Augmented Reality, 3D Information Presentation, 3D Interaction
Position: External Collaborator
Status: Alumni
Emailbefore: christoph.bichlmeier
Emailafter: gmx.de
Room: NARVIS lab
Telephone: +49 89 5160 4368
Alumniactivity:  
Defensedate: 1 December 2010
Thesistitle: Immersive, Interactive and Contextual In-Situ Visualization for Medical Applications
Alumnihomepage:  
Personalvideo01:  
Personalvideotext01:  
Personalvideopreview01:  
Personalvideo02:  
Personalvideotext02:  
Personalvideopreview02:  


Revision r1.101 - 25 Nov 2011 - 15:58 - ChristophBichlmeier
