NeuroSim - The Prototype of a Neurosurgical Training Simulator

Florian BEIER a,1, Stephan DIEDERICH a, Kirsten SCHMIEDER b and Reinhard MÄNNER a,c
a Institute for Computational Medicine, University of Heidelberg
b Department of Neurosurgery, Medical Faculty Mannheim, University of Heidelberg
c Department of Computer Science V, University of Heidelberg

Abstract. We present NeuroSim, the prototype of a training simulator for open surgical interventions on the human brain. The simulator is based on virtual reality and uses real-time simulation algorithms to interact with models generated from MRT or CT datasets. NeuroSim provides a native interface by using a real surgical microscope and original instruments tracked by a combination of inertial measurement units and optical tracking. In this way, an immersive environment is created. In a first step, the navigation in an open surgery setup as well as the hand-eye coordination through a microscope can be trained. Due to its modular design, further training modules and extensions can be integrated. NeuroSim has been developed in cooperation with the neurosurgical clinic of the University of Heidelberg and the VRmagic GmbH in Mannheim.

Keywords. Virtual Reality, Medical Training Simulator, Neurosurgery

Introduction

Neurosurgical interventions on the human brain are complicated and highly risky. Although minimally invasive techniques are used more often, there is still a need for open surgical interventions, which can be accomplished only by very well trained and experienced surgeons. "See one, do one, teach one" is the most common axiom for acquiring medical skills, although this method might endanger patients. Another possibility is training on plastic models, living animals or cadavers. So there is a great need for an efficient training environment that is realistic without involving real patients or animals. Virtual reality (VR) can be used to implement such a training system. Apart from the properties mentioned, VR simulators have several advantages: surgical tasks are reproducible and can be trained at any time, even if the case is rare. The surgeon's skills are measured objectively and the results can be compared to those of other users. Although there are some groups that are developing neurosurgical simulators [DeMauro08,NeuroTouch], we are not aware of any project that combines the native interface of a moveable surgical microscope with original instruments.

1 Corresponding Author: Florian Beier, Institute for Computational Medicine, University of Heidelberg, Germany, E-mail: florian.beier@ziti.uni-heidelberg.de

We present NeuroSim, a VR-based simulator that uses original instruments and a real surgical microscope. The first training module features an abstract task in order to train basic skills. The software design is modular and based on training modules, so further tasks like tumor resection or aneurysm clipping can be added.

1. Methods

While developing NeuroSim, our main focus was to combine a realistic interface with an immersive real-time simulation. Our setup consists of a phantom of the head, original instruments, a surgical microscope, several cameras and a standard personal computer (see figure 1). NeuroSim uses a modular software platform which includes a plugin structure and is easily extendable.

Figure 1. NeuroSim: (a) surgical microscope; (b) optics carrier and phantom of the head.

1.1. Instrument Tracking

The phantom of the head hosts an optical tracking system (see figure 2(a)) which consists of three CMOS cameras, several white LEDs and one FPGA (field programmable gate array). Passive color markers are attached to the tips of the original instruments. The FPGA gathers and preprocesses the data from the cameras in order to reduce latency and the amount of data being transferred to the PC [Koepfle04]. Only one color per instrument is used; the reconstruction is done by a relational method described in [Koepfle07]. An inertial measurement unit connected via USB and consisting of three accelerometers and three gyroscopes is attached to the instruments (see figure 2(b)) in order to estimate their orientation and to gather data that can be used to stabilize the optical tracking. Sensor fusion combines the data from the optical tracking, the gyroscopes and the accelerometers in order to determine the position and orientation of the instruments and to filter glitches. Future work will include a more sophisticated sensor fusion that uses the inertial measurement unit to make the tracking more robust in cases of occluded markers. In addition, more instruments such as a needle holder or scissors will be integrated into the system.
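To illustrate the idea behind such a sensor fusion, the following is a minimal complementary-filter style sketch, not the NeuroSim algorithm: the gyroscope rates propagate the orientation at a high rate, while each optical measurement corrects position and orientation with a small blending weight, which also acts as a simple low-pass against single-frame glitches. Eigen is assumed; all class names and weights are illustrative.

```cpp
// Minimal complementary-filter style sketch (hypothetical, not the NeuroSim fusion code).
// Assumes the Eigen library; blending weights are purely illustrative.
#include <Eigen/Dense>
#include <Eigen/Geometry>

struct InstrumentState {
    Eigen::Vector3d position = Eigen::Vector3d::Zero();            // instrument tip position
    Eigen::Quaterniond orientation = Eigen::Quaterniond::Identity();
};

class InstrumentFusion {
public:
    // Propagate the orientation from the gyroscopes (angular rate in rad/s).
    void predictFromImu(const Eigen::Vector3d& angularRate, double dt) {
        const double angle = angularRate.norm() * dt;
        if (angle > 0.0) {
            Eigen::Quaterniond dq(Eigen::AngleAxisd(angle, angularRate.normalized()));
            state_.orientation = (state_.orientation * dq).normalized();
        }
    }

    // Blend in an optical measurement. The small weight on the new sample acts as a
    // simple low-pass filter that suppresses single-frame glitches of the optical tracking.
    void correctFromOptical(const Eigen::Vector3d& opticalPosition,
                            const Eigen::Quaterniond& opticalOrientation,
                            double opticalWeight = 0.3) {
        state_.position = (1.0 - opticalWeight) * state_.position
                        + opticalWeight * opticalPosition;
        state_.orientation =
            state_.orientation.slerp(opticalWeight, opticalOrientation).normalized();
    }

    const InstrumentState& state() const { return state_; }

private:
    InstrumentState state_;
};
```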

Figure 2. Instrument tracking: (a) tracking system; (b) instrument with inertial measurement unit.

1.2. Surgical Microscope

Almost all surgical interventions on the human brain require microsurgical skills and are performed with a neurosurgical microscope that can be freely positioned above the operating field. Position and orientation of the microscope as well as the state of the pistol grip buttons like zoom or focus have to be determined. NeuroSim uses the mechanical and electrical parts of a real surgical microscope to provide a native, moveable interface. A tracking system mounted on the microscope (see figure 3(a)) is used to track active infrared markers that are integrated in the phantom of the head (see figures 3(b) and 3(c)). The inside-out tracking takes advantage of the fact that the optical axis of the microscope is always positioned in such a way that the camera system points towards the phantom. The use of infrared markers reduces the negative influence of changing lighting conditions and ensures stable tracking. Each pistol grip includes a joystick that controls the precise movement of the microscope on two axes. As the tracking system is directly attached to the head of the microscope, its movement is already included in the tracking process. The optical oculars are substituted by a stereo display (see figure 3(d)) in which the computer-generated scene is shown in 3D. All devices mentioned can be added to an original surgical microscope, so that the costs of a future product can be reduced. Buttons like focus and zoom will be read out via a CAN bus interface in a future version.

1.3. Model Generation

The models used in NeuroSim are generated from MRT or CT images. The generation is done in three steps: first the raw images are segmented, then a surface model is extracted and, in a last step, the surface is used to generate a tetrahedron mesh. The first abstract training module uses a part of the brain as background tissue that can be deformed by interacting with the instruments. For the medical training modules that will be implemented next, more complex models of the brain are generated from different datasets. For the generation of vessels, CT angiography datasets will be used. As a result, many different but still realistic sets of models will be available.
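The three-step model generation can be illustrated with a short pipeline sketch. The following is a minimal example, not the NeuroSim tool chain: it assumes VTK is available, reduces the segmentation step to a simple intensity threshold, and stops after surface extraction; the resulting surface would then be handed to a tetrahedral mesher (e.g. TetGen) to obtain the tetrahedron mesh. The file names and threshold values are placeholders.

```cpp
// Illustrative model-generation pipeline (not the NeuroSim tool chain).
// Assumes VTK; segmentation is reduced to a simple intensity threshold.
#include <vtkSmartPointer.h>
#include <vtkDICOMImageReader.h>
#include <vtkImageThreshold.h>
#include <vtkMarchingCubes.h>
#include <vtkSTLWriter.h>

int main() {
    // Step 1: read and (crudely) segment the CT/MRT volume by thresholding.
    auto reader = vtkSmartPointer<vtkDICOMImageReader>::New();
    reader->SetDirectoryName("ct_dataset/");          // placeholder path

    auto threshold = vtkSmartPointer<vtkImageThreshold>::New();
    threshold->SetInputConnection(reader->GetOutputPort());
    threshold->ThresholdBetween(100.0, 300.0);        // placeholder intensity range
    threshold->SetInValue(1.0);                       // voxels belonging to the structure
    threshold->SetOutValue(0.0);                      // background voxels

    // Step 2: extract a surface model from the binary segmentation.
    auto surface = vtkSmartPointer<vtkMarchingCubes>::New();
    surface->SetInputConnection(threshold->GetOutputPort());
    surface->SetValue(0, 0.5);                        // iso-surface between 0 and 1

    auto writer = vtkSmartPointer<vtkSTLWriter>::New();
    writer->SetInputConnection(surface->GetOutputPort());
    writer->SetFileName("brain_surface.stl");
    writer->Write();

    // Step 3 (not shown): pass the surface to a tetrahedral mesher such as TetGen
    // to generate the tetrahedron mesh used by the real-time simulation.
    return 0;
}
```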

Figure 3. Microscope tracking and setup: (a) tracking system; (b) inside of the phantom; (c) infrared LED marker; (d) stereo display.

1.4. Simulation

Real-time tissue modelling is based on a high-performance and reusable framework developed within the ViPA group, which was presented in [Grimm05]. The framework is currently being developed further in cooperation with the VRmagic GmbH and the ViPA group. The simulation used in the first training module is based on an approach presented in [Teschner04], which has been modified in order to support real-time cutting of tetrahedrons and can be accelerated using GPUs.

1.5. Simulator Framework

The simulator is based on a modular software framework developed within the ViPA group. It allows rapid prototyping of medical simulators by using a plugin-based architecture. Highly reusable plugins form the basis of the framework and can be shared across different simulators. The plugins themselves are decoupled via an abstract interface layer. Communication is done via message passing, so single components like input interfaces (e.g. the tracking device) can easily be swapped or simulated by other devices (e.g. a keyboard). Persistence and record/replay functionality can be included in the framework.
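As an illustration of this kind of decoupling, the following minimal sketch shows a message bus over which an input plugin publishes instrument poses and the simulation subscribes to them. It is not the ViPA framework's API; all names (MessageBus, InstrumentPoseMsg, the topic string) are hypothetical.

```cpp
// Minimal message-passing sketch (hypothetical names, not the ViPA framework API).
#include <functional>
#include <iostream>
#include <string>
#include <unordered_map>
#include <vector>

// A message carrying an instrument pose; real messages would be richer.
struct InstrumentPoseMsg {
    double x, y, z;
};

// A very small message bus: plugins subscribe to topics and publish messages.
class MessageBus {
public:
    using Handler = std::function<void(const InstrumentPoseMsg&)>;

    void subscribe(const std::string& topic, Handler handler) {
        handlers_[topic].push_back(std::move(handler));
    }

    void publish(const std::string& topic, const InstrumentPoseMsg& msg) {
        for (auto& handler : handlers_[topic]) handler(msg);
    }

private:
    std::unordered_map<std::string, std::vector<Handler>> handlers_;
};

int main() {
    MessageBus bus;

    // The simulation plugin only knows the topic, not the input device behind it.
    bus.subscribe("instrument/pose", [](const InstrumentPoseMsg& msg) {
        std::cout << "simulation received pose: "
                  << msg.x << ", " << msg.y << ", " << msg.z << "\n";
    });

    // Either the optical/inertial tracking plugin ...
    bus.publish("instrument/pose", InstrumentPoseMsg{0.01, 0.02, 0.10});

    // ... or a keyboard stand-in can drive the same topic, so components can be swapped.
    bus.publish("instrument/pose", InstrumentPoseMsg{0.00, 0.00, 0.12});
    return 0;
}
```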

The VR itself uses a similarly modular but more lightweight approach called a component-based entity system, where entities in the VR are aggregated from components. This approach offers highly reusable components and allows an object in the VR to be constructed via a graphical editor or simple text files.

2. Results

By putting all the components described above together, the prototype of a neurosurgical simulator was created. The first training module consists of a rigid-body simulation of several small spheres. These spheres have to be broached with the instruments. If the tip of the instrument does not touch the sphere perpendicular to its surface, the sphere slides away and the instrument does not enter. If the position of the tip inside the sphere is near the center, the color of the sphere slowly turns from red to green (see figure 4; a minimal sketch of this logic is given after section 3). Some of the spheres are positioned behind the skull, outside the volume that is initially visible. In order to see all spheres through the microscope, the microscope has to be repositioned during the procedure. Although the task is quite abstract, it meets several demands: first, the trainee has to get familiar with the positioning of the surgical microscope. He or she has to navigate it in a way so that all spheres are visible. Second, the indirect and steady handling of the instruments is trained.

Figure 4. Abstract training module

3. Conclusions

We presented the prototype of a neurosurgical training simulator. Through the combination of original instruments and a real surgical microscope, NeuroSim is able to create an immersive environment in which abstract tasks can be performed. In this way, several basic skills that form the foundation of a successful surgery can be trained. Current development includes training modules focusing on medical content, like the suturing of two blood vessels, and a more complex sensor fusion for the instrument tracking. Due to the modular platform design, more training modules can be added easily. It is planned to add modules for tumor resection and aneurysm clipping. Furthermore, brain models will be generated from real datasets in order to build up a case database. Finally, an objective evaluation will be integrated.
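Returning to the abstract training module described in section 2, its task logic - the sphere is only entered when approached roughly perpendicular to its surface, and its color fades from red to green as the tip nears the center - can be sketched as follows. This is an illustrative reconstruction from the description above, not the NeuroSim implementation; the angle threshold and the linear color mapping are assumptions.

```cpp
// Illustrative sphere-task logic reconstructed from the description in section 2
// (not the NeuroSim implementation; threshold and color mapping are assumptions).
#include <algorithm>
#include <cmath>

struct Vec3 { double x, y, z; };

static Vec3 sub(const Vec3& a, const Vec3& b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static double dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static double norm(const Vec3& a) { return std::sqrt(dot(a, a)); }

// The instrument only enters the sphere if its approach direction is roughly
// perpendicular to the surface, i.e. nearly parallel to the inward surface normal.
bool tipEnters(const Vec3& tipPos, const Vec3& tipDir,
               const Vec3& center, double maxAngleDeg = 15.0) {
    const Vec3 inward = sub(center, tipPos);     // from the tip towards the sphere center
    const double cosAngle = dot(tipDir, inward) / (norm(tipDir) * norm(inward));
    const double kPi = 3.14159265358979323846;
    return cosAngle >= std::cos(maxAngleDeg * kPi / 180.0);
}

// The sphere fades from red (tip far from the center) to green (tip at the center).
struct Color { double r, g, b; };

Color sphereColor(const Vec3& tipPos, const Vec3& center, double radius) {
    const double d = norm(sub(tipPos, center));
    const double t = 1.0 - std::min(d / radius, 1.0);   // 0 at the surface, 1 at the center
    return {1.0 - t, t, 0.0};
}
```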

Acknowledgements

This work is kindly supported by Leica Microsystems 2, sponsor of the neurosurgical microscope, and VRmagic GmbH 3.

References

[DeMauro08] A. De Mauro, J. Raczkowsky, R. Wirtz, H. Wörn. Development of a Microscope Embedded Training System for Neurosurgery, Lecture Notes in Computer Science, Volume 5104, 2008.
[Grimm05] J. Grimm. Interaktive Echtzeitmodellierung von biologischem Gewebe für virtuelle Realitäten in der medizinischen Ausbildung, PhD thesis, University of Mannheim, Department for Mathematics and Computer Science, 2005.
[Koepfle04] A. Köpfle, M. Schill, M. Rautmann, M. Schwarz, A. Pott, A. Wagner, R. Männer, E. Badreddin, P. Weiser, H. P. Scharf. Occlusion-Robust, Low-Latency Optical Tracking Using a Modular Scalable System Architecture, Medical Robotics, Navigation & Visualization (MRNV), Remagen, Germany, March 2004.
[Koepfle07] A. Köpfle, F. Beier, C. Wagner, R. Männer. Real-time Marker-based Tracking of a Non-rigid Object, Stud Health Technol Inform 125 (2007), 232-234, IOS Press.
[NeuroTouch] http://www.nrc-cnrc.gc.ca/eng/dimensions/issue2/virtual_surgery.html
[Teschner04] M. Teschner, B. Heidelberger, M. Mueller, M. Gross. A Versatile and Robust Model for Geometrically Complex Deformable Solids, Proc. Computer Graphics International (CGI '04), Crete, Greece, pp. 312-319, June 16-19, 2004.

2 http://www.leica-microsystems.com
3 http://www.vrmagic.com