Improving Depth Perception in Medical AR

Improving Depth Perception in Medical AR: A Virtual Vision Panel to the Inside of the Patient

Christoph Bichlmeier 1, Tobias Sielhorst 1, Sandro M. Heining 2, Nassir Navab 1
1 Chair for Computer Aided Medical Procedures (CAMP), I-16, Technische Universität München, Boltzmannstraße 3, 85748 Garching, Germany
2 Trauma Surgery Department, Klinikum Innenstadt, LMU München, Nußbaumstraße 20, 80336 München, Germany
Email: bichlmei@cs.tum.edu

Abstract. We present the real-time in-situ visualization of medical data taken from CT or MRI scans using a video see-through head mounted display (HMD). One of the challenges in improving the acceptance of augmented reality (AR) for medical purposes is to overcome misleading depth perception. This problem is caused by a restriction of such systems: virtual entities of the AR scene can only be presented superimposed onto real imagery. Occlusion is the most effective depth cue [1] and lets, for example, a correctly positioned visualization of the spinal column appear in front of the real skin. We present a technique to handle this problem and introduce a virtual window superimposed onto the real skin of the patient to create the impression of looking into the patient. As the observer moves, the frame of the window covers and uncovers fragments of the visualized bones and tissue, enabling the depth cues motion parallax and occlusion, which correct the misleading perception. An earlier experiment, in which seven different visualization modes of the spinal column were evaluated regarding depth perception, has shown the perceptive advantage of the window. This paper introduces the technical realization of the window.

1 Introduction

Real-time in-situ visualization of medical data has received increasing attention and has been a subject of intensive research and development during the last decade [2], [3], [4]. Examining a stack of radiographs is time and space consuming within the tight workflow of an operating room (OR), and physicians have to associate the imagery of anatomical regions with their proper position on the patient. Medical augmented reality allows for the examination of medical imagery such as radiographs right on the patient. Three-dimensional visualizations can be observed by moving around the AR scene with a head mounted display. Several systems [5, 2, 6] that are custom-made for medical procedures aim to meet the requirements for accuracy and to integrate their display devices seamlessly into the operational workflow.

Fig. 1. An opaque surface model occludes the real thorax and is therefore perceived in front of the body, although the vertebrae are positioned correctly. Even if the visualization is semi-transparent, like the direct volume rendered vertebrae, we do not perceive the bones at their proper position. The right figure shows some components of our AR setup, including a plastic phantom and the HMD.

2 State of the art and new contribution

Depth perception has become a major issue in current research on medical AR. Virtual data is superimposed on real imagery, and visual depth perception is disturbed (Fig. 1). The problem was identified as early as 14 years ago in the first publication on medical augmented reality [7]. That group tackled the problem by rendering "a synthetic hole ..." around ultrasound images in an attempt to avoid conflicting visual cues. In an earlier paper, Sielhorst et al. described an experiment that evaluated seven different visualization modes for the spinal column regarding depth perception [8]. This paper describes the technical realization of one of the winners of that evaluation: a virtual window that can be overlaid onto the skin and provides a bordered view onto the spinal column inside the patient. With the virtual window, the depth perception of the visualized medical data can be corrected.

3 Method

Medical data taken from a CT or MRI scan is presented using a stereoscopic video see-through HMD. The complete tracking system, which tracks the observer wearing the HMD, the patient and several surgical instruments, is described in [8]. We use direct volume rendering and presegmented surface models to visualize the data.

3.1 Positioning the window

Placing the window to obtain the desired view into the patient can be performed without touching or moving the patient. While positioning the window, the observer wearing the HMD sees a frame (Fig. 2) and guides it to the area of interest by moving his or her head. When the frame is at the desired position, the window can be set by key press. Its size is adjustable by mouse interaction, which can be performed by an assistant on an external monitor showing a copy of the imagery presented by the displays of the HMD. The window adopts the shape of the skin. For this purpose we add an augmentation of the skin, presented as a surface model. The frame of the window defines the borders of a structured 2D grid consisting of a certain number of grid points. For every grid point, a so-called picking algorithm examines the depth buffer at its corresponding pixel and recalculates the three-dimensional position of the nearest virtual object, which is in our case the surface model of the skin. After their positions in 3D space have been determined, the grid points are connected to compose a transparent surface. Once the window surface is defined, it is used to mask the part of the scene that lies inside the thorax. For this we employ the so-called stencil buffer, an additional buffer besides the color buffer and depth buffer found on modern computer graphics hardware, which can be used to limit the area of rendering. In our application, the area is limited to the window when the visualized tissue or bones are drawn. Finally, the window surface itself is rendered.

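The following listing is a minimal sketch, not the authors' implementation, of how the picking and masking steps described above could be realized with the fixed-function OpenGL pipeline: grid points inside the frame are unprojected from the depth buffer onto the skin surface model, and the resulting window surface is written into the stencil buffer so that the anatomy is only drawn inside the window. All function names (drawWindowSurface, drawAnatomy) and parameter values are placeholders assumed for illustration.

#include <GL/gl.h>
#include <GL/glu.h>
#include <vector>

struct Point3 { double x, y, z; };

// Placeholder rendering routines; in the real system these would draw the
// triangulated window grid and the volume/surface rendering of the spine.
void drawWindowSurface() { /* render the transparent window grid */ }
void drawAnatomy()       { /* direct volume rendering / surface models */ }

// Read the depth buffer at pixel (px, py) and unproject it to the 3D
// position of the nearest virtual object, here the skin surface model,
// which is assumed to have been rendered into the depth buffer.
static Point3 pickOnSkin(int px, int py)
{
    GLdouble modelview[16], projection[16];
    GLint viewport[4];
    glGetDoublev(GL_MODELVIEW_MATRIX, modelview);
    glGetDoublev(GL_PROJECTION_MATRIX, projection);
    glGetIntegerv(GL_VIEWPORT, viewport);

    GLfloat depth = 1.0f;
    glReadPixels(px, py, 1, 1, GL_DEPTH_COMPONENT, GL_FLOAT, &depth);

    Point3 p;
    gluUnProject(px, py, depth, modelview, projection, viewport,
                 &p.x, &p.y, &p.z);
    return p;
}

// Sample a regular nx-by-ny grid (nx, ny >= 2) inside the window frame,
// given in pixel coordinates; the returned points are triangulated
// afterwards into the transparent window surface on the skin.
std::vector<Point3> buildWindowGrid(int x0, int y0, int x1, int y1,
                                    int nx, int ny)
{
    std::vector<Point3> grid;
    grid.reserve(static_cast<size_t>(nx) * ny);
    for (int j = 0; j < ny; ++j)
        for (int i = 0; i < nx; ++i)
            grid.push_back(pickOnSkin(x0 + i * (x1 - x0) / (nx - 1),
                                      y0 + j * (y1 - y0) / (ny - 1)));
    return grid;
}

// Per frame: draw the anatomy only inside the window by first marking the
// window region in the stencil buffer.
void renderThroughWindow()
{
    glClear(GL_STENCIL_BUFFER_BIT);
    glEnable(GL_STENCIL_TEST);

    // Pass 1: write the window surface into the stencil buffer only.
    glStencilFunc(GL_ALWAYS, 1, 0xFF);
    glStencilOp(GL_KEEP, GL_KEEP, GL_REPLACE);
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
    glDepthMask(GL_FALSE);
    drawWindowSurface();
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
    glDepthMask(GL_TRUE);

    // Pass 2: draw bones and tissue only where the stencil was set.
    glStencilFunc(GL_EQUAL, 1, 0xFF);
    glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
    drawAnatomy();
    glDisable(GL_STENCIL_TEST);

    // Pass 3: finally render the glass-like window surface itself.
    drawWindowSurface();
}

Reading back the depth buffer and unprojecting each grid point is inexpensive in this setting because the grid only needs to be rebuilt when the window is set or repositioned, not every frame.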
3.2 Window design & perceptive advantage

The window was equipped with several design features that intensify the depth cues. Certain material parameters let the window appear like glass. Highlight effects due to the virtual light conditions support depth perception: highlights on the window change the color of objects behind the window or even partially occlude them. The window plane is mapped with a simply structured texture, which enhances the depth cue motion parallax; as the observer moves, the texture on the window seems to move faster than the objects behind the window. The background of the virtual objects seen through the window can be set to transparent or opaque. Cutting et al. summarized the most important binocular and monocular depth cues [1]. Our AR scene is perceived binocularly through the two color cameras mounted on the HMD. Stereopsis is realized by the slightly different perspectives of the two cameras, and convergence is predefined by their orientation. The window enhances perceptive information about depth because it partially occludes the vertebrae: its frame covers and uncovers parts of the spinal column while the observer is moving. The latter depth cue, motion parallax, is after occlusion and stereopsis the third most effective source of information about depth [1].
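As an illustration of the glass-like window design, the following sketch sets up a mostly transparent, strongly specular material and binds a simple texture before the window surface is drawn in the final pass. The numeric values and the function name setGlassAppearance are illustrative assumptions, not the parameters used by the authors.

#include <GL/gl.h>

// Configure a glass-like appearance for the window surface: mostly
// transparent, strongly specular so that virtual light sources create
// highlights, and textured with a simple pattern that strengthens the
// motion-parallax cue. Values are illustrative only.
void setGlassAppearance(GLuint windowTexture)
{
    const GLfloat diffuse[]  = { 0.85f, 0.9f, 1.0f, 0.15f }; // faint, mostly transparent
    const GLfloat specular[] = { 1.0f, 1.0f, 1.0f, 1.0f };   // bright white highlights

    glMaterialfv(GL_FRONT_AND_BACK, GL_AMBIENT_AND_DIFFUSE, diffuse);
    glMaterialfv(GL_FRONT_AND_BACK, GL_SPECULAR, specular);
    glMaterialf(GL_FRONT_AND_BACK, GL_SHININESS, 96.0f);

    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, windowTexture);

    // Blend the window over the anatomy so highlights tint or partially
    // occlude the objects seen through it.
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
}

With texturing enabled, the default texture environment (GL_MODULATE) combines the faint pattern with the lit material color, so the texture stays visible without hiding the anatomy behind the window.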

Fig. 2. Volume rendered spinal column and setup of the window. The frame can be guided by head movement to the required area.

Fig. 3. Sequence showing the window from different perspectives with a surface model of the spinal column.

4 Results

The virtual window helps to overcome the misleading depth perception caused by the virtual spinal column superimposed onto the real thorax. Regarding depth perception, an earlier experiment [8] compared seven different visualization modes of the spinal column, including the virtual window, and the virtual window was evaluated as one of the best methods. Placing the window interactively in the scene has the advantage that the surgeon or OR personnel do not have to touch the patient or use an additional instrument that has to be kept sterile and takes up space. The observer wearing the HMD can easily position and reposition the window by moving his or her head. Figure 3 shows a sequence in which the observer moves the HMD relative to the thorax with the attached window.

5 Discussion

We presented the virtual window in the context of spine surgery to provide an intuitive view of the visualized vertebrae. However, the window can be used for further medical applications, which will be part of our future work. Future work will also address optimizing the setup of the window to avoid wasting precious time in the medical workflow, varying and evaluating different designs (e.g. the shape of the window and the structure of the texture mapped onto the window plane) to achieve the best depth perception, and integrating augmented surgical instruments.

6 Acknowledgment

Special thanks to Frank Sauer, Ali Khamene, and Sebastian Vogt from Siemens Corporate Research (SCR) for the design, setup, and implementation of the in-situ visualization system RAMP they provided to us. Thanks to A.R.T. GmbH for providing cameras and software for the outside-in tracking system. We also want to express our gratitude to the radiologists and surgeons of the Klinikum Innenstadt München for their precious contribution in obtaining medical data and evaluating our systems. Thanks also to Joerg Traub, Marco Feuerstein, Stefan Wiesner and Philipp Stefan of the NARVIS group for their support. This work was supported by the BFS within the NARVIS project (www.narvis.org).

References

1. Cutting JE, Vishton PM. Perceiving layout and knowing distances: The integration, relative potency, and contextual use of different information about depth. In: Epstein W, Rogers S, editors. Perception of Space and Motion; 1995. p. 69-117.
2. Birkfellner W, Figl M, Huber K, et al. A head-mounted operating binocular for augmented reality visualization in medicine: Design and initial evaluation. IEEE Trans Med Imaging 2002;21(8):991-997.
3. King AP, Edwards PJ, Maurer CR Jr, et al. Stereo augmented reality in the surgical microscope. 2000;9(4):360-368.
4. Sauer F, Khamene A, Bascle B, Vogt S, Rubino GJ. Augmented reality visualization in the iMRI operating room: System description and pre-clinical testing. Procs SPIE 2002;4681:446-454.
5. Sauer F, Khamene A, Vogt S. An augmented reality navigation system with a single-camera tracker: System design and needle biopsy phantom trial. LNCS 2002;2489:116-124.
6. King AP, Edwards PJ, Maurer CR Jr, et al. Design and evaluation of a system for microscope-assisted guided interventions. IEEE Trans Med Imaging 2000;19(11):1082-1093.
7. Bajura M, Fuchs H, Ohbuchi R. Merging virtual objects with the real world: Seeing ultrasound imagery within the patient. In: Procs Computer Graphics and Interactive Techniques. ACM Press; 1992. p. 203-210.
8. Sielhorst T, Bichlmeier C, Heining SM, Navab N. Depth perception - a major issue in medical AR: Evaluation study by twenty surgeons. In: Procs MICCAI; 2006.