Accuracy evaluation of an image overlay in an instrument guidance system for laparoscopic liver surgery


Matteo Fusaglia 1, Daphne Wallach 1, Matthias Peterhans 1, Guido Beldi 2, Stefan Weber 1
1 ARTORG Center, University of Bern
2 Department for Visceral Surgery and Medicine, Inselspital, University of Bern

1. Purpose

The benefits of laparoscopic liver surgery over open surgery are well known and include reduced patient trauma and perioperative blood loss. However, video images acquired through an optical device (i.e. an endoscope) are the only source of visual guidance into the body cavities. Drawbacks such as the keyhole view of the operating field and the 2-dimensional video-optic representation of the operative situs therefore limit the adoption of this technique [3][5]. To overcome these limitations, instrument guidance systems dedicated to laparoscopic surgery have been proposed [3]. By tracking the laparoscopic camera and instruments, and registering the liver to an available image data set (CT, MRI, MeVis), we developed an augmented reality (AR) framework in which the endoscope's video stream is augmented with relevant medical information (e.g. positions of tumors). For a clinically applicable instrument guidance system (IGS) for laparoscopic liver surgery, the accuracy of the AR framework plays an important role. This work evaluates the accuracy of the image overlay between the endoscopic image and the pre-operative data set.

2. Methods

An IGS for open liver surgery (CAScination, CH) was extended to integrate a calibrated view of a laparoscopic camera [4]. A standard laparoscopic optic with 30° inclination is used, connected to a video camera module (Karl Storz Endoskope, GER). The video signal is integrated into the instrument guidance system. Instrument tracking is provided by an optical tracking system (Polaris Vicra, Northern Digital Inc., Canada), which tracks the distal ends of the laparoscopic instruments.
MeVis planning data is used as the 3D medical image input and is registered to the patient through a locally rigid, landmark-based registration [4]. To achieve an accurate overlay of the planning data onto the endoscope video stream, the endoscope camera is then calibrated using an optically tracked, Zhang-based calibration [6]. Virtual images are finally rendered using a virtual camera defined by the endoscope's intrinsic and extrinsic parameters.
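The paper's locally rigid, landmark-based registration is described in [4]. As an illustration only, a minimal numpy sketch of a generic least-squares rigid landmark fit (the SVD method of Arun et al. — an assumption for illustration, not necessarily the exact method of [4]) could look like:

```python
import numpy as np

def rigid_landmark_registration(src, dst):
    """Least-squares rigid transform (R, t) mapping src landmarks onto dst.

    src, dst: (N, 3) arrays of corresponding landmark positions.
    Rotation + translation only (no scaling), via SVD of the cross-covariance.
    """
    src_c = src - src.mean(axis=0)           # center both point sets
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                      # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    # guard against a reflection (det = -1) in the least-squares solution
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

Given at least three non-collinear landmark pairs, this returns the rotation and translation that best align the planning data with the tracked patient landmarks in the least-squares sense.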

Figure 1: IGS functional model.

The calibration process uses a checkerboard composed of a rectangular grid (9 x 7 black-white pattern) attached to a metal plate together with four passive markers. While the endoscope remains in the same position, n images of the checkerboard at different angles are acquired, together with the checkerboard position with respect to the tracking device coordinate system [6]. The calibration is performed using the OpenCV library [7], yielding the intrinsic parameters of the camera and, for each checkerboard position, the extrinsic parameters relating the checkerboard coordinate system to the coordinate system of the endoscope camera.

To track the endoscope, the rigid-body transformation from the coordinate system of the tracker attached to the endoscope to the coordinate system of the endoscope camera is required. For each set of extrinsic parameters, this transformation is obtained by solving:

endoscopeT_camera = (trackerT_endoscope)^-1 · trackerT_checkerboard · (cameraT_checkerboard)^-1

where trackerT_endoscope is the transform relating the tracking device coordinate system to the endoscope's coordinate system and trackerT_checkerboard is the transform relating the tracking device coordinate system to the checkerboard's coordinate system (see Fig. 1). These two transforms are given by the optical tracking system.
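The transform chain above maps directly onto 4x4 homogeneous matrices. A minimal numpy sketch (the function name and matrix layout are illustrative, not taken from the system's code):

```python
import numpy as np

def endoscope_to_camera(T_tracker_endoscope, T_tracker_checkerboard,
                        T_camera_checkerboard):
    """Compute endoscopeT_camera from the three known transforms.

    All arguments are 4x4 homogeneous matrices. The first two are reported
    by the optical tracking system; the third comes from the extrinsic
    parameters of one checkerboard view.
    """
    return (np.linalg.inv(T_tracker_endoscope)
            @ T_tracker_checkerboard
            @ np.linalg.inv(T_camera_checkerboard))
```

By construction, composing the result back with the tracked transforms reproduces the checkerboard pose: trackerT_endoscope · endoscopeT_camera · cameraT_checkerboard = trackerT_checkerboard.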

The last transform, cameraT_checkerboard, relates the coordinate system of the endoscope camera to the checkerboard's coordinate system and is given by the extrinsic parameters. Since the transformation endoscopeT_camera is a static parameter, the error introduced during the calibration phase results in variance across the computed transformations. We selected the transformation minimizing the reprojection error, defined as the distance between the corners of the grid detected in the 2D image of the checkerboard and the corners of the 3D checkerboard model reprojected onto the 2D image using the extrinsic parameters [2].

Finally, the virtual camera's point of view is defined by setting its position and intrinsic parameters to those of the endoscope camera. The endoscope image is undistorted using the distortion maps calculated during the calibration procedure and superimposed with the virtual camera view of the 3D planning data [2].

The accuracy of the system was evaluated through two measures. First, to evaluate the accuracy of the calibration, the reprojection error was computed for each checkerboard orientation. Since the Camera Calibration Toolbox for MATLAB provides useful and intuitive error views, and computes the calibration in the same way as the OpenCV library, the reprojection error was computed with the former. The uncertainty of the calibrated extrinsic parameters, computed as three times the standard deviation of the reprojection errors, was also calculated [1][2]. Second, the overall accuracy of the augmented reality system was evaluated using a rapid-prototyped model of a human liver with a superimposed 1-cm surface grid (Figure 2). After calibration of the endoscope camera, augmented reality images were created by superimposing the endoscope image with a 3D image of the model, using 3 different orientations of the endoscope with respect to the model and 2 distances between the endoscope camera and the model (approximately 2 cm and 35 cm) [2].
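The selection of the calibration view by minimal reprojection error can be sketched as follows. This is a simplified illustration assuming an undistorted pinhole model (the actual system applies the OpenCV distortion model as well); all names are hypothetical:

```python
import numpy as np

def reprojection_rms(K, R, t, board_pts, image_pts):
    """RMS pixel distance between detected corners and the 3D board corners
    projected with intrinsics K and extrinsics (R, t). Distortion ignored."""
    cam = board_pts @ R.T + t                # board frame -> camera frame
    proj = cam @ K.T                         # pinhole projection
    proj = proj[:, :2] / proj[:, 2:3]        # perspective divide
    return np.sqrt(np.mean(np.sum((proj - image_pts) ** 2, axis=1)))

def best_view(K, extrinsics, board_pts, detected):
    """Return (index, errors) of the checkerboard view with the lowest
    reprojection error, given per-view extrinsics and detected corners."""
    errs = [reprojection_rms(K, R, t, board_pts, pts)
            for (R, t), pts in zip(extrinsics, detected)]
    return int(np.argmin(errs)), errs
```

The view returned by `best_view` would then supply the extrinsic parameters used to compute the static endoscopeT_camera transform.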
For each AR image, the discrepancies between the grids in the endoscope image and in the 3D image were then measured at 8 different nodes: 4 in the center of the image and 4 at the borders.

Figure 2: Rapid-prototyped model of a human liver.

3. Results

Table I presents the uncertainty of the calibrated extrinsic parameters, computed as three times the standard deviation of the reprojection errors.

Table I: Calibration uncertainty.

          R(ωx, ωy, ωz) [radians]    T(X, Y, Z) [mm]
E_max     (0.003, 0.003, 0.005)      (1.1, 1.2, 1.1)
E_min     (0.002, 0.002, 0.004)      (0.7, 0.6, 0.7)
E_median  (0.002, 0.003, 0.004)      (0.9, 0.8, 0.8)

Fig. 3 depicts the error in pixels of each reprojected 2D point for the calibration yielding the lowest reprojection error. The different colors represent the different images used to perform the calibration [1].

Figure 3: Reprojection error of the extrinsic parameters for the best calibration.

Fig. 4 shows box plots of the misalignment of the superimposition between the endoscope image and the 3D image depicted in Fig. 5. The minimal and maximal values, as well as the lower quartile, median, and upper quartile, were computed for the central and border nodes.

Figure 4: Box plot of the misalignment (in mm) between the 3D and endoscopic images.

Figure 5: Image overlay of the AR framework.

4. Conclusion

We presented the accuracy evaluation and results of our AR framework. The endoscope image could be overlaid with a 3D image with a mean error of 3.5 mm ± 1.9 mm. Successful application of image overlay in laparoscopic IGS can potentially lead to better orientation for the surgeon, better identification of structures at risk, and better outcomes. In the future we aim to increase the accuracy of the image overlay and to provide a wider range of AR methodologies and techniques.

5. References

1. J.-Y. Bouguet, "Camera calibration toolbox for Matlab" (2010). [Online]. Available: http://www.vision.caltech.edu/bouguetj/calib_doc.
2. K.A. Gavaghan, M. Peterhans, T. Oliveira-Santos, S. Weber, "A portable image overlay projection device for computer-aided open liver surgery," IEEE Trans Biomed Eng, vol. 58, pp. 1855-1864, 2011.
3. S.A. Nicolau, L. Goffin, L. Soler, "A low cost and accurate guidance system for laparoscopic surgery: Validation on an abdominal phantom," Proceedings of the ACM Symposium on Virtual Reality Software and Technology (VRST '05), Monterey, CA, USA: ACM, 2005, pp. 124-133.
4. M. Peterhans, A. vom Berg, Dagon, D. Inderbitzin, C. Baur, D. Candinas, S. Weber, "A navigation system for open liver surgery: Design, workflow, and first clinical applications," IJMRCAS, vol. 7, no. 1, pp. 7-16, March 2011.
5. C. Simillis, V.A. Constantinides, P.P. Tekkis, A. Darzi, R. Lovegrove, L. Jiao, A. Antoniou, "Laparoscopic versus open hepatic resections for benign and malignant neoplasms: a meta-analysis," Surgery, 2007, pp. 203-211.
6. Z. Zhang, "A flexible new technique for camera calibration," IEEE Trans. Pattern Anal. Mach. Intell., vol. 22, no. 11, pp. 1330-1334, November 2000.
7. "Camera Calibration and 3D Reconstruction," OpenCV documentation. [Online]. Available: http://opencv.willowgarage.com/documentation/camera_calibration_and_3d_reconstruction.html.