Augmented Reality in Medicine

Review
https://doi.org/10.7599/hmr.2016.36.4.242
pISSN 1738-429X, eISSN 2234-4446

Ho-Gun Ha, Jaesung Hong
Department of Robotics Engineering, Daegu Gyeongbuk Institute of Science and Technology (DGIST), Daegu, South Korea

Augmented reality is popular in various fields, and the importance of the technology keeps increasing. Its medical applications have also been widely studied. In particular, augmented reality is a promising technique for surgery that requires great precision. This paper gives an overview of augmented reality and reviews its recent applications in medicine. After describing the basic concepts, the three technical components that make up augmented reality are briefly characterized. Several applications implemented in the authors' laboratory are then reviewed.

Key words: augmented reality; camera calibration; image registration

Corresponding Author: Jaesung Hong, Department of Robotics Engineering Building, DGIST, 333 Techno jungang-daero, Hyeonpung-myeon, Dalseong-gun, Daegu, 711-873, Republic of Korea. Tel: +82-53-785-6240, Fax: +82-53-785-6209, E-mail: jhong@dgist.ac.kr

Received 7 October 2016; Revised 19 October 2016; Accepted 24 October 2016

This is an Open Access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/3.0), which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.

INTRODUCTION

Augmented reality (AR) is used in many fields, including medicine, education, manufacturing, and entertainment. With advances in optics, computer systems, and surgical instruments, medical applications of AR are being vigorously researched. In particular, as surgery using laparoscopy, endoscopy, or catheterized intervention has increased, AR has come to play an important role in many medical applications [1-4].

AR denotes a technique that combines the real world with virtual objects, that is, digital content artificially generated by a computer [5]. Another aspect of AR is the registration between the real world and the virtual objects: the goal is to estimate the three-dimensional (3D) position of the virtual objects relative to the real world. AR therefore allows the user to see 3D virtual objects superimposed upon the real world. With the help of AR in medicine, a surgeon can see hidden organs inside the body and improve the perception of the treatment procedure while interacting with the real world. After a brief description of the three components of medical AR, its applications are presented.

TECHNOLOGY FOR AUGMENTED REALITY

AR in medicine mainly comprises three technical parts: camera calibration, patient registration, and object tracking [5-7].

1. Camera Calibration

Generally, real-world objects are captured by a camera and reproduced on a display. AR merges virtual objects with the real world, which requires a transformation between the camera and real-world coordinates. Before estimating this transformation, the characteristics of the camera must be defined. The pinhole model is a simple camera model that maps the 3D real world onto two-dimensional (2D) coordinates called the image plane. A 3D point is mapped onto the 2D image plane by translating it along a straight line towards the camera center until it intersects the image plane [8,9]. This mapping is called perspective projection, and the transformation between the image and real-world coordinates can be represented as a projection matrix. Camera calibration is thus the estimation of the projection matrix parameters for a pinhole model [10-12].
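
As a concrete illustration of this step, the sketch below estimates the intrinsic matrix and distortion coefficients of a pinhole camera with OpenCV from checkerboard images. It is not the calibration code used by the authors; the board size, square size, and image folder are illustrative assumptions.

    import glob
    import cv2
    import numpy as np

    # Checkerboard geometry (illustrative values, not from the paper)
    CORNERS = (9, 6)        # inner corners per row and column
    SQUARE_MM = 25.0        # square size in millimeters

    # 3D corner coordinates in the board's own frame (Z = 0 plane)
    objp = np.zeros((CORNERS[0] * CORNERS[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:CORNERS[0], 0:CORNERS[1]].T.reshape(-1, 2) * SQUARE_MM

    obj_points, img_points = [], []
    for path in glob.glob("calib_images/*.png"):    # hypothetical image folder
        gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        found, corners = cv2.findChessboardCorners(gray, CORNERS)
        if found:
            obj_points.append(objp)
            img_points.append(corners)

    # Estimate the intrinsic matrix K, distortion coefficients, and per-view poses
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_points, img_points, gray.shape[::-1], None, None)
    print("re-projection RMS error (pixels):", rms)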

2. Patient Registration

Patient data for preoperative planning are 3D volume data acquired by computed tomography (CT) or magnetic resonance imaging (MRI). Since these data provide a view of the internal anatomy and the target points for the surgeon, they must be registered to the patient in real-world coordinates, which is called patient registration [5]. Point-based registration is a reliable solution, in which registration is performed with fiducials affixed to the patient. One set, consisting of four or more fiducial points, is registered to another set of corresponding points using a rigid transformation. However, the accuracy of fiducial-based registration varies depending on the number of fiducials and the measurement quality of each fiducial position, as well as their spatial arrangement [2]. To improve registration accuracy, iterative closest point (ICP) based surface matching is often used in combination with point-based registration [13-15]. However, careful selection and collection of the 3D surface data is critical for the final accuracy, which is usually expressed in terms of target registration error (TRE).

3. Object Tracking

Object tracking estimates the spatial position of the camera or of markers attached to surgical instruments, and is an essential component of a medical AR system. In AR tracking, the position of an object relative to the camera is generally calculated. Given a calibrated camera with known intrinsic parameters, the relative position can be determined from three or more pairs of corresponding 3D and projected 2D points [11,12,16-18]. Combining these technologies, we can implement an AR system that overlays virtual objects on the endoscope or surgical microscope view.
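
To make the paired-point step concrete, the following sketch computes the least-squares rigid transformation between two corresponding fiducial sets with the standard SVD (Kabsch) solution and reports the fiducial registration error. It is a generic illustration rather than the authors' implementation, and the fiducial coordinates are made up; clinical accuracy is judged by TRE at targets away from the fiducials, not by this residual.

    import numpy as np

    def rigid_register(src, dst):
        """Least-squares rigid transform (R, t) mapping src fiducials onto dst.

        src, dst: (N, 3) arrays of corresponding fiducial positions, N >= 3.
        Returns R (3x3) and t (3,) such that dst ~ src @ R.T + t.
        """
        c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
        H = (src - c_src).T @ (dst - c_dst)                 # cross-covariance
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))              # guard against reflection
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = c_dst - R @ c_src
        return R, t

    # Hypothetical fiducials: image-space positions and the same points measured
    # on the patient (rotated 90 degrees about z and shifted, in millimeters).
    img_fids = np.array([[0, 0, 0], [50, 0, 0], [0, 50, 0], [0, 0, 50]], float)
    Rz = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float)
    pat_fids = img_fids @ Rz.T + np.array([10.0, 20.0, 5.0])

    R, t = rigid_register(img_fids, pat_fids)
    fre = np.linalg.norm(img_fids @ R.T + t - pat_fids, axis=1).mean()
    print("fiducial registration error (mm):", fre)
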
MEDICAL APPLICATIONS OF AUGMENTED REALITY

This section reviews four AR systems in medical applications: cardiac intervention, bone tumor resection, sinus surgery, and spinal surgery. These AR systems are major components of surgical navigation systems proposed by the authors' laboratory. We briefly explain the characteristics of each surgery, followed by the AR system configuration and results.

1. Cardiac Intervention

A surgical navigation system was proposed to guide chronic total occlusion intervention. Conventional intervention for chronic total occlusion of the coronary artery depends highly on 2D X-ray images and the surgeon's experience. Discrepancies in position or orientation between the patient and the acquired images can therefore produce large displacements in the surgeon's hand-eye coordination [2], which can lead to misidentification of the coronary artery or incorrect localization of the stenosis on the coronary artery [5]. The proposed system merged a 3D CT angiography model with the X-ray images to provide 3D anatomical information to the surgeon [19].

Fig. 1 shows the components of the cardiac intervention AR system. A commercial optical tracking system was used to track the location of markers attached to the patient and the C-arm device, and the different system coordinates were unified using transformation matrices.

Fig. 1. Proposed AR system configuration for cardiac intervention.

Fig. 2 shows the prototype software for the proposed system, which combines AR and virtual reality (VR). The CT angiography model is overlaid onto the X-ray image, and the VR images are positioned beside it. We expect that surgeons can easily understand anatomical information that is occluded in the original X-ray image, as well as the vascular anatomy and the relative instrument location, using the proposed prototype. The system can also minimize X-ray exposure and the injection of contrast medium, since less fluoroscopy is required than in conventional surgery.

Fig. 2. Prototype software for the proposed cardiac intervention AR system, including the computed tomographic angiography model overlaid onto the X-ray image and the VR image.

Although several challenges remain before the proposed navigation system can be applied clinically, it is a promising alternative to fluoroscopy-guided chronic total occlusion intervention.
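
A rough sketch of the kind of coordinate unification and overlay described above is shown below: tracker-reported poses are chained into a single patient-to-X-ray transform, and the CT angiography model is projected onto the image with calibrated intrinsics. All transforms, intrinsics, and model points here are placeholders, not values from the proposed system.

    import cv2
    import numpy as np

    def homogeneous(R, t):
        """Pack a rotation matrix and translation vector into a 4x4 transform."""
        T = np.eye(4)
        T[:3, :3], T[:3, 3] = R, t
        return T

    # Hypothetical tracker-reported poses (identity rotations, made-up offsets, mm)
    T_tracker_patient = homogeneous(np.eye(3), [0, 0, 500])   # patient marker pose
    T_tracker_carm = homogeneous(np.eye(3), [0, 0, 0])        # C-arm marker pose
    T_carm_xray = homogeneous(np.eye(3), [0, 0, 100])         # X-ray source vs. its marker

    # Chain the transforms: patient/CT coordinates -> X-ray source coordinates
    T_xray_patient = (np.linalg.inv(T_carm_xray)
                      @ np.linalg.inv(T_tracker_carm)
                      @ T_tracker_patient)

    # Project CT angiography model vertices (patient coordinates, mm) onto the image
    vessel_pts = (np.random.rand(100, 3) * 100).astype(np.float32)    # placeholder model
    K = np.array([[1200, 0, 512], [0, 1200, 512], [0, 0, 1]], float)  # assumed intrinsics
    rvec, _ = cv2.Rodrigues(T_xray_patient[:3, :3])
    tvec = T_xray_patient[:3, 3].reshape(3, 1)
    uv, _ = cv2.projectPoints(vessel_pts, rvec, tvec, K, None)
    # uv holds the 2D pixel positions at which the vessel model is drawn over the X-ray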

2. Bone Tumor Resection Surgery

An AR navigation system for bone tumor resection was proposed. Resection of a pelvic tumor is a major surgical challenge because blood vessels are complexly intertwined with nerves, and safe resection margins should be confirmed intraoperatively. An AR navigation system is helpful here, providing intuitive visualization of the resection margins [20].

The AR system for bone tumor resection surgery used the embedded camera of a tablet PC to track the patient and tools, and the AR visualization was also displayed on the tablet PC. Fig. 3 shows the proposed AR system configuration, which consists of just a tablet PC and multi-faced reference markers, with no external optical tracking system. The tablet camera tracks and recognizes the positions of the patient and tool reference markers. Registration between the patient and the 2D camera image was achieved using the paired-point registration method: the transformation aligning the two point sets was calculated after matching four to six anatomical and artificial markers. The relationship between the camera and the markers is then determined, based on this registration, using the perspective-n-point algorithm.

Fig. 3. Proposed AR system configuration for bone tumor resection surgery.

Fig. 4 shows a resection margin calculated from a reconstructed 3D tumor model and a captured scene from the bone tumor AR navigation. To build the resection margin, a dilation operation from image processing is applied to the reconstructed 3D tumor model, and the margin is represented as red contours. In Fig. 4(b), the 3D tumor, shown in green, is overlaid with its resection margin on the image from the tablet's embedded camera. In the experiment, bone tumor resection with a 10 mm margin was simulated on an artificial bone tumor made of cement. The mean safety margin was 12.28 mm for the conventional method and 10.26 mm for the proposed AR method. The margin sizes were significantly different (t-test, p < 0.05), with the AR method significantly closer to the desired 10 mm than the conventional method.

Fig. 4. Proposed AR system for bone tumor resection surgery: (a) resection margin of the 3D tumor model and (b) captured scene from the proposed AR system.
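
The resection margin construction can be illustrated with a simple morphological dilation of a voxelized tumor mask, in the spirit of the dilation operation mentioned above. The voxel spacing, margin, and toy tumor below are assumptions, and iterative 6-connected dilation only approximates a Euclidean margin.

    import numpy as np
    from scipy import ndimage

    def resection_margin(tumor_mask, margin_mm, voxel_mm):
        """Expand a binary tumor mask by margin_mm and return the margin shell."""
        steps = int(round(margin_mm / voxel_mm))           # number of dilation passes
        struct = ndimage.generate_binary_structure(3, 1)   # 6-connected neighborhood
        dilated = ndimage.binary_dilation(tumor_mask, structure=struct, iterations=steps)
        return dilated & ~tumor_mask                       # shell shown as the red contour

    # Toy example: a spherical "tumor" of radius 10 voxels in a 64^3 volume, 1 mm voxels
    zz, yy, xx = np.mgrid[:64, :64, :64]
    tumor = (zz - 32) ** 2 + (yy - 32) ** 2 + (xx - 32) ** 2 <= 10 ** 2
    shell = resection_margin(tumor, margin_mm=10.0, voxel_mm=1.0)
    print("margin shell voxels:", int(shell.sum()))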

3. Sinus Surgery

Sinus surgery is also performed endoscopically. The main problem is the difficulty of locating a surgical instrument relative to a specific structure seen through the endoscope [6]. Since the access route to the paranasal sinuses is complex, complications such as blindness and cerebrospinal fluid leak can occur from damage to the orbit and skull base. To address these problems, an integrated system was developed consisting of an AR-based surgical navigation system and an endoscope holder [21].

The proposed AR navigation system for sinus surgery is similar to those used for conventional surgery. It comprises three processes: patient image registration, camera calibration, and camera-based tracking, provided by paired-point registration, pinhole-model-based calibration, and the perspective-n-point algorithm, respectively. The endoscope holder consists of a 3-degrees-of-freedom (DOF) stackable parallel mechanism and a 2-DOF end-effector. The 3-DOF stackable parallel mechanism combines a five-bar linkage with two parallelograms, and the 2-DOF end-effector controls the endoscope position. The system also includes a brake to hold the endoscope at any location the surgeon desires [21].

Fig. 5 shows the proposed AR navigation system, with 2D multi-planar reconstruction (MPR) images (axial, coronal, and sagittal planes) alongside the AR and VR views. Warning and automatic transparency adjustment functions were also implemented: if the tip of the surgical instrument gets too close to a target, an alert sound is generated, and the transparency of the augmented objects is automatically changed according to the distance to the surgical instrument.

Fig. 5. Proposed AR-based surgical navigation system for sinus surgery.
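
The warning and transparency functions can be sketched as a simple distance-based rule like the one below; the thresholds and the linear distance-to-alpha mapping are illustrative guesses, since the paper does not give the exact function.

    import numpy as np

    ALERT_DISTANCE_MM = 3.0    # illustrative thresholds, not taken from the paper
    FADE_DISTANCE_MM = 30.0

    def overlay_state(tip_pos, target_pts):
        """Return (alpha, alert) for the augmented object given the instrument tip."""
        dist = np.linalg.norm(target_pts - tip_pos, axis=1).min()   # closest target point
        alert = dist < ALERT_DISTANCE_MM          # would trigger the alert sound
        # Map distance linearly to opacity so the overlay changes as the tip approaches
        alpha = float(np.clip(1.0 - dist / FADE_DISTANCE_MM, 0.1, 1.0))
        return alpha, alert

    target_surface = np.random.rand(500, 3) * 50.0     # placeholder target points (mm)
    alpha, alert = overlay_state(np.array([25.0, 25.0, 25.0]), target_surface)
    print("overlay alpha:", alpha, "alert:", alert)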

4. Spinal Surgery

For spinal surgery, the most important consideration is to correctly localize the surgical instrument within the patient's anatomy. AR-based surgical navigation systems, which help the surgeon recognize the patient's anatomical structures, are therefore widely accepted and have become an important research topic in this field [22]. However, although AR provides more intuitive visual information, inaccurate depth perception is a major issue. To improve depth perception, an integrated VR and AR system was proposed that displays both in a single window with aligned view axes and provides the distance between surgical instruments and target organs [18].

The proposed VR and AR switchable surgical navigation system consists of a position tracking section and a visualization section. In the position tracking section, shown in Fig. 6, the transformation between the camera and the patient is calculated from the optical tracker measurements and is updated in real time. The visualization section uses an open-source visualization library and a graphics processing unit (GPU) based depth peeling technique to display translucent objects [18].

Fig. 6. Configuration of the position tracking section in the proposed AR system (C: camera; O: patient or phantom; P: optical tracker (Polaris); CM: passive marker on the camera; PM: passive marker on the patient).

The user can switch from AR to VR by rotating the virtual camera around the target objects, which helps visualize the depth of the patient's anatomy. Fig. 7 shows the proposed VR and AR switchable surgical navigation system. When the virtual camera is positioned within the range of the real camera image, the surgical navigation system operates in AR mode; otherwise, it operates in VR mode. In addition, the depth, i.e., the minimum distance between the tip of a surgical instrument and the nearest point of the target, is also displayed on the screen.

Fig. 7. Proposed AR and VR switchable surgical navigation system for spinal surgery: (a) AR mode and (b) VR mode.
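
The displayed depth, i.e., the minimum tip-to-target distance, can be computed with a nearest-neighbor query, for example as in the sketch below; the target surface points and tip position are placeholders, not data from the proposed system.

    import numpy as np
    from scipy.spatial import cKDTree

    # Placeholder target-organ surface points in patient coordinates (mm)
    organ_surface = np.random.rand(2000, 3) * 80.0
    tree = cKDTree(organ_surface)

    def depth_readout(tip_patient):
        """Minimum distance (mm) from the instrument tip to the target surface."""
        dist, _ = tree.query(tip_patient)
        return float(dist)

    # Tip position from the optical tracker, already transformed into patient coordinates
    print("depth (mm):", depth_readout(np.array([40.0, 40.0, 40.0])))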

CONCLUSION

Surgery is shifting from open procedures to minimally invasive approaches. AR technology has great potential to assist this change and is becoming increasingly important. Its largest advantage is the ability to visualize regions of interest, such as tumors, blood vessels, and nerves, that are often invisible or obscured from direct vision. We believe AR will only become more ubiquitous in future medicine. Research on precise camera calibration and patient-to-image registration will help provide robust AR for clinical applications.

ACKNOWLEDGMENTS

This work was supported by the R&D Program of the DGIST Convergence Science Center (12-BD-0402); in part by the Health and Medical R&D Program of the Ministry of Health and Welfare of Korea (HI13C1634); by the Technology Innovation Program (10040097) funded by the Ministry of Trade, Industry and Energy, Republic of Korea (MOTIE, Korea); and by the Robot Industry Fusion Core Technology Development Project of the Ministry of Trade, Industry and Energy of Korea (No. 10052980).

REFERENCES

1. Azuma R, Baillot Y, Behringer R, Feiner S, Julier S, MacIntyre B. Recent advances in augmented reality. IEEE Computer Graphics and Applications 2001;21:34-47.
2. Sielhorst T, Feuerstein M, Navab N. Advanced medical displays: a literature review of augmented reality. Journal of Display Technology 2008;4:451-67.
3. Silva R, Oliveira JC, Giraldi GA. Introduction to augmented reality. National Laboratory for Scientific Computation; 2003.
4. Carmigniani J, Furht B, Anisetti M, Ceravolo P, Damiani E, Ivkovic M. Augmented reality technologies, systems and applications. Multimedia Tools and Applications 2011;51:341-77.
5. Fischer J, Neff M, Freudenstein D, Bartz D. Medical augmented reality based on commercial image guided surgery. In: Eurographics Symposium on Virtual Environments (EGVE); 2004:83-6.
6. De Paolis LT, Aloisio G. Augmented reality in minimally invasive surgery. In: Advances in Biomedical Sensing, Measurements, Instrumentation and Systems. Springer; 2010:305-20.
7. Lamata P, Freudenthal A, Cano A, Kalkofen D, Schmalstieg D, Naerum E, et al. Augmented reality for minimally invasive surgery: overview and some recent advances. INTECH Open Access Publisher; 2010.
8. Ha H, Bok Y, Joo K, Jung J, Kweon IS. Accurate camera calibration robust to defocus using a smartphone. In: Proceedings of the IEEE International Conference on Computer Vision; 2015:828-36.
9. Poling B. A tutorial on camera models.
10. Medioni G, Kang SB. Emerging topics in computer vision. Prentice Hall PTR; 2004.
11. Riba Pi E. Implementation of a 3D pose estimation algorithm. 2015.
12. Moreno D, Taubin G. Simple, accurate, and robust projector-camera calibration. In: 2012 Second International Conference on 3D Imaging, Modeling, Processing, Visualization & Transmission. IEEE; 2012:464-71.
13. Besl PJ, McKay ND. A method for registration of 3-D shapes. In: Robotics-DL Tentative. International Society for Optics and Photonics; 1992:586-606.
14. Zhang Z. Iterative point matching for registration of free-form curves and surfaces. International Journal of Computer Vision 1994;13:119-52.
15. Sharp GC, Lee SW, Wehe DK. Invariant features and the registration of rigid bodies. In: Proceedings of the 1999 IEEE International Conference on Robotics and Automation. IEEE; 1999:932-7.
16. Balachandran R, Fitzpatrick JM. Iterative solution for rigid-body point-based registration with anisotropic weighting. In: SPIE Medical Imaging. International Society for Optics and Photonics; 2009:72613D.
17. Burschka D, Mair E. Direct pose estimation with a monocular camera. In: International Workshop on Robot Vision. Springer; 2008:440-53.
18. Choi H, Cho B, Masamune K, Hashizume M, Hong J. An effective visualization technique for depth perception in augmented reality based surgical navigation. The International Journal of Medical Robotics and Computer Assisted Surgery 2015.
19. Jeon S, Hwangbo S, Hong J. A surgical navigation system to assist in chronic total occlusion intervention. In: URAI; 2016.
20. Choi H, Park Y, Cho H, Hong J. An augmented reality based simple navigation system for pelvic tumor resection. In: Proceedings of the American Academy of Orthopaedic Surgeons; 2016.
21. Lee S, Yoon H-S, Park J, Chung Y-S, Hong J, Yi B-J. A surgical navigation and endoscope holder integrated system for sinus surgery. In: Proceedings of the 11th Asian Conference on Computer Aided Surgery; 2015.
22. Jeon S, Kim J, Hong J. Surgical navigation system for assisting epiduroscopic laser neural decompression (ELND) procedure: its clinical application in 14 patients. In: Computer Assisted Radiology and Surgery; 2014.