Novel machine interface for scaled telesurgery


Novel machine interface for scaled telesurgery. S. Clanton, D. Wang, Y. Matsuoka, D. Shelton, G. Stetten. SPIE Medical Imaging, vol. 5367, pp. 697-704, San Diego, Feb. 2004.

A Novel Machine Interface for Scaled Telesurgery

Sam Clanton a,b,c, David Wang a,c, Yoky Matsuoka b, Damion Shelton b, and George Stetten a,b,c

a Department of Bioengineering, University of Pittsburgh, Pittsburgh PA 15261, USA; b Robotics Institute, Carnegie Mellon University, Pittsburgh PA 15213, USA; c University of Pittsburgh Medical Center, Pittsburgh PA 15261, USA

ABSTRACT

We have developed a system architecture that allows a surgeon to employ direct hand-eye coordination to conduct medical procedures in a remote microscopic environment. In this system, a scaled real-time video image of the workspace of a small robotic arm, taken from a surgical microscope camera, is visually superimposed on the natural workspace of a surgeon via a half-silvered mirror. The robot arm holds a small tool, such as a microsurgical needle holder or microsurgical forceps, and the surgeon grasps a second tool connected to a position encoder, in this case a second robot arm. The views of the local and remote environments are superimposed such that the tools in the two environments are visually merged. The position encoder and small robot arm are linked such that movement of the tool by the operator produces scaled-down movement of the small robot tool. To the surgeon, it seems that his or her hands and the tool he or she is holding are moving and interacting with the remote environment, which is in reality microscopic and at a distance. Our current work focuses on a position-controlled master-slave robot linkage with two 3-degree-of-freedom haptic devices, and we are pursuing a 6-to-7-degree-of-freedom master-slave linkage to produce more realistic interaction.

Keywords: surgery, registration, microscopic, telepresence, robot, image-guided, projection, virtual, haptics

1. INTRODUCTION

Robotic assistants are currently being introduced into surgery because they hold the promise of aiding or enhancing the capabilities of surgeons to perform more effectively in certain circumstances. One class of surgical assistant is designed to transfer the motions of a surgeon to a different location or effective modality. These are used to establish operator telepresence for a surgeon at a remote location, to allow procedures to be conducted less invasively, or to otherwise enhance surgical performance. The purpose of our research is to create a new human interface for such a system, one which allows an operator to interact more naturally with a workspace located at a distance and of arbitrary size. Our intent is, as much as possible, to make operating at a different location or scale as easy and natural as performing more traditional local surgery.

The work described in this paper builds on research we have conducted in the real-time superimposition of medical images with a natural view of the patient. The interface that we employ to create the illusion of telepresence is based on the Sonic Flashlight [1-3], a device that enhances the visualization of ultrasound data. The Sonic Flashlight combines an ultrasound transducer, display, and partially silvered mirror in a novel way to merge an ultrasound image of the interior of the patient with a natural view of the patient's exterior. The ultrasound image is reflected by the partially silvered mirror and thereby overlaid on the operator's direct view of the patient. The ultrasound image and natural view of the patient are combined in such a way that normal stereoscopic vision applies, and the images merge correctly regardless of the viewpoint of the observer.
Many approaches to merging medical images with natural sight rely on tracking the patient and observer in order to display the merged medical image at an appropriate angle and location. By strategically placing the mirror, transducer, and display, however, the need for tracking the patient and observer is eliminated. The image of the ultrasound slice, displayed at the correct size, can be reflected such that the virtual image of the interior is merged in the correct location within the patient. The ultrasound data appears to emanate from its actual location.

Further author information: send correspondence to Sam Clanton. E-mail: sclanton@oeic.net. Telephone: +1 412 268 8407.
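To make the geometric argument explicit (the notation here is ours, not from the original publication): a flat mirror places the virtual image of every display pixel at that pixel's mirror reflection,

    p_virtual = p_pixel - 2 ((p_pixel - p_m) . n) n,

where p_m is any point on the mirror plane and n is its unit normal. If the transducer, mirror, and display are rigidly arranged so that each reflected pixel lands on the corresponding point of the scanned slice inside the patient, this condition involves only the fixed geometry of the device; the observer's position never enters, which is why the merge holds from any viewpoint without tracking.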

For our current research, we extend the general approach of the Sonic Flashlight to create a system by which an operator can employ direct hand-eye coordination to interact with a remote environment, possibly at a different scale. In the Sonic Flashlight, an ultrasound image is registered with a direct view of the surface of the patient. In the new system, the image of a remote effector in the operating field of a patient or other workspace, at an arbitrary level of magnification, is merged with the direct view of a mock effector instrument held by the operator. A mock effector is a tool in the workspace of the operator linked to the motion of the actual remote effector. It combines an appropriately scaled version of the actual effector instrument with a manipulator handle designed for optimal use in the hand of the operator. The mock effector is electromechanically or otherwise linked to the actual effector such that motion of the mock instrument causes equivalent, scaled motion of the actual instrument in the remote workspace. Because the image of the real effector is merged with the operator's view of the mock effector and of his or her own hands, it appears to the operator as if he or she were interacting directly with the remote (and perhaps differently scaled) real effector environment. Given the eventual goal of integrating a force-feedback (haptic) interface into the instrument linkage, this setup will effectively become an immersive environment for performing remote interventional procedures (medical or otherwise), allowing the system operator to use the integrated senses of sight, touch, and proprioception to perform the remote procedures in a natural way. This approach is not limited to visible-light views of the remote environment; a system that combines other imaging modalities or overlays (such as real-time ultrasound, CT, MRI, or surgical plan diagrams) into the merged remote display can also be envisioned.

Figure 1. A schematic for the prototype remote manipulator system.

In our past work, our lab has demonstrated a simple version of our system with magnified ultrasound [4]. We have also implemented a system based on light microscopy (Figs. 2, 3) which featured the basic desired image-merge characteristics of the interface. Although the electromechanical linkage was not yet implemented in that version, the system produced the correct visual illusion of interaction with an environment at 40x magnification. Our current prototype, dubbed the Micropainter (Figs. 4-7), implements the interface again using light microscopy, but this time with computer-mediated electromechanical motion transfer. We chose to demonstrate the basic image-merge and motion-transfer capabilities of the system through an apparatus by which we could draw very small pictures remotely. Although this application is not directly relevant to medicine, it demonstrates the basic concepts. It is easy to envision substituting commonly used medical tools (e.g. forceps, scalpel, microsurgical needle holder) for the paintbrush in the system we have implemented.

Figure 2. First image merge experiment: local mock and remote real effectors. The effector, in this case a pulled glass micropipette, is shown piercing a caviar egg.

Figure 3. View from the operator's perspective of the first image merge experiment.

2. CONSTRUCTION OF THE INTERFACE SYSTEM

For this system, we employed two SensAble Technologies Phantom haptic interface devices as the input and effector devices. A Phantom Premium 1.5 operating passively with 3 degrees of freedom of joint-angle encoding was used as the input, and a Phantom Premium 1.0 operating with 3 active degrees of freedom was used to effect the motion of the real paintbrush. A video camera (VDI IR Pro) was attached to a Zeiss OPMI-1 surgical microscope, whose objective was placed over the workspace of the effector Phantom. A small paintbrush was attached to the end of the effector Phantom, and a piece of paper was placed within the reachable extent of the brush (Fig. 4). A small blob of tempera paint was placed on the paper.

Figure 4. The effector robot and attached instrument.

The digital video image taken from the surgical microscope was routed through a computer, which scaled the image appropriately for projection onto the mock effector workspace. The mock effector workspace, where all of the operator interaction occurred, consisted of a half-silvered mirror (34 x 23 cm) mounted 38 cm over a piece of paper placed within the reachable extent of the passive Premium 1.5 Phantom. The input Phantom was placed such that its tip was viewable by an operator standing in front of the half-silvered mirror and looking through it. An LCD screen (ViewSonic ViewPanel VP201) was mounted 38 cm above the half-silvered mirror, such that a projection of the image from the surgical microscope would effect the image merge essential to our immersion scheme (Fig. 5). By aligning the display, mirror, and mock effector precisely, the virtual image of the actual effector workspace merged with the operator workspace below the mirror occupied by the mock effector. The mock effector was placed at exactly the right distance from the mirror such that the illusion of immersion in the remote environment was valid from any angle of view for the operator or, for that matter, for multiple observers simultaneously. The image of the tip of the real effector and the operator's natural view of the mock effector were visually merged such that the appearance of working with the mock effector in the remote miniature environment was established. The active (effector) Phantom was set to move at 1/10 scale with respect to the motion of the mock effector actuated by the user, so, for example, a linear motion of 1 cm in the mock effector environment would cause a movement of 1 mm of the paintbrush in the real effector environment.
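Written out (our notation, assuming the simple rigid position link the text describes), the coupling with scale factor s = 0.1 is

    p_real(t) = p_real(0) + s (p_mock(t) - p_mock(0)),

so a 10 mm excursion of the mock effector commands s x 10 mm = 1 mm at the paintbrush. For the overlay to stay registered across the whole range of motion, the displayed magnification m of the microscope image must satisfy m x s = 1 (here m = 10), which is what the visual calibration described in the next paragraph adjusts.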

Figure 5. A view of the entire experimental setup.

This scale was convenient for the particular robotic and optical instruments employed in this proof of concept; however, the general system could be used at any particular difference of scale. The scale of the projection of the miniature (effector) environment was calibrated visually by adjusting the projected digital image size such that the image merge was maintained throughout the range of motion of the mock effector. The system was then used to perform Chinese calligraphy with a paintbrush (Fig. 7), enabling the user to paint very small characters, among other things, while giving the impression of painting much larger characters (roughly 20 cm square). Note the relative size of the penny to the drawing in Fig. 7. The image registration that is pictured was valid from any point of view.

It is interesting to note that the SensAble Technologies Phantom used in the system is normally employed as a haptic interface device rather than as an effector robot. To implement the scaled motion transfer with the Phantom, a PID controller was implemented to control the effector Phantom. Periodic procedures monitored the positions of the input and output instruments, and a third periodic procedure used the PID controller to apply a force to the output Phantom such that it would move to the correct scaled position. The PID parameters were tuned such that the output Phantom would quickly and accurately track the input Phantom; it consistently achieved a position within 1/2 mm of the correct scaled-down position of the input Phantom within the plane of drawing. Since only 3 degrees of freedom in total were available for manipulation of the robot, only position information about the tip location, without the tool orientation, could be transferred. Because the input and output devices were kinematically different and working at different scales, the orientation of the tools was skewed to some degree as they moved to the extents of their drawing planes. The image merge of the tool-tip positions in the plane of drawing was correct, but the out-of-plane location of the tools was slightly skewed at drawing locations away from center. The version of the system currently under development, which will employ a 7-degree-of-freedom output robot (see below), does not have this limitation.
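As an illustration of this kind of scaled PID position link, the following is a minimal, self-contained sketch; it is not the authors' code. Device I/O is replaced by a toy point-mass plant and a made-up master trajectory so that the loop runs on its own, and the gains are placeholders rather than the values tuned for the Micropainter.

    # Scaled master-slave position link with a PID force command (sketch).
    # The real system reads two Phantoms and commands force on one of them;
    # here the slave is simulated as a damped point mass so the loop runs.
    import numpy as np

    SCALE = 0.1                    # 1 cm at the master -> 1 mm at the effector
    DT = 0.001                     # 1 kHz servo period
    KP, KI, KD = 40.0, 5.0, 1.0    # illustrative gains, not the published ones
    MASS = 0.05                    # toy effector inertia (kg)

    def master_position(t):
        """Stand-in for reading the passive input Phantom tip (meters)."""
        return np.array([0.02 * np.sin(t), 0.02 * np.cos(t), 0.0])

    pos = np.zeros(3)              # simulated effector tip position
    vel = np.zeros(3)
    integral = np.zeros(3)
    prev_err = np.zeros(3)

    for step in range(5000):
        t = step * DT
        target = SCALE * master_position(t)     # scaled-down goal position
        err = target - pos
        integral += err * DT
        deriv = (err - prev_err) / DT
        force = KP * err + KI * integral + KD * deriv   # PID force command
        prev_err = err
        # Toy plant standing in for the effector Phantom dynamics.
        vel += (force / MASS) * DT
        vel *= 0.98                # crude viscous damping
        pos += vel * DT

    print("final tracking error (mm):", 1e3 * np.linalg.norm(target - pos))

Commanding a force at a fixed servo period, rather than commanding a position directly, matches how the paper describes driving the output Phantom.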

Figure 6. The view from below the half-silvered mirror.

3. DISCUSSION AND FUTURE DIRECTIONS

A photo of the system, taken from the point of view of the user, is shown in Fig. 7. Although the application of remote microscopic calligraphy lacks clinical relevance, the motion transfer, remote visualization, and immersive feel of the system are directly transferable to many fields where remote and/or differently scaled manipulation of an environment is desired. This system has possible applications in many areas of medicine, microbiology, and engineering. One can imagine a version in which forceps and needle-holder motions are transferred to perform microsurgery, in which an operator could experimentally manipulate individual cells with a robotically controlled micropipette, or in which a user could perform microscopic machine fabrication in an engineering context.

An important limitation of the current system is that the visual merge is only viewpoint-independent in the plane of the painting. For a tomographic imaging modality such as ultrasound, the visual merge is accurate at the instrument tip as long as the image contains it. For a non-tomographic modality such as light microscopy, the merge is only completely accurate at the focal plane of the microscope. One can compensate by adjusting the focal plane of the microscope as the effector tool moves toward or away from the microscope, which is equivalent to projecting a different plane of a tomographic modality. Our current system does not yet employ such a compensatory mechanism, but one is planned for future versions.

Currently, we are working on a six-degree-of-freedom version of the project, using a Barrett Technologies Whole Arm Manipulator as the effector robot (Fig. 8). A SensAble Technologies Phantom with an additional 3-degree-of-freedom end-tip orientation encoder will be used as the input device. With six degrees of freedom of input and output, the orientation of the actual effector, in addition to its position, can be made to match that of the mock effector. With such a system, more realistic interaction with a variety of tool types may be achieved while preserving tool orientation. We are also pursuing a mechanism that corrects the out-of-plane skew of the current system; to do this, the focal plane of the microscope would be adjusted as the tool is moved toward and away from the objective. In an actual clinical tool, of course, we would greatly reduce the size of the manipulator system in order to interact with much smaller environments.
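This compensation is not yet implemented in the system described above; the following is a rough sketch of the planned behavior under the assumption of a motorized fine-focus axis, with set_focus_depth standing in for whatever interface a real microscope controller would expose (all names here are hypothetical).

    # Keep the microscope focal plane at the depth of the effector tip,
    # so the projected image always shows the plane containing the tool.
    def set_focus_depth(z_mm):
        print(f"(stub) driving fine focus to z = {z_mm:.2f} mm")

    def compensate_focus(tip_z_mm, focal_plane_z_mm, tolerance_mm=0.1):
        """Re-focus only when the tip has left the current focal plane."""
        if abs(tip_z_mm - focal_plane_z_mm) > tolerance_mm:
            focal_plane_z_mm = tip_z_mm
            set_focus_depth(focal_plane_z_mm)
        return focal_plane_z_mm

    # Example: the tip descends 0.5 mm toward the paper; focus follows it.
    focal_plane = 0.0
    for tip_z in (0.0, -0.2, -0.5):
        focal_plane = compensate_focus(tip_z, focal_plane)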

Figure 7. From the point of view of the operator.

Figure 8. The Whole Arm Manipulator, which has 7 total degrees of freedom.

REFERENCES

1. G. Stetten and V. Chib, "Overlaying ultrasound images on direct vision," Journal of Ultrasound in Medicine 20(3), pp. 235-240, 2001.
2. G. Stetten, A. Cois, W. Chang, D. Shelton, R. Tamburo, J. Castellucci, and O. von Ramm, "C-mode virtual image display for a matrix array ultrasound sonic flashlight," in MICCAI 2003, Lecture Notes in Computer Science 2879, pp. 336-343, Springer-Verlag, 2003.
3. G. Stetten, "System and method for location merging of real-time tomographic slice images with human vision," U.S. Patent no. 6,599,247, 2003.
4. G. Stetten and V. Chib, "Magnified real-time tomographic reflection," in MICCAI 2001, Lecture Notes in Computer Science 2208, pp. 683-690, Springer-Verlag, 2001.