VR for Microsurgery
Design Document

Team: May1702
Client: Dr. Ben-Shlomo
Advisor: Dr. Keren
Email: med-vr@iastate.edu
Website:

Team Members/Roles:
Maggie Hollander - Leader
Eric Edwards - Communication Leader
Sam Glancy - Webmaster/Communications
David Weinstein - Webmaster
David Robideau - Key Concept Holder
Kyle Rohlfing - Key Concept Holder

Revised: 10/23/16

Contents

1 Introduction
  1.1 Project Statement
  1.2 Purpose
  1.3 Goals
2 Deliverables
3 Design
  3.1 System Specifications
    3.1.1 Non-functional
    3.1.2 Functional
  3.2 Proposed Design/Method
  3.3 Design Analysis
4 Testing/Development
  4.1 Interface Specifications
  4.2 Hardware/Software
  4.3 Process
5 Results
6 Conclusions
7 References

1 Introduction

1.1 PROJECT STATEMENT

The goal of the VR for Microsurgery team is to create an interactive microsurgery simulation in a virtual reality environment. We will use the HTC Vive virtual reality headset to display a fully interactive simulation to the user. Users will be able to train themselves on a microsurgical procedure as well as run test simulations that assess their performance. These simulations will allow a surgeon in training to develop the skills and practice the techniques necessary for performing microsurgical procedures while immersed in a realistic, interactive virtual environment.

1.2 PURPOSE

Currently, most microsurgical training falls into one of two camps. The first is training performed in a physical environment. This type of training is highly common and typically involves the trainee practicing and performing microsurgical procedures under an operating microscope, generally on mannequins or non-living animals. Training this way has both pros and cons. On the positive side, it allows trainees to practice with a real operating microscope and real microsurgical instruments, the same kind they will use in the field. It also gives the trainee the tactile sensation of interacting with real or synthetic blood vessels, nerves, and tissue. On the other hand, training in a physical environment has drawbacks, most notably cost: operating microscopes are incredibly expensive, as are mannequins. This sort of physical training is also limited in its ability to give a trainee feedback or let them review their performance.

The second type of microsurgical training is conducted in a virtual environment. This is the type of training our team plans to develop. While various virtual training simulators have been developed in the past, they have suffered from significant flaws that limited their effectiveness, flaws we intend to overcome. The first and most significant is a lack of realism: until very recently, the technology did not exist to render a realistic surgical environment while effectively receiving input from a user and providing realistic feedback. The second problem many previous virtual reality simulators faced was cost. Until recently, programmable virtual reality devices were not commercially available at large scale, and developers were frequently forced to build their own tools, adding to the cost.

We feel we are in a unique position to overcome many of the issues other forms of microsurgical training face. For starters, with the advent of programmable virtual reality headsets like the HTC Vive, the cost of creating a virtual training simulator has dropped

dramatically in just the last few years. On top of this, huge strides have been made recently in the ability to create and render near photo-real environments in a virtual space, and computational power has increased to the point that interacting with these environments requires far less specialized equipment. Our product would also solve a problem most hands-on training environments face, wherein getting feedback and reviewing a trainee's performance is very difficult. In our system, instructors will be able to watch a trainee's procedure in real time on an external display as well as review a recorded version of the procedure. The system will also allow users to test themselves by performing a procedure and receiving graded feedback. All of these benefits will make our tool both cost effective and useful for training microsurgical procedures.

1.3 GOALS

1) Create an interactive microsurgery simulator in a virtual reality environment
2) Provide a framework that allows implementation of microsurgery training lessons for a variety of surgical procedures
3) Develop repeatable test simulations that provide feedback for surgeons on their performance during a procedure simulation

2 Deliverables

2.1 FIRST SEMESTER DELIVERABLES

- Interactive virtual environment
- All models created/obtained
- All models and elements placed in the virtual environment
- Environment can be interacted with while using the HTC Vive headset and controllers
- Design diagram of how the simulator will be implemented
- Finalized project plan
- Finalized design document

2.2 SECOND SEMESTER DELIVERABLES

- One complex operation implemented from start to finish
- UI accessible and functional from main menu to end of simulation
- Performance metric system implemented
- User-friendly system in place to practice short, isolated procedures under varying conditions (e.g., practicing a specific incision with more relaxed accuracy judgement)
- Second user perspective allowed during operations/replays
- Updated design document
- Instructor review functionality implemented
- Testing completed


3 Design

3.1 SYSTEM SPECIFICATIONS

The system we are designing aims to create a virtual environment capable of assisting in the training of microsurgical procedures. For our project, the user will be able to perform a complex microsurgical procedure on a virtual eyeball from start to finish using an HTC Vive. The system will walk the user through a simulated cataract surgery. The user will interact with accurately modeled eyes, microsurgical tools, and a surgical microscope. While one user is in the simulation, a second user will be able to view or replay the simulation from a separate perspective in order to review and evaluate the procedure. Simulations will also implement a performance metric system to give real-time and post-simulation grades and evaluations of the simulated procedure.

3.1.1 Non-functional

Due to the intensity of VR simulations, certain resource, timing, and usability constraints must be met by the system:

- Minimize motion sickness with a consistently high output frame rate (> 90 fps); see the monitoring sketch after this subsection
- Adhere to realistic procedure timelines to simulate real-world operating pressures
- Deliver haptic feedback at appropriate times; early or late haptics can lead to confusion and improper reflex building
- Reconstruct and replay recorded procedures in a timely manner to encourage performance reviews and metrics

3.1.2 Functional

In order to exceed the capabilities of other existing solutions while maintaining low cost and ease of access, this system must meet several requirements, namely:

- Accurately simulate a full operating environment including tools, lighting, and patient monitors
- Recreate a common microsurgery procedure with timing constraints, limited available resources, and realistic failure consequences
- Provide an intuitive recording/playback environment for grading and performance metrics
- Deliver realistic haptic feedback to help build muscle memory
- Create an extendable platform for microsurgical training on multiple different organs
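To make the 90 fps requirement measurable during development, a small Unity script along the following lines could flag any frame that blows the per-frame budget. This is a minimal sketch rather than part of the current implementation: the 90 fps threshold comes from the requirement above, while the class and field names are placeholders of our own choosing.

```csharp
using UnityEngine;

// Illustrative sketch only: flags frames that exceed the ~11.1 ms budget
// implied by the 90 fps non-functional requirement. Names are placeholders.
public class FrameRateMonitor : MonoBehaviour
{
    // Target frame rate from the non-functional requirements.
    public float targetFps = 90f;

    void Update()
    {
        // unscaledDeltaTime ignores any time-scale changes (e.g. paused menus).
        float frameMs = Time.unscaledDeltaTime * 1000f;
        float budgetMs = 1000f / targetFps;

        if (frameMs > budgetMs)
        {
            Debug.LogWarning(string.Format(
                "Frame took {0:F1} ms (budget {1:F1} ms) - below the {2} fps comfort target.",
                frameMs, budgetMs, targetFps));
        }
    }
}
```

Attached to any object in the test scene, a monitor like this gives a running log of dropped frames that can be reviewed alongside the other performance metrics.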

3.2 PROPOSED DESIGN/METHOD

For the design of this project we have decided to focus on creating a simulation for one type of microsurgery, namely cataract surgery, as well as developing the systems surrounding the surgery simulation. These systems include the ability to interact effectively and believably with the models in the environment, to record and replay a previously attempted simulation, and to automatically grade and evaluate a user's performance of a procedure, along with an intuitive user interface for the system as a whole.

3.3 DESIGN ANALYSIS

So far we have been working to develop the systems and models that the user will interact with during the simulation. After some trial and error, and multiple concept redesigns, we believe we are at a point where we can effectively interact with our eyeball model and allow its sections to have realistic physics in order to give the user accurate feedback. While we foresee minor hurdles in how realistically we can get the model to behave, we believe our current implementation will ultimately be effective.

Beyond the progress we have made on interacting with our models, we have also created or purchased nearly all of the models we will use for our simulations, including the eyeball model we will operate on, the tools used to perform the operation, and other items that exist inside the surgery room. As work continues we will develop the remaining models we need and work to effectively rig and interact with our existing models.

We have also developed the system we will use to record and replay a procedure. We still need to improve the performance of the system while recording, and we need to integrate the recording and replay abilities into our user interface; a sketch of how such recording and playback could be structured follows this section.

Finally, we have made strides in the user interface we are developing for the system. Parts of the UI have already been developed, including the menu for changing the tool a user is holding during a procedure and the home screen for the system. As we progress we will continue to implement UI features to interact with the systems we develop.
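As an illustration of the recording and replay idea described above, the sketch below samples the pose of one tracked tool every frame while recording and steps back through those samples on playback. It is a minimal sketch under assumptions of our own choosing (a single tracked tool, in-memory storage, and the placeholder names ToolRecorder and PoseSample); the actual system records a full procedure and will need to down-sample or serialize data to keep recording performance acceptable.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch of per-frame pose recording and playback for a single tracked tool.
// Names and scope are illustrative; the real recorder covers the full scene.
public class ToolRecorder : MonoBehaviour
{
    struct PoseSample
    {
        public float time;
        public Vector3 position;
        public Quaternion rotation;
    }

    public Transform trackedTool;   // e.g. the tool model following a Vive controller
    public bool isRecording;
    public bool isReplaying;

    private readonly List<PoseSample> samples = new List<PoseSample>();
    private float playbackStart;
    private int playbackIndex;

    void Update()
    {
        if (isRecording)
        {
            // Capture one pose per rendered frame while recording.
            samples.Add(new PoseSample
            {
                time = Time.time,
                position = trackedTool.position,
                rotation = trackedTool.rotation
            });
        }
        else if (isReplaying && samples.Count > 0)
        {
            // Advance through the recorded samples in real time.
            float elapsed = Time.time - playbackStart;
            while (playbackIndex < samples.Count - 1 &&
                   samples[playbackIndex + 1].time - samples[0].time <= elapsed)
            {
                playbackIndex++;
            }
            trackedTool.position = samples[playbackIndex].position;
            trackedTool.rotation = samples[playbackIndex].rotation;
        }
    }

    public void StartReplay()
    {
        isRecording = false;
        isReplaying = true;
        playbackStart = Time.time;
        playbackIndex = 0;
    }
}
```

A replay driven this way can be rendered from a second camera, which is how the instructor perspective described in the deliverables could reuse the same data.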

4 Testing/Development

4.1 INTERFACE SPECIFICATIONS

The hardware/software interface specifications for this project will be determined with our advisor and client.

4.2 HARDWARE/SOFTWARE

For our testing phase we will be using the computer stations inside the Virtual Reality Applications Center. These computers have the graphics capabilities to run the programs we will rely on throughout the project. Also inside the Virtual Reality Applications Center, we will be using the HTC Vive for testing. The Vive gives us the opportunity to see how our product operates in a real environment and the ability to fix problems on site. We will be using Unity3D to run our product on the HTC Vive. Many of our test cases will rely on running in the virtual reality space through the HTC Vive and fixing any errors inside of Unity.

4.3 PROCESS

The majority of our testing will focus on the ability to provide life-like, realistic training for surgeons. The feel and the look of the operating room will be tested so that the surgeon has the sense of being in an actual operating room. Different failure and success scenarios will be tested by using the Vive to conduct an operation. Throughout the operation, making sure that correct cuts produce the correct outcomes will be a large part of what makes this project successful; teaching incorrect cuts for cataract surgery could prove disastrous in a real-world situation. Thorough testing of the recording/playback environment for grading and performance metrics will also be completed through the Vive.

Extensive testing of the location of cutting, stitching, and other surgical actions will be a major concern for us. Accurate locations are needed for grading the surgeon and for the overall success of the operations. These tests will also be conducted using the HTC Vive by performing surgical actions on a body part and then comparing the position the Vive reports to Unity3D against where we believe the actual location is; a sketch of such a location-accuracy check follows this section. The overall testing process will largely focus on how smooth the operation feels in the HTC Vive environment and how much information we can track and return as feedback to help the surgeon improve.
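One simple form the cut-location check could take is to compare the point where the user's blade contacted the eye against a predefined target point and score by distance. The sketch below is illustrative only: the radii, the linear fall-off, and the 0-100 scale are placeholder values, not the team's actual grading tolerances.

```csharp
using UnityEngine;

// Illustrative accuracy check for a single incision: distance from the
// intended incision point determines a 0-100 score. Tolerances are placeholders.
public static class IncisionGrader
{
    // Full marks inside this radius (meters in eye-model space).
    const float fullMarkRadius = 0.0005f;
    // Zero marks beyond this radius.
    const float failRadius = 0.003f;

    public static float Score(Vector3 actualCutPoint, Vector3 targetCutPoint)
    {
        float error = Vector3.Distance(actualCutPoint, targetCutPoint);

        if (error <= fullMarkRadius) return 100f;
        if (error >= failRadius) return 0f;

        // Linear fall-off between the two tolerances.
        float t = (error - fullMarkRadius) / (failRadius - fullMarkRadius);
        return Mathf.Lerp(100f, 0f, t);
    }
}
```

During testing, the actual cut point would come from the tracked tool's contact with the eye model, and the target point from the procedure definition, so the same check serves both automated tests and in-simulation grading.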

5 Results

This semester we have tried many different approaches to accomplishing our goal, and we have successfully implemented a few utilities in our project that we will continue to expand and improve next semester. Our biggest roadblock has been finding the best way to simulate cutting the eye in virtual reality so that it seems realistic and does not constrain the user too much. We knew this part of the project would be difficult, but while talking with our adviser and the two graduate students who have helped guide us, it became much more apparent just how difficult it actually is. Our initial plan was to implement the ability to cut the eye in real time and manipulate the pieces at the same time. After some initial testing, we found this may be too difficult. We talked with the graduate students, and they set us up to meet with Vijay, a faculty member working in the VRAC department. We explained our problem to him as best we could and discussed using volumetric data, the area he specializes in, to approach it. He informed us that going that route would require devoting far more time to the project than we currently have available, even working until May. This was clearly not the best solution for our project, so we had to go back to the drawing board.

After this meeting we found an asset in the Unity store that we hoped would accomplish our goal without much additional work. However, after experimenting with it, we found that it is still too immature to be what we need for cutting the eye, although it may be useful for the liquids involved. We are currently researching another asset that we believe may be better suited to cutting. In the meantime, we have decided to continue working with the tools we already have in order to cut the eye in an acceptable manner.

Other features we have been working to simulate in the environment include the use of a microscope in microsurgery; tools modeled in Blender that mimic the tools doctors use in surgery, which are then imported into the Unity scene; the ability to replay past training sessions; a home page from which the user can choose which feature to use; and a multi-user space where an instructor and trainee may work together. All of these have been going fairly smoothly and should not take much more time to finish. Another larger goal we hope to include by the end of the project is the ability to measure how well a trainee performs the surgery against a metric-based system.
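For the metric-based system mentioned above, one possible shape is a scorecard that combines individual metric scores into a weighted overall grade. The categories and weights in this sketch are placeholders chosen to show the structure, not the team's final rubric.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch of combining individual metric scores into one overall grade.
// Categories and weights are illustrative placeholders.
public class ProcedureScorecard
{
    private readonly Dictionary<string, float> weights = new Dictionary<string, float>
    {
        { "incisionAccuracy", 0.5f },
        { "completionTime",   0.3f },
        { "toolHandling",     0.2f }
    };

    private readonly Dictionary<string, float> scores = new Dictionary<string, float>();

    public void Report(string category, float score0to100)
    {
        scores[category] = Mathf.Clamp(score0to100, 0f, 100f);
    }

    public float OverallGrade()
    {
        float total = 0f;
        foreach (var entry in weights)
        {
            float s;
            scores.TryGetValue(entry.Key, out s); // missing categories count as zero
            total += entry.Value * s;
        }
        return total;
    }
}
```

An end-of-procedure summary screen or the instructor view could call OverallGrade() to present the final score alongside the per-category breakdown.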

6 Conclusions

Throughout the last few months, we have been putting together the beginnings of our training system. This has brought to light several issues we will need to solve, including accurately manipulating meshes as if they were real-world objects. The Unity game engine has many restrictions and limitations when trying to simulate this kind of interaction. However, because we began working early, we have been able to understand the pitfalls and shortcomings of the engine. We created a temporary environment in which the surgery will take place, started to implement the physics involved in performing a surgery, implemented the beginnings of a recording system, created or acquired realistic models for the organs and tools used in the surgery, and added a simple microscope system. With all of these pieces, we will be able to build a microsurgery simulation that is accurate and educational for anyone practicing microsurgery. This system will allow users to understand the process involved and practice their skills in an environment that is far more accessible and reliable than anything available in the past.

With what we have done so far, putting all of these pieces together will give us a basic system and a proof of concept confirming that our approach can and will work. By taking each of the major features of our system one by one and implementing a proof of concept for it, we can solve the large majority of problems early on and then focus on making those features more polished and faithful to the surgery. By taking this approach, we can address any major issues early and still have enough time to research and devise a plan to get around those issues should they arise.

7 References

None