Hand Tracking and Visualization in a Virtual Reality Simulation


Charles R. Cameron, Louis W. DiValentin, Rohini Manaktala, Adam C. McElhaney, Christopher H. Nostrand, Owen J. Quinlan, Lauren N. Sharpe, Adam C. Slagle, Charles D. Wood, Yang Yang Zheng, and Gregory J. Gerling, Member, IEEE

Abstract - Tracking a user's hand for 3D rendering and visualization creates a sense of presence in virtual reality environments. At present, tracking devices built for both research and consumer use are increasingly accessible, with ever-improving spatiotemporal accuracy. This work contributes a new design that offers the ability to track both the fingers and the palm. The hand tracking method described herein ties absolute tracking of the user's palm to relative tracking of the individual fingers. A virtually rendered image of the movements of the user's hand is displayed in near real time in a virtual environment developed with the X3D ISO standard for representing 3D graphics. The tracking system was verified using experiments designed to confirm the accuracy and usability of the device. Experiment 1 tested the absolute positioning capability by tracking the subject's palm. Experiment 2 tested the relative positioning capability by tracking the subject's fingers. The results of the experiments indicate that the tracking component of the virtual reality system is able to accurately detect subjects' interaction with objects in the virtual environment.

Manuscript received April 4, 2011. C. R. Cameron, A. C. McElhaney, and A. C. Slagle are fourth-year students in Mechanical Engineering at the University of Virginia, Charlottesville, VA 22904 (email: crc9s@virginia.edu, acm9g, acs9p). L. W. DiValentin, O. J. Quinlan, L. N. Sharpe, and C. D. Wood are fourth-year students in Systems and Information Engineering at the University of Virginia, Charlottesville, VA 22904 (email: lwd5d@virginia.edu, ojq5a, lns3c, cdw4j). R. Manaktala is a fourth-year student in Biomedical Engineering at the University of Virginia, Charlottesville, VA 22904 (email: rm9xe@virginia.edu). C. H. Nostrand and Y. Y. Zheng are fourth-year students in Computer Science at the University of Virginia, Charlottesville, VA 22904 (email: chn8sh@virginia.edu, yyz5w). G. J. Gerling is an assistant professor in SIE at the University of Virginia, Charlottesville, VA 22904 (e-mail: gg7th@virginia.edu).

I. MOTIVATION

Central to the successful impression of user realism in a virtual reality environment is the extent to which a user feels immersed in the experience. Immersion is a central topic of simulation design that has been studied extensively [1-3]. A significant factor in immersion is the accurate and timely tracking and representation of a user's hand in the virtual environment. If the user can see a virtual rendering of his or her hand and its movements relative to the movements of other objects, there is a much better chance that the user will feel that the virtual hand embodies his or her intentions.

Currently, several means of tracking the hand and body are found on the market, such as the CyberGlove II, 5DT Data Glove Ultra, and Microsoft Kinect, though each is limited in scope of usage to one extent or another, several are quite costly, and others are difficult to integrate with virtual environments that use graphics and haptics rendering engines. A low-cost, adaptable, and accurate tracking solution will lead to a more intuitive and effective virtual environment.
II. BACKGROUND

The field of motion tracking has recently witnessed increased attention. Several companies have developed products used by CGI animators, video game developers, and sign language specialists to accurately track the position of a user's hand or body. Devices vary in their ability to track increasing degrees of freedom and to track with higher spatial and temporal accuracy, from gloves with one sensor per finger that measure finger flexure [4] to those with 40 sensors covering the entire hand [5]. The Acceleglove, for instance, is a glove that tracks the acceleration of the individual fingers. It uses six 3-axis micro-electro-mechanical systems (MEMS) accelerometers to capture the relative position of the palm and each of the fingertips [6]. These commercial products can provide a great deal of accuracy. The CyberGlove II, for instance, has flexure sensors that are accurate to within 1 degree [7].

While existing systems are adept at tracking the position of the fingers, they fall short in their ability to determine the absolute position of the hand. However, individual components that could be combined to develop a more accurate hand tracking system currently exist. Flock of Birds is an absolute tracking system that uses a magnetic field to track a sensor with six degrees of freedom (X, Y, Z, pitch, roll, yaw) [8].

In the video game industry specifically, motion tracking has been introduced in new peripheral input devices. Each of the major console developers (Microsoft, Nintendo, and Sony) has created a device (Kinect, Wii, and PlayStation Move, respectively) that allows the player to move a controller or their body and have the game respond appropriately. The Wii and PlayStation Move can each track only a single point within a small field (3-8 feet) [9], limiting their applicability to more complex tasks. The Kinect, which uses 3D motion imaging [10], can track multiple degrees of freedom of a user's entire body without an additional controller, which could greatly increase the number and complexity of tasks that can be tracked with similar hardware. Currently, however, these devices cannot track the hand and fingers to the spatial accuracy desired at the level of individual fingers when the user is situated close to the display, and others show significant temporal delay.

Research demonstrates the need for real-time performance coupled with acceptable realism and presence in a simulation [11]. However, an increase in the realism of a simulation often necessitates a decrease in performance time (and vice versa) due to limited computing power. By using less complicated, but still sufficiently accurate, tracking devices and techniques, it should be possible to maintain acceptable levels of presence, realism, and performance time.

III. HIGH LEVEL DESCRIPTION

In order to develop a simulator, two main components had to be created. One is a glove that can both track the position of the user's hand in real time and provide feedback to the user when an event occurs in the simulation. The other is the virtual environment in which the user's hand is displayed and in which tasks can be assigned and carried out. Equally important to the system is the flow of data between the glove and the environment. Since various components of the system have different capabilities and limitations, a data transfer system is required to integrate the many sources of data and combine them to work cooperatively in the virtual environment. The interaction between the user and the virtual environment also requires accurate collision detection. Rapid data transfer, response time, and refresh rates provide the user with realistic interactions in the virtual environment. These interactions must occur in real time to give the user the sense that the simulation is real; otherwise the user will not be sufficiently immersed in the environment. The virtual environment (VE) was developed using the H3D API and implements low-level data communication using Python and C#. By detecting when collisions have occurred within the VE, the user is provided with the appropriate force response.

IV. REQUIREMENTS

A number of requirements were developed to ensure a tracking and visualization system that is an effective component of a virtual reality simulation.

1. The device shall have absolute tracking
   a. The device shall have absolute tracking within a 3 meter boundary radius
   b. The device shall be accurate to within 1 cm
   c. The device shall track in three dimensions
2. The device shall have rotational tracking
   a. The device shall show the rotation of the yaw
   b. The device shall show the rotation of the pitch
   c. The device shall show the rotation of the roll
   d. The device shall rotate 360 degrees
3. The device shall have relative tracking of the fingers with 180 degrees of motion

V. METHODS

A. Overview

To track the 3D position of the fingers and palm of the hand, an instrumented glove was developed containing multiple devices. These include the absolute tracking device, Flock of Birds, and the finger-flexure-measuring Flex Sensors.

Fig. 1. (a) Flock of Birds. (b) The box on the left of the monitor is the transmitter, and the device placed on top of the palm is the sensor.

B. Devices

1) Flock of Birds

Flock of Birds (FOB) (Ascension Technology Corporation) is a six-degree-of-freedom motion tracking system that measures the position and orientation of sensors through a transmitter. It can simultaneously track and measure all of its sensors by transmitting a pulsed direct-current magnetic field. When a sensor is within ±1.22 m of its transmitter, the sensor can make from 22 to 144 measurements per second of its current position and orientation [12]. It has an angular range of ±180 degrees in azimuth and roll and ±90 degrees in elevation. In addition, it has a static positional accuracy of 2.13 cm root mean square (RMS) and a static angular accuracy of 0.5 degree RMS, with a positional resolution of 9.14 cm at 3.66 m and an angular resolution of 0.1 degree RMS at 3.66 m. Data output from a transmitter may be delayed up to 2 milliseconds. FOB provides the values for orientation and rotation in several different formats, including Euler angles, rotation matrices, and quaternions. It can be configured to use a single transmitter and sensor or a complex combination of transmitters and sensors, making it fit well with different types of applications.

We use FOB with a single transmitter and sensor to constantly track the current position and orientation of the glove so that it can be properly simulated and displayed in the VE. We use a C++ program to extract the values of the sensor's position and orientation from the FOB. Because the coordinate systems of the FOB and our VE differ, the position and rotation values from the FOB must be converted into values that reflect the VE's coordinate system. For the position values, we swap their axes and then negate and scale them. For the rotation values, we obtain the rotation matrix from the FOB and use it to switch axes.
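The paper does not publish the specific axis swap, negations, or scale factor used in this conversion, so the following is only a minimal sketch of the general approach: an assumed signed-permutation matrix maps FOB axes onto VE axes, positions are additionally scaled, and rotation matrices are conjugated by the same matrix to express them in the VE frame. All constants and function names here are illustrative, and NumPy is used purely for convenience.

```python
import numpy as np

# Assumed FOB-to-VE axis mapping (a signed permutation); the actual mapping
# used by the authors' C++/Python pipeline is not given in the paper.
FOB_TO_VE = np.array([[ 0.0, 1.0,  0.0],
                      [ 0.0, 0.0, -1.0],
                      [-1.0, 0.0,  0.0]])
POSITION_SCALE = 0.01  # assumed scale from FOB units to VE units

def fob_position_to_ve(p_fob):
    """Swap/negate the axes of a FOB position vector, then scale it."""
    return POSITION_SCALE * (FOB_TO_VE @ np.asarray(p_fob, dtype=float))

def fob_rotation_to_ve(R_fob):
    """Re-express a FOB rotation matrix in the VE frame by conjugating it
    with the axis-mapping matrix."""
    R_fob = np.asarray(R_fob, dtype=float)
    return FOB_TO_VE @ R_fob @ FOB_TO_VE.T
```

Conjugating the rotation matrix (rather than permuting its columns alone) keeps the result a valid rotation regardless of which axes are negated.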

2) Flex Sensors

Flex sensors (Spectra Symbol, Flex Sensor 11.43 cm (SEN-08606), Salt Lake City, UT) monitor finger flexion. These sensors are only 0.64 cm wide and 11.43 cm long, allowing them to lie directly along the finger unobstructed. As a sensor is flexed, the resistance across it increases, allowing the increase in resistance to be related to the bending angle. Because they measure only one axis of bending, the sensors are very dependable and less prone to noise than accelerometers. Inside the flex sensors are carbon resistive elements on a thin flexible substrate; when the substrate is bent, the sensor produces a resistance directly related to the bend radius.

In a prototype glove, five flex sensors, one for each finger of a hand, are wired as variable analog voltage dividers read by an Arduino microcontroller circuit board (ATmega1280). The Arduino reads a variable voltage from 0 to 5 volts based on the bending of the finger. This reading, represented as a single byte, is then scaled to the proximal flexion angle of the human finger. The Arduino sends the byte across the serial port, where an X3D program deciphers it and moves the hand to the correct position. The wiring schematic for the flex sensors is shown in Fig. 2. Using Autodesk Inventor, several rings were designed with a slit to hold the flex sensors in place along the hand. A rapid prototyping system (uPrint) was used to produce plastic models to hold the five flex sensors along each finger of the hand.

Finger flexion and extension refer to bending towards the palm, while finger abduction and adduction refer to the fingers bending towards each other. One disadvantage of the flex sensors is that they can only measure flexion and extension angles; abduction cannot be measured because the sensors bend along only one axis. The flex sensor has a life cycle of over 1 million cycles, can operate in the temperature range of -35 degrees C to 80 degrees C, and exhibits resistances from 60,000 to 110,000 Ohms. The operating range typically occurs within a 90 degree bend, so the Arduino readings are scaled to fit the optimal bending of the hand. Another disadvantage of flex sensors is that they measure only the bending of a finger as a whole, across all of its joints; they cannot portray the independent bending of a single joint. Regardless, typical grasping movements involve a natural bending arc, and the motion capture capabilities of the Flex Sensors are sufficient to display this.

Fig. 2. Electrical diagram of the Arduino circuit board (left) and 5 flex sensors (right) connected to analog input lines.
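As a rough illustration of how a 0-255 serial byte might be scaled to a proximal flexion angle on the receiving side, the sketch below assumes per-finger calibration values captured with the finger fully extended and fully flexed, together with the roughly 90 degree operating range mentioned above. The calibration constants and the function name are placeholders, not values from the authors' implementation.

```python
import math

# Hypothetical calibration bytes for one finger (extended vs. fully flexed);
# real values would be measured per sensor and per wearer.
RAW_EXTENDED = 40
RAW_FLEXED = 210
MAX_FLEXION_DEG = 90.0  # flex sensors operate within roughly a 90 degree bend

def byte_to_flexion(raw_byte):
    """Map a 0-255 byte from the Arduino to a proximal flexion angle (radians)."""
    t = (raw_byte - RAW_EXTENDED) / float(RAW_FLEXED - RAW_EXTENDED)
    t = min(max(t, 0.0), 1.0)                  # clamp to the calibrated range
    return math.radians(t * MAX_FLEXION_DEG)   # radians suit X3D rotation fields
```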
C. Virtual Reality Visualization

Haptic software development platforms allow users to create applications that display 3D figures in a virtual environment and interact with users through force feedback. Platforms available on the market include OpenHaptics, GHOST, CHAI 3D, and MHaptic. To develop our simulation, we used H3DAPI, an open-source haptics platform. This platform renders 3D figures using OpenGL and X3D, two widely used standards for producing computer graphics. Because it is device independent, H3DAPI supports many different haptic devices available on the market, such as the SensAble PHANTOM, Force Dimension Omega, and Novint Falcon [13]. It also works well with custom-made devices. Its device independence is one of the main reasons we use it instead of other haptics software platforms.

Fig. 3. A model of the hand in the virtual environment.

Using H3DAPI, we created a model of the hand by grouping a box with fourteen cylinders (see Fig. 3). Each individual cylinder represents a segment of a finger, and a group of two to three cylinders represents a single finger. The palm is represented by a box with a height of 6.35 cm, a width of 6.35 cm, and a thickness of 1.59 cm. All cylinders have a radius of 0.73 cm; their heights vary from 2.06 cm to 3.18 cm. Several factors are necessary to simulate a moving hand. First, the palm and the fingers need to move at exactly the same time, and they should never become disjointed. The bending of each finger is simulated as rotations of each cylinder and group of cylinders, with the point of rotation at the bottom of each cylinder.
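To make the "rotate each cylinder about its bottom end" idea concrete, here is a small sketch of the underlying geometry for one finger treated as a planar chain of segments. It is not H3DAPI code; the joint axis (x), the segment "up" axis (y), the function names, and the example values are assumptions chosen for illustration (the segment heights fall within the 2.06-3.18 cm range quoted above).

```python
import numpy as np

def rot_x(theta):
    """Rotation about the x-axis, taken here as the finger's bending axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[1.0, 0.0, 0.0],
                     [0.0,   c,  -s],
                     [0.0,   s,   c]])

def finger_segment_poses(base, heights, angles):
    """Return (centre, orientation) for each cylinder of one finger.
    Each segment rotates about its bottom end (the joint), and the next joint
    sits at the top of the rotated segment, so the chain never becomes
    disjointed."""
    poses = []
    joint = np.asarray(base, dtype=float)
    R = np.eye(3)
    for h, a in zip(heights, angles):
        R = R @ rot_x(a)                    # bend at this joint
        up = R @ np.array([0.0, 1.0, 0.0])  # segment's long axis after bending
        centre = joint + 0.5 * h * up       # cylinder is positioned by its centre
        poses.append((centre, R.copy()))
        joint = joint + h * up              # top of this segment = next joint
    return poses

# Example: a three-segment finger (heights in metres) bending 30/40/20 degrees.
poses = finger_segment_poses([0.0, 0.0318, 0.0], [0.0318, 0.0254, 0.0206],
                             np.radians([30, 40, 20]))
```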

H3DAPI is written in the C++ programming language. In addition to C++, programmers can use the X3D modeling language and the Python scripting language to develop new software applications on this platform. We use X3D and Python to program the graphics component. We use X3D to create basic 3D shapes such as the box and the cylinders; H3DAPI represents each of these shapes as a node. We also use X3D to define their colors, sizes, positions, and rotations. We use Python to process the various data in the VE. The Python script communicates with the devices through User Datagram Protocol (UDP) sockets and performs all of the calculations needed to map the values received from these devices to the values that display the 3D figures in the VE. H3DAPI also represents this Python script as a node in the X3D file. This node is linked with the figures in the VE to affect attributes such as position and rotation and thereby animate them.

D. Data Communication

The data transfer system used for tracking hand movements to generate an accurate visualization utilizes a set of two datagram sockets with associated input control programs, as shown in Fig. 4. Sockets present a means of data transfer between multiple languages and systems. A UDP datagram socket can be used to stream data between programs not written in the same language. This works by designating a port number and passing data to an abstract address in the computer's memory. A listener in another program can execute an action whenever a new value is pushed to that address, effectively allowing programs written in different languages to communicate. Tracking data is received from two sources on the glove and turned into movement on the screen. The system must also recognize when a collision has occurred in the virtual environment and send the appropriate feedback to the glove.

For the Flock of Birds, a C++ control program uses custom input methods to read from the serial port and convert the input into six values. The program interprets these values as three Cartesian coordinates and three Euler angles and concatenates them into a single string of values. This string is delivered to a datagram socket every time the values register on the serial port. For the Flex Sensors, an Arduino microprocessor has been programmed to map the changes in resistance over the sensors into a set of integers valued from 0 to 255. The Arduino concatenates these integers into a string and sends it to the serial port. A C# program delivers the string to a datagram socket every time the Arduino sends a set of values. The simulation then uses the values from both ports to set the position, orientation, and finger flexion of the virtual hand.
Fig. 4. Data transfer system.
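A minimal sketch of the listening side of this data transfer system, in the spirit of the Python script described above: two UDP datagram sockets are bound to local ports, and each incoming datagram is parsed as a whitespace-separated string. The port numbers, the delimiter, and the field order are assumptions; the paper states only that the senders concatenate their values into a single string.

```python
import socket

FOB_PORT = 5005    # placeholder port for the C++ Flock of Birds sender
FLEX_PORT = 5006   # placeholder port for the C# flex-sensor sender

def open_listener(port):
    """Bind a non-blocking UDP datagram socket on the local machine."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("127.0.0.1", port))
    sock.setblocking(False)
    return sock

def parse_fob(datagram):
    """Three Cartesian coordinates followed by three Euler angles."""
    x, y, z, az, el, roll = (float(v) for v in datagram.decode().split())
    return (x, y, z), (az, el, roll)

def parse_flex(datagram):
    """Five integers in the range 0-255, one per finger."""
    return [int(v) for v in datagram.decode().split()]

def poll(fob_sock, flex_sock):
    """Return the newest FOB and flex readings, or None where nothing arrived."""
    fob, flex = None, None
    try:
        fob = parse_fob(fob_sock.recv(1024))
    except BlockingIOError:
        pass
    try:
        flex = parse_flex(flex_sock.recv(1024))
    except BlockingIOError:
        pass
    return fob, flex
```

In the running system, polling of this kind would occur on every update of the scene so that new position, orientation, and finger values move the virtual hand as soon as they arrive.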

E. Collision Detection

Applying feedback at the appropriate time is essential to the user's sense of immersion in the virtual environment. To give this feedback, the system must detect when the user contacts an object in the virtual environment. Important factors for the implementation of collision detection are the spatial accuracy of the virtual objects in the environment, the temporal accuracy of the interaction between objects (the speed at which a collision can be detected), and the speed at which feedback can be delivered to the glove.

Two alternative collision detection methods are the creation of a custom algorithm and the importation of an existing library. The trade-off between the two is versatility versus convenience. A custom algorithm allows full access to the algorithm's implementation and rapid customization to specific tasks, but it adds programming complexity and uncertainty unless the algorithm is appropriately validated. Using an existing library, on the other hand, is convenient but restricts the functionality of collision detection to that library. Because of the restrictions of the existing libraries, the tracking system that was developed necessitated the use of a custom algorithm.

The algorithm developed for collision detection only has to be functional for simple geometric shapes. It uses the absolute positions of specific points on an object (e.g., the center) and its dimensions to calculate the distance between two objects. In the case of collision detection between two spheres, the condition for collision is

||S1 - S2|| < r1 + r2,   (1)

where S1 and S2 are the position vectors of the two spheres and r1 and r2 are their radii. If the distance between the objects is less than the objects' boundaries (e.g., if the distance between two spheres is less than the sum of their radii), then the two objects have collided.
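A direct transcription of Equation (1) into code; the function name and the example values are illustrative only (0.0222 m corresponds to the 0.875-inch target-sphere radius used in Experiment 1).

```python
import numpy as np

def spheres_collide(s1, s2, r1, r2):
    """Equation (1): two spheres collide when the distance between their
    centres is less than the sum of their radii."""
    d = np.linalg.norm(np.asarray(s1, dtype=float) - np.asarray(s2, dtype=float))
    return float(d) < (r1 + r2)

# Example: a fingertip sphere 2 cm away from a target sphere of radius 0.0222 m.
print(spheres_collide((0.0, 0.0, 0.0), (0.02, 0.0, 0.0), 0.0073, 0.0222))  # -> True
```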
VI. EXPERIMENTS

The degree to which a user feels immersed in a virtual reality simulation depends on the accuracy and timing of the tracking, rendering, and representation of the user's hand and fingers, and on the detection of collisions in the virtual environment. The first experiment tested the FOB's ability to track the absolute position of the subject's palm. The second experiment tested the bend sensors' ability to track the relative position of the subject's fingers. These experiments also tested the subjects' interaction accuracy, ease of manipulation, and agility in the virtual environment. The primary goal of these experiments was to test the tracking, visualization, and collision detection capabilities of the virtual reality system.

A. Setup

1) Experiment #1: Absolute Tracking Test

Fig. 5. Screen shot of experimental setup #1.

As shown in Fig. 5, the setup consists of twelve identically sized spheres (radius = 0.875 inches) labeled A-L. Each subject was asked to complete a pattern by virtually touching the labeled spheres in a specific order. The order used for the experiment was A, L, K, G, E, F, C, D, G, F, B, K, J, I, C, D, E, H, I, B, K, F, L, K, H, B, A, C.

2) Experiment #2: Relative Tracking Test

Fig. 6. Screen shot of experimental setup #2.

As shown in Fig. 6, the hand positioned against the black background is the virtual representation of the subject's hand. Nine finger positions were randomly selected one at a time, and the subject was required to recreate them.

B. Procedures

For the first experiment, the subjects followed a designated pattern by moving their virtual hand to different lettered spheres until they completed the pattern. The time it took for the subjects to complete this task was recorded. For the second experiment, the subjects were told to mimic particular finger positions. The time it took for the subject to complete each finger position and for it to be visualized in the virtual environment was recorded.

C. Participants

Five male subjects between 20 and 22 years of age attending UVa participated in the experiments. All of them were right-hand dominant, and none were familiar with the tracking, visualization, and collision detection capabilities used in the experiments.

D. Results

For the first experiment, the average task completion time was 45.89 ± 6.35 seconds. For the second experiment, the average task completion time was 40.46 ± 7.89 seconds. Based on the experimental results, four conclusions can be made:

1) The visual representation of the subject's hand and fingers in the virtual environment is realistically depicted.
2) FOB is able to accurately detect the absolute position of the subject's hand.
3) The Flex Sensors are able to precisely track the relative position of the subject's fingers.
4) Collision detection is functional in both experiments, contributing to the subject's perceived sense of reality in the simulations.

VII. DISCUSSION

While the system is currently operational, albeit in a rudimentary form, the task of accurately tracking the hand has been accomplished through the use of various devices. In the future, this tracking and displaying capability will become a key feature in the development of simulations. This technology can be applied in a variety of fields, including medical simulation. In the case of medicine, accurate tracking of each of the fingers and of the hand is critical to the user's sense of immersion, and therefore to the success of the simulation as a whole and its use as a learning tool.

REFERENCES

[1] R. Pausch, D. Proffitt, and G. Williams, "Quantifying immersion in virtual reality," Proceedings of the 24th Annual Conference on Computer Graphics and Interactive Techniques, 1997, pp. 13-18.
[2] F. P. Brooks, "What's real about virtual reality?," IEEE Computer Graphics and Applications, 1999, pp. 16-27.
[3] K. Mania and A. Chalmers, "The effects of levels of immersion on memory and presence in virtual environments: A reality centered approach," CyberPsychology & Behavior, vol. 4, 2001, pp. 247-264.
[4] 5DT Data Glove 5 Ultra, Mar. 2011. Available: http://www.5dt.com/products/pdataglove5u.html
[5] ShapeHand, 2009. Available: http://www.fingermotion-capture.com
[6] AcceleGlove FAQs, 2010. Available: http://www.acceleglove.com/faq.asp
[7] CyberGlove II Specifications, 2010. Available: http://www.cyberglovesystems.com/products/cyberglove-ii/specifications
[8] Flock of Birds. Available: http://www.ascensiontech.com/realtime/rtflockofbirds.php
[9] Nintendo - Wii Sensor Bar Settings, 2011. Available: http://www.nintendo.com/consumer/systems/wii/en_na/settingssensorbar.jsp
[10] PrimeSense Supplies 3-D-Sensing Technology to Project Natal for Xbox 360: Groundbreaking optical-sensing and recognition technologies to aid gesture control platform, Mar. 2010.
[11] J. Seo and G. J. Kim, "Design for presence: a structured approach to virtual reality system design," Presence: Teleoperators & Virtual Environments, vol. 11, 2002, pp. 378-403.
[12] Ascension Technology Corporation, Flock of Birds: Installation and Operation Guide, 2002.
[13] SenseGraphics, H3DAPI: The Open Source Platform for Multimodal Development.