A Movement Based Method for Haptic Interaction


Spring 2014 Haptics Class Project Paper, presented at the University of South Florida, April 30, 2014

A Movement Based Method for Haptic Interaction

Matthew Clevenger

Abstract

An abundance of haptic rendering schemes have been developed to further presence and immersion in virtual environments, with increasing realism as the main driving force. Schemes have progressed from simple one-dimensional springs to six degree of freedom particle models with a wealth of surface and material properties. Haptic rendering schemes for video games, however, have received comparatively little attention. Video games generally do not need ultra-realistic haptic feedback; less realistic, more entertaining feedback is often beneficial. In contrast to most haptic rendering schemes, which use position control, movement based haptic feedback functions under acceleration control, a control scheme common in video games. The presented method displays normal force, friction, virtual object weight, water drag, water inertia, and impact forces. Force direction is calculated using the projection and rejection of the user's input on the collision normal, allowing virtual objects to be oriented in any direction. A user study was conducted comparing movement based feedback to constraint-based feedback and no feedback. Movement based feedback was favored by a third of the subjects and given the highest rating for enthusiasm by half of the subjects. This work presents a novel haptic rendering method that can easily be applied to existing video games.

I. INTRODUCTION

Most haptic rendering algorithms follow a position correspondence model [1], [2], [3]. The virtual position is determined by the physical position of the device, possibly with scaling. This has the benefit of following a predictable pattern: moving the device to a certain physical position always corresponds to the same virtual position.
In most cases, the reachable virtual workspace is predefined at design time. This is desirable in most applications as it provides a natural, intuitive means of interaction. There are, however, situations where a predefined virtual workspace is cumbersome or undesirable. In large environments a means of navigation is necessary. The proposed movement based method allows the user to navigate through the environment while also providing haptic feedback; the workspace is infinite along any controllable direction.

Video games make extensive use of the visual and auditory modalities but, with the exception of limited vibratory feedback, largely exclude haptic interaction. A richer sense of presence and immersion could be achieved by adding haptic feedback. In this paper the proposed movement based haptic rendering method is applied to a two-dimensional side-scrolling video game.

Section II provides a summary of related work. Section III details the movement based haptic rendering technique. The haptic interface and implementation details are described in Section IV. A user study, conducted to compare the presented method with existing methods, is detailed in Section V. The results of the experiment are presented and discussed in Sections VI and VII, followed by concluding remarks in Section VIII.

II. BACKGROUND

A. Haptic Rendering

The penalty method applies a force proportional and opposite to the amount of penetration into a virtual volume. The simplicity of this approach has led to extensive study and expansion. There are, however, a number of drawbacks to this method [4]. It is often difficult to determine which exterior surface to associate with a given volume when multiple primitives touch or intersect. Force discontinuities appear when approaching other surfaces of the same object. Lastly, thin objects are unable to generate sufficient force to prevent the device from passing through [2].
To overcome these limitations, Zilles and Salisbury [1] proposed a constraint-based method. This method employs a god-object which is constrained by the virtual environment and controlled by physics. Vector field force shading [5], analogous to Phong shading for graphic display, was incorporated into haptic rendering by Morgenbesser and Srinivasan. Ruspini et al. [2] extended these ideas with a finite virtual proxy. The virtual proxy is able to model force shading, friction, surface stiffness, and texture. The finite size of the virtual proxy also prevents it from slipping through any tiny numerical gaps present in most polygonal meshes. In addition to rigid objects, fluids [6] have also been simulated using unified particle models.

III. MOVEMENT BASED HAPTIC RENDERING

Unlike the position based control of the constraint-based method, movement based rendering functions under acceleration control. Acceleration control is commonly used for navigation in physics driven video games. Acceleration is applied based on deviation from center, similar to a joystick: more acceleration is applied as the device moves further from center. As the acceleration is high and the maximum velocity low, the scheme approximates velocity control. This method of input is analogous to arrow key input. Joysticks and arrow keys are very common input methods for games, so the control scheme can easily be retrofitted to apply movement based haptic rendering.

Rendered forces are based on the projection and the rejection of the input vector on the collision normal. The input vector is a normalized Euclidean representation of the position of the haptic device centered at the device origin. Moving the haptic device to the furthest right would result in an input vector of [1, 0, 0]; the furthest downward would

result in [0, -1, 0]. The collision normal is the normalized sum of the normals of the impacting colliders. The projection is the orthogonal projection of the input vector onto a line parallel to the collision normal; it is a vector parallel to the collision normal:

p = ((i . n) / |n|^2) n    (1)

The rejection is the orthogonal projection of the input vector onto the plane orthogonal to the collision normal:

r = i - p    (2)

Fig. 1. Projection and rejection of the input vector on the collision normal.

The rendered normal force is simply the negative of the virtual object stiffness multiplied by the projection. The friction force is the negative of the virtual damping multiplied by the rejection. Virtual object weight is proportional to the projection of the collision normal on the downward direction if the projection is not antiparallel. Water drag and inertia are applied when in water: drag is the negative of the virtual drag coefficient multiplied by the player's velocity, and water inertia is a constant force applied along the flow direction. Impact forces are generated based on decaying sinusoids [7].

Fn = -k p          (3)
Ff = -b r          (4)
W = m g (n . d) d  (5)
Fwd = -d v         (6)
Fwi = w f          (7)

Fig. 2. Normal force, friction, and weight applied to the user.

Fig. 3. Water drag and inertia applied when the user enters water.

IV. IMPLEMENTATION

The movement based haptic rendering method was implemented on a Sensable Phantom Omni. A two-dimensional side-scrolling video game was developed to assess the interaction method, using Unity as the game engine. Secondary game assets were included to provide a more holistic, entertaining gaming experience. The game used Unity's existing physics engine for collision detection, which updates at 50 frames per second. Humans are able to perceive vibrations in the finger up to 5-10 kHz, with a maximum sensitivity around 200-300 Hz [8]. As such, haptic rendering should update at 1000 Hz [9]. Because of the discrepancy between the two update rates, the normal, friction, weight, drag, and inertia forces are summed and smoothed to reduce erratic motions. The impact force is overlaid raw in order to better capture the transient response.

A dynamic link library is used with interoperation to control the Omni from managed code. The C code used to control the Omni is wrapped in a dynamic link library and called in C# through Platform Invocation Services. Methods to initialize, deactivate, get position, and set force were developed. In order to avoid complicated marshaling, only blittable types are used. The position is returned using three separate functions, one for each axis, instead of an array. In addition to requiring complex marshaling, returning an array would require the user to release the array memory manually, an unnecessary complication as the size of the position is held constant at three.

Fig. 4. Implemented scene with secondary game assets.

The user is able to interact with static and dynamic obstacles in the virtual world. Collision with any object generates forces governed by the movement based haptic rendering algorithm. All objects generate normal force and friction. Dynamic objects impart their weight and impact forces. Water confers drag and inertial forces.

V. EVALUATING INTERACTION

A subjective human experiment was developed to test the partiality toward and aptness of the proposed movement based haptic interaction method. Subjects were presented with three different interaction methods and asked to rate them on a ten point scale based on six criteria. The three interaction methods presented were: no feedback, movement based feedback, and constraint-based feedback. The six criteria are listed in Table I. In the methods with haptic feedback, only normal force was rendered. Other than removing the extraneous forces, the movement based feedback method presented in the experiment was equivalent to what was described in Section III. The constraint-based feedback method followed that described in [2], with only normal force rendered at a frame rate of 50 Hz. The no feedback method was equivalent to the movement based feedback method with no feedback rendered. The movement based method and the no feedback method both used acceleration control; the constraint-based method used position control.

At the beginning or end of each experiment, data on the subject's age, gender, handedness, and gaming experience was collected. Before beginning the experiment, subjects were introduced to the system by allowing them to explore two demonstration levels. When they felt comfortable with the system, the experimentation level was loaded. The experimentation level consisted of a maze where subjects were asked to navigate from the center to the lower right, colliding with walls as they went.
The different interaction methods were each presented once in a random order. After completing and rating each interface, subjects were asked to explain any differences they felt between the three methods and whether they had a favorite interface. The entire experiment lasted less than five minutes.

VI. RESULTS

The user study included 10 subjects, aged in their 20s or 30s, with one female and nine males. One subject was left handed. The experiment was performed with the left hand due to space constraints. Five subjects had little to no gaming experience, the remaining five had some, and no one had extensive gaming experience. The average rating for each criterion for each interface, compiled across users, is shown in Figure 6. Higher values correspond to a better rating. The total mean rating incorporating all the criteria is shown by the dashed lines.

TABLE I
CRITERIA

Realism        How realistic is the control scheme?
Ease           How easy is it to control the character?
Aptness        How appropriate does the control feel?
Intuitiveness  How natural is the control?
Enthusiasm     How much did you enjoy this method?
Partiality     How much do you prefer this method?

Fig. 5. Experimentation level: a simple maze users were asked to traverse from the center to the lower right.

VII. DISCUSSION

Statistically significant differences were present in the total mean rating among all three interfaces, F(2, 163) = 34.73, p < 0.001, as shown in Figure 7. The constraint-based method had by far the highest total mean rating, 9.13, followed by the movement based method, 6.93. No feedback garnered a rating of 5.85. The constraint-based method had the highest average rating for each criterion, with all but partiality being above 9. Movement based rendering had the second highest average rating in all but ease and intuitiveness. No feedback was rated as easier than movement based feedback and of equal intuitiveness.

All the subjects were able to identify that no feedback was given in one of the trials, and most were able to correctly recognize the differences between the other two methods. No feedback was preferred by one subject, and three and four votes were given to the movement and constraint methods respectively. Some subjects did not vote and others voted for two methods; each method was given half a point in these cases. It was clear that some subjects did not understand the movement based rendering scheme and consequently conferred poor ratings. Other subjects opted for an all-or-nothing approach, either awarding a rating of 1 or 10 with little in-between. The constraint method was the biggest beneficiary of this approach and no feedback the most diminished by it.

Fig. 6. Average rating for each criterion for each interface.

Fig. 7. Total mean rating and standard error for each interface. Statistical significance is present for each interface.
Many subjects were enthusiastic about the movement based method, with half of the subjects giving the method a rating of ten for this category.

VIII. CONCLUSION

While not as realistic or intuitive as constraint-based methods, movement based haptic rendering provides a viable option for haptic interaction. Built around acceleration control, movement based rendering allows for haptic interaction in a control scheme other than position control. Incorporating normal force, friction, object weight, water drag, water inertia, and impact forces, movement based rendering can easily be extended to incorporate any number of surface or material properties. Human subject testing reveals that most people enjoy movement based feedback and find it entertaining.

A timer callback which updates at 1000 Hz could be added to the system to help improve force rendering and reduce the need for smoothing; the system currently operates at 50 Hz, which is well below the recommended haptic update rate. Movement based haptic feedback could also be retrofitted to existing games to see if any improvement is gained by adding haptic feedback.

REFERENCES

[1] C. B. Zilles and J. K. Salisbury, "A constraint-based god-object method for haptic display," in IEEE International Conf. on Intelligent Robots and Systems, Human Robot Interaction, and Cooperative Robots, pp. 146-151, 1995.
[2] D. C. Ruspini, K. Kolarov, and O. Khatib, "The haptic display of complex graphical environments," in Proc. 24th Annual Conf. on Computer Graphics and Interactive Techniques, pp. 345-352, Aug 1997.
[3] C. Basdogan and C. H. Ho, "Principles of haptic rendering for virtual environments." Online.

[4] K. Salisbury, F. Conti, and F. Barbagli, "Haptic rendering: Introductory concepts," IEEE Computer Graphics and Applications, vol. 24, pp. 24-32, Mar-Apr 2004.
[5] H. B. Morgenbesser and M. A. Srinivasan, "Force shading for haptic shape perception," Proc. ASME Dynamic Systems and Control Division, vol. 58, pp. 407-412, 1996.
[6] G. Cirio, M. Marchal, S. Hillaire, and A. Lécuyer, "Six degrees-of-freedom haptic interaction with fluids," IEEE Trans. Vis. Comput. Graphics, vol. 17, pp. 1714-1727, Nov 2011.
[7] K. J. Kuchenbecker, J. Fiene, and G. Niemeyer, "Improving contact realism through event-based haptic feedback," IEEE Trans. Vis. Comput. Graphics, vol. 12, pp. 219-230, Mar 2006.
[8] T. A. Kern, "Biological basics of haptic perception," in Engineering Haptic Devices, ch. 3, pp. 35-58, Berlin: Springer-Verlag, 2009.
[9] M. C. Lin and M. Otaduy, eds., Haptic Rendering: Foundations, Algorithms, and Applications. A K Peters/CRC Press, July 2008.