Haptic presentation of 3D objects in virtual reality for the visually disabled


M Moranski, A Materka
Institute of Electronics, Technical University of Lodz, Wolczanska 211/215, Lodz, POLAND
marcin.moranski@p.lodz.pl, andrzej.materka@p.lodz.pl
www.eletel.eu

ABSTRACT

The paper presents initial research on haptic perception of 3D objects in a virtual reality environment, aimed at helping visually disabled persons learn new routes and identify obstacles. The study spans a number of fields, from the very technical, such as scene segmentation and obstacle detection algorithms, to psychological aspects such as the effectiveness of utilizing haptic information. The authors constructed a prototype system for the tactile presentation of real objects in virtual reality.

1. INTRODUCTION

Research concerning the application of haptic force feedback devices for the blind is worth further development, as sight cannot be substituted by the auditory information channel alone. Information about the living environment can in this case be complemented by the sense of touch. Furthermore, a tactile system designed for people with visual disabilities would allow them to access information from a virtual 3D world simply by touching it. The system should consist of three major elements: a camera (which provides information about the distance to obstacles in a scene, i.e. a depth map), a computer (on which the depth map is segmented and the virtual scene is created) and a haptic device (the interface for tactile presentation of the acquired scene).

A haptic device is the interface used for communication between a human and a virtual reality. Thanks to the force feedback it produces, a user can feel the shape, density and texture of 3D objects created in a virtual world. The touching experience when using this interface is quite close to reality. Haptic perception incorporates both kinaesthetic sensing (i.e. of the position and movement of joints and limbs) and tactile sensing (i.e. through the skin) (Loomis and Lederman, 1986).

The most popular haptic devices available are the Sensable Phantom (Sensable Corp.) and a range of touch manipulators from Force Dimension (Force Dimension Corp.). These systems define one contact point at a time between the observer and the virtual object. They do not stimulate cutaneous receptors responding to temperature, pressure and pain. The mentioned devices have great potential, and they were considered for use by the blind to familiarize themselves with obstacles inside buildings and to learn new routes and shapes. However, their high cost limits their availability to the average user. Since haptic force feedback technology entered the world of computer games, a new low-cost device, the Novint Falcon (Novint Corp.), has appeared on the market. Although the device has only 3 degrees of freedom, compared with the Phantom's 6, this is enough for 3D object presentation. The aim of this experimental study is to present a prototype system which allows real scenes to appear automatically in virtual reality (by means of a time-of-flight 3D camera) and to be accessed in haptic form using the Novint Falcon.

2. RELATED WORK

Research concerning the application of stationary haptic force feedback devices for navigation by the blind can be divided into two categories: building virtual maps, and creating simulators where real obstacles and objects are presented virtually. Both are made for learning new routes, and they appear to have potential as tools that blind users can employ to acquire knowledge about a place before an intended first visit. The majority of projects focus on checking whether advanced, expensive devices with a proven quality of performance can be used for such purposes.
In Jansson et al. (1999), two independent studies investigating problems concerning the use of haptic virtual environments by blind people are described. Two devices, a Phantom 1.5 and an Impulse Engine 3000, were used to render virtual textures and 3D objects. The experiments showed that objects rendered by these devices can be effectively perceived by both blind and blindfolded sighted observers; however, the investigated scenarios were very simple. In another publication (Jansson, 1998), the usefulness of a haptic force feedback device (the PHANToM) for presenting information without visual guidance was also confirmed. The author tried to answer the following questions: how well blindfolded observers' perception of the roughness of real and virtual sandpapers agrees, and whether the 3D forms of virtual objects can be judged accurately and with short exploration times, down to a size of 5 mm. Blindfolded sighted observers judged the roughness of real and virtual sandpapers to be nearly the same. The presented experiments concluded that a haptic device can present useful information without vision.

Work on tactile maps for the blind has also been published. Kostopoulos et al. (2007) describe a framework for map image analysis and presentation of the semantic information to blind users through alternative modalities (i.e. haptics and audio). The resulting haptic-audio representation of the map is used by the blind for navigation and path planning. However, the available literature lacks concrete findings concerning the use of the Novint Falcon game controller.

3. THE PROTOTYPE SYSTEM FOR HAPTIC PRESENTATION OF 3D SCENES

The haptic presentation system was built to enable blind people to interact by touch with real 3D objects recreated in virtual reality. The prototype consists of an SR3000 camera (Mesa Imaging AG), a laptop and a Novint Falcon haptic interface (see Fig. 1).

Figure 1. Diagram of the designed system.
The camera provides information about the distance to obstacles in a scene by measuring the time of flight of the emitted and reflected light; a 2.5D depth map is produced at its output. The camera is connected to the laptop, where data processing is divided into two stages: scene segmentation and virtual scene modeling (see Fig. 2).

3.1 Segmentation

On the laptop, the depth map is segmented in order to extract all obstacles from the acquired scene. This process gathers the information (i.e. the location and size of objects) that is used to create the virtual scene. First, the point cloud representing the scene is processed to find points corresponding to planes. The plane-finding procedure is based on estimating a normal vector at each point. For a given point p_0 = (x_0, y_0, z_0) with normal vector

n = {a, b, c} (1)

the equation of the corresponding plane is given as

a(x - x_0) + b(y - y_0) + c(z - z_0) = 0 (2)

The input point cloud is processed as follows. The cloud is first ordered in a k-d tree data structure (K-D Tree). The normal vector at each point is then estimated from the surrounding neighborhood: the k nearest neighbours are found and used to calculate the normal vector (Rusu, 2009). Finally, points which have the same normal vectors are grouped together (a certain deviation angle between normal vectors is allowed).
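The normal estimation and grouping steps above can be sketched in Python. This is an illustrative sketch, not the authors' implementation: the use of SciPy's k-d tree, the PCA-based normal estimate, and the parameter values (k = 8 neighbours, 10 degree deviation angle) are assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree

def estimate_normals(points, k=8):
    """Estimate a unit normal at each point as the eigenvector of the
    smallest eigenvalue of the covariance of its k nearest neighbours
    (the PCA-based method described by Rusu, 2009)."""
    tree = cKDTree(points)                      # k-d tree ordering of the cloud
    normals = np.empty_like(points)
    for i, p in enumerate(points):
        _, idx = tree.query(p, k=k)             # k-neighbour search
        cov = np.cov(points[idx].T)             # 3x3 local covariance
        eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
        normals[i] = eigvecs[:, 0]              # smallest-eigenvalue direction
    return normals

def group_by_normal(normals, max_angle_deg=10.0):
    """Greedily group point indices whose (unoriented) normals deviate
    by less than the assumed angle threshold."""
    cos_th = np.cos(np.radians(max_angle_deg))
    groups = []                                 # list of (reference normal, indices)
    for i, n in enumerate(normals):
        for ref, members in groups:
            if abs(np.dot(n, ref)) >= cos_th:   # |cos| ignores normal sign flips
                members.append(i)
                break
        else:
            groups.append((n, [i]))
    return groups
```

For a cloud sampled from a single flat surface, every estimated normal is (anti)parallel to the surface normal and the grouping step yields one group, which the next stage then examines for planes.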

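The remaining pipeline steps, described next (RANSAC plane extraction, Euclidean clustering, and substitution of obstacles by boxes, Sections 3.1-3.2), can be sketched as follows. This is a minimal illustration under assumed thresholds and function names, not the authors' code, which uses the full machinery of Rusu (2009).

```python
import numpy as np
from scipy.spatial import cKDTree

def ransac_plane(points, dist_th=0.02, iters=200, seed=0):
    """RANSAC: repeatedly fit a plane a(x-x0)+b(y-y0)+c(z-z0)=0 through
    3 random points; keep the candidate with the most inliers, i.e.
    points closer to the plane than the assumed distance threshold."""
    rng = np.random.default_rng(seed)
    best = np.array([], dtype=int)
    for _ in range(iters):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)          # plane normal {a, b, c}
        if np.linalg.norm(n) < 1e-12:           # degenerate (collinear) sample
            continue
        n /= np.linalg.norm(n)
        inliers = np.flatnonzero(np.abs((points - p0) @ n) < dist_th)
        if len(inliers) > len(best):
            best = inliers
    return best                                 # indices of the plane's points

def euclidean_clusters(points, d_th):
    """Flood-fill clustering: two points belong to the same object when a
    chain of points connects them with steps no longer than d_th."""
    tree = cKDTree(points)
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        frontier = [unvisited.pop()]
        cluster = list(frontier)
        while frontier:
            i = frontier.pop()
            for j in tree.query_ball_point(points[i], d_th):
                if j in unvisited:
                    unvisited.remove(j)
                    frontier.append(j)
                    cluster.append(j)
        clusters.append(np.array(cluster))
    return clusters

def box_for(obj_points):
    """Substitute an object by a box: location = centroid of its points,
    size = extent of the points along each axis."""
    return obj_points.mean(axis=0), obj_points.max(axis=0) - obj_points.min(axis=0)
```

Running the three functions in sequence (extract the dominant plane, remove its points, cluster the remainder, box each cluster) mirrors the segmentation-then-modeling flow of Fig. 2.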
The RANSAC (Random Sample Consensus) algorithm is applied to each group of points in order to find planes (a given group may include a few parallel planes, so a distance threshold between points and a plane, and a minimal number of points forming a plane, are assumed). The points of the found planes are then filtered out of the input cloud.

Figure 2. Diagram of the designed system's working principle.

Next, a clustering algorithm (Rusu, 2009) is used to find the points representing objects: two points from the point cloud belong to the same object when the distance d between them is shorter than or equal to an assumed distance threshold d_th.

3.2 Scene modeling for the tactile presentation

The Novint Falcon haptic game controller is used for presentation of the virtual scenario. Using his or her sense of touch, the blind user accesses information about the content of the observed scene. The virtual scene is modeled as follows. The found planes are created in the virtual reality (the background and ground planes). The real obstacles are substituted by 3D boxes whose sizes and locations correspond to the real sizes and locations of the objects: the location of each box is given by the centroid calculated for the point cloud representing the obstacle, and its size corresponds to the maximum distance between the obstacle's points along the X and Y axes. Finally, the haptic and graphic rendering algorithms are applied to the created scene and the Novint Falcon device is activated. For the tactile presentation, the open-source haptics software development platform H3D is used (H3D API). The segmentation and scene modeling procedures are illustrated in Fig. 3.

4. EXPERIMENTS WITH BLIND USERS

Experiments were designed and performed to examine the usability of the tactile presentation of the real environment in virtual reality, using the designed prototype system with blind users (see Fig. 4).
4.1 Aim of research

The experiments had the following goals: to check the applicability of the tactile presentation system for the blind and visually impaired; to examine the usability and potential of the force feedback device, the Novint Falcon, for presenting 3D virtual objects without the use of vision; and to collect the blind participants' opinions about their requirements and preferences concerning the design of such a system, and to learn its potential application areas.

4.2 Participants

The group of participants consisted of eight blind people: two women and six men. Six of them were born blind; the others lost their sight at different times during their lives. They were chosen as representatives of different educational and occupational backgrounds. They also represent different abilities of tactile perception of the surrounding environment.

Figure 3. The virtual scene modeling process: a) the 2.5D depth map of the scene; b) the segmented scene (grey: found planes, black: found obstacles); c) the reconstructed scene for the tactile presentation (the obstacles are replaced by cubes, see the text).

4.3 Procedure and evaluation

The experiments were divided into 4 stages.

Figure 4. Prototype system tested by a blind participant.

Stage 1: Training phase. Participants were informed about the prototype system, and its functionality was then explored. The practice period with the device was adjusted to the ability of each participant (15-30 minutes).

Stage 2: Scene content recognition. Experiments were performed for scenes with different numbers of obstacles (between 2 and 5). Participants were asked to say how many obstacles were presented in each scene.

Stage 3: Estimation of the distance to each obstacle from a chosen point of observation. This stage was divided into two scenarios. In the first, three scenes with different locations of a single object were presented. In the second, one scene consisting of 3 objects was presented. In both cases, participants were asked to estimate the objects' distances from the chosen point of observation (e.g. the background wall).

Stage 4: Estimation of obstacle height. Three scenes with different numbers of objects were presented (between 2 and 5). Users were asked to estimate the heights of the objects in each scene (the height of an object was estimated relative to the other objects).

The objects in each scene were located on the ground, against a background wall up to 7.5 m away (the SR3000 camera's measurement range). The exploration time for every scene was measured. Each participant decided when to finish exploring a given scene; after the exploration was finished, he or she was asked to describe the scene. A task was carried out successfully when the blind person correctly identified all obstacles in the presented scene.

4.4 Results

The results of the second stage are presented in Fig. 5 and Fig. 6. The outcomes of the third stage are shown in Fig. 7. The outputs of the last stage are presented in Fig. 8 and Fig. 9.

Figure 5. Results of the second stage of experiments.

Figure 6. Exploration time of a scene for the second stage (all users).

Figure 7. The outcomes of the third stage of experiments.

Figure 8. The results of estimation of object heights.

Figure 9. Exploration time of a scene for the fourth stage.

4.5 Discussion

The measured exploration times varied widely, depending on the tactile perception skills of the participants and on the chosen way of exploring a scene. Exploration time did not lengthen in proportion to the complexity of the scenes, because the participants learnt how to use the haptic interface efficiently to identify the scene's content. In the second stage, the worst result was obtained for the scene with 5 objects (two objects were identified as one, because they were located close to each other). In the third stage, two ways of estimating an obstacle's location were compared. In both cases participants were able to locate objects in relation to a chosen point of observation, but when all obstacles were located in one scene the process was faster (the distances could be compared directly, without switching between scenes). In the last stage, for scenes containing two or three objects, nobody had problems estimating the heights of the objects correctly. For the scene with five objects, three blind persons failed to recognize the objects' heights correctly.

When all the experiments were completed, the blind participants expressed their opinions about the system and its usability in real-life scenarios. They were impressed by the system's performance, and in their opinion there are several potential applications where such a system could help the blind in everyday activities. They also gave many hints for improving the system. The first suggestion was to add vocal information about the 3D position of the probe (the virtual finger in the system); this would be very helpful in preventing users from losing themselves in the virtual environment. Furthermore, the volunteers suggested that sonification of some scene points or objects could also be very useful (the presented research concerns the perception of 3D objects by touch alone). Special care is also required when creating virtual objects: they should be as similar to the real ones as possible (in size, stiffness, texture and density). When a scene consists of many objects that differ in size, the smaller ones should be created so as to remain noticeable.

5. CONCLUSIONS

This article described a prototype system for the tactile presentation of real objects in virtual reality. The usability of the system was examined with blind participants. The experiments showed that the system can be applied for the blind, and they highlighted the challenges that have yet to be overcome. An application of this kind has to meet special requirements in order to be safe and reliable. The challenging issue is to present a real-world scenario in a virtual reality.
Many requirements need to be met, mainly: choosing the scale of virtual objects relative to the real ones, solving the problem of losing oneself in VR, and solving the problem of presenting scenes consisting of many objects or details of different sizes, since the system's resolution is finite. All of the above are subjects of scientific research in terms of both technical and psychological aspects.

Acknowledgements: This work has been supported by the Ministry of Science and Higher Education of Poland, research grant no. N R02 008310, in the years 2010-2013. The described study is carried out in cooperation with the Lodz Chapter of the Polish Society for the Blind.

6. REFERENCES

J Loomis and S Lederman (1986), Tactual perception, in Handbook of Perception and Human Performance (K Boff & J Thomas, Eds), Wiley/Interscience, New York, pp. 31.31-31.41.
Sensable Corp.: www.sensable.com, last accessed: 30 June 2012.
Force Dimension Corp.: www.forcedimension.com, last accessed: 30 June 2012.
Novint Corp.: home.novint.com, last accessed: 30 June 2012.
G Jansson et al. (1999), Haptic virtual environments for blind people: exploratory experiments with two devices, International Journal of Virtual Reality, 4, 3, pp. 10-20.
G Jansson (1998), Can a haptic force feedback display provide visually impaired people with useful information about texture roughness and 3D form of virtual objects?, The International Journal of Virtual Reality, 4, pp. 105-110.
K Kostopoulos et al. (2007), Haptic access to conventional 2D maps for the visually impaired, Journal on Multimodal User Interfaces, 2, 1.
Mesa Imaging AG: www.mesa-imaging.ch, last accessed: 30 June 2012.
K-D Tree: www.en.wikipedia.org/wiki/k-d_tree, last accessed: 30 June 2012.
R Rusu (2009), Semantic 3D Object Maps for Everyday Manipulation in Human Living Environments, PhD dissertation, Institut für Informatik, Technische Universität München.
H3D API: www.h3dapi.org, last accessed: 30 June 2012.