Multisensory Virtual Environment for Supporting Blind Persons' Acquisition of Spatial Cognitive Mapping: A Case Study


Orly Lahav & David Mioduser
Tel Aviv University, School of Education, Ramat-Aviv, Tel-Aviv, 69978, Israel
lahavo@post.tau.ac.il

Abstract: Mental mapping of spaces, and of the possible paths for navigating these spaces, is essential for the development of efficient orientation and mobility skills. Most of the information required for this mental mapping is visual (Lynch, 1960). Blind people lack this crucial information and thus face great difficulties (a) in generating efficient mental maps of spaces and, therefore, (b) in navigating efficiently within these spaces. The work reported here is based on the assumption that supplying appropriate spatial information through compensatory channels (conceptual and perceptual) may contribute to blind people's spatial performance. A multisensory (haptic, auditory) virtual environment simulating real-life spaces has been developed and tested. A description of the learning environment and results from a pilot study are presented.

Rationale

The ability to navigate space independently, safely and efficiently is a combined product of motor, sensory and cognitive skills, and has a direct influence on an individual's quality of life. Mental mapping of spaces, and of the possible paths for navigating through these spaces, is essential for the development of efficient orientation and mobility skills. Most of the information required for this mental mapping is visual (Lynch, 1960). Blind people lack this crucial information and thus face great difficulties (a) in generating efficient mental maps of spaces and, therefore, (b) in navigating efficiently within these spaces. One result of this deficit in navigational capability is that many blind people become passive, depending on others for continuous aid (Foulke, 1971).
More than 30% of blind people do not travel independently outdoors (Clark-Carter, Heyes & Howarth, 1986). The work reported here is based on the assumption that supplying appropriate spatial information through compensatory sensory channels, as an alternative to the (impaired) visual channel, may contribute to the mental mapping of spaces and, consequently, to blind people's spatial performance.

Research on blind people's mobility in known and unknown spaces (Golledge, Klatzky & Loomis, 1996; Ungar, Blades & Spencer, 1996) indicates that support for the acquisition of spatial mapping and orientation skills should be supplied at two main levels: perceptual and conceptual.

At the perceptual level, the deficiency in the visual channel should be compensated for with information perceived via other senses, e.g., touch and hearing. Haptic information appears to be essential for appropriate spatial performance. Haptics is defined in the Merriam-Webster dictionary as "of, or relating to, the sense of touch." Fritz, Way & Barner (1996) distinguish the terms: "tactile" refers to the sense of touch, while the broader "haptics" encompasses touch as well as kinesthetic information, i.e., a sense of position, motion and force. Haptic information is commonly supplied by the cane for low-resolution scanning of the immediate surroundings, by palms and fingers for fine recognition of objects' form, texture, and location, and by the legs for surface information. The auditory channel supplies complementary information about events, the presence of other people (or machines or animals) in the environment, the materials objects are made of, and estimates of distances within a space (Hill et al., 1993).

At the conceptual level, the focus is on appropriate strategies for efficient mapping of the space and the generation of navigation paths. Research indicates two main scanning strategies used by people: route strategies and map strategies.
Route strategies are based on linear (therefore sequential) recognition of spatial features. Map strategies, considered more efficient than the former, are holistic in nature, comprising multiple perspectives of the target space (Fletcher, 1980; Kitchin & Jacobson, 1997). Research shows that blind people rely mainly on route strategies when recognizing and navigating new spaces (Fletcher, 1980).

The Proposed Study

Advanced computer technology offers new possibilities for supporting blind people's acquisition of orientation and mobility skills by compensating for the deficiencies of the impaired channel. Research on the implementation of haptic technologies within virtual navigation environments reports on their potential for initial training as well as for support and rehabilitation training with sighted people (Giess, Evers & Meinzer, 1998; Gorman, Lieser, Murray, Haluck & Krummel, 1998) and with blind people (Jansson, Fanger, Konig & Billberger, 1998; Colwell, Petrie & Kornbrot, 1998). In light of these promising results, the main goals of this study are:

(a) The development of a multisensory virtual environment enabling blind people to learn about different (real-life) spaces that they are required to navigate (e.g., school, workplace, public buildings).
(b) A systematic study of blind people's construction of cognitive maps of real spaces by means of the virtual environment.
(c) A systematic study of the contribution of this mapping to blind people's spatial skills and performance in the real environment.

The Virtual Environment

Developer/Teacher mode

The multisensory virtual environment simulating real-life spaces comprises two modes of operation: a Developer/Teacher mode and a Learning mode. The core component of the developer mode is the virtual environment editor. This module includes three tools: (a) a 3D environment builder, (b) a force-feedback output editor, and (c) an audio feedback editor. With the 3D environment builder, the developer can define the environment's characteristics: the size and form of the room, and its objects (e.g., doors, windows, walls, rectangular boxes, cylinders). The force-feedback output editor allows attaching force-feedback effects (FFE) to all objects in the environment. Examples of FFE are vibrations produced by ground textures (e.g., stones, parquet, grass), force fields surrounding objects, and friction sensations.
The audio feedback editor allows the attachment of appropriate audio units to the objects (e.g., "first window," "turn right"). Figure 1 shows the environment-builder screen, with which the researcher/teacher can build new navigation environments according to the users' needs, at progressive levels of complexity.

Figure 1: 3D environment builder

Learning mode

The learning mode includes two interfaces: a user interface and a teacher interface.
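The three editor tools described above imply a simple underlying data model: a room of a given size and form, populated by objects that carry force-feedback effects and audio labels. The paper does not publish its implementation, so the sketch below is hypothetical (all names are invented for illustration), showing one plausible way such an environment definition could be represented:

```python
from dataclasses import dataclass, field

# Hypothetical data model for the editor's output: a room whose objects
# carry force-feedback effects (FFE) and audio labels. Names and fields
# are assumptions; the BlindAid-era system's real format is not published.

@dataclass
class ForceFeedbackEffect:
    kind: str          # "vibration", "force_field", or "friction"
    magnitude: float   # effect strength, arbitrary units

@dataclass
class EnvObject:
    name: str                     # e.g. "blackboard"
    shape: str                    # "door", "window", "wall", "box", "cylinder"
    position: tuple               # (x, y) location in the room
    effects: list = field(default_factory=list)  # attached FFE
    audio_label: str = ""         # spoken cue, e.g. "first window, turn right"

@dataclass
class Room:
    width: float
    depth: float
    objects: list = field(default_factory=list)

# A developer/teacher composes an environment from these parts:
room = Room(width=8.0, depth=6.0)
room.objects.append(
    EnvObject(name="blackboard", shape="box", position=(4.0, 5.5),
              effects=[ForceFeedbackEffect("force_field", 0.8)],
              audio_label="blackboard ahead"))
```

Separating the geometry (shape, position) from the FFE and audio attachments mirrors the paper's division of the editor into three distinct tools.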

The user interface consists of a virtual environment that simulates real rooms and objects. The subject navigates this environment using the Microsoft Force Feedback Joystick (FFJ). The feedback received while navigating the room includes sensations such as friction, objects' force fields, and vibrations. Through the FFJ the subject can get foot-level information, equivalent to the information received through the feet when walking in the real space. In addition, auditory information is generated by a "guiding computer agent," which provides appropriate references whenever the subject gets lost in the virtual space. Figure 2 shows the user-interface screen.

The teacher interface integrates a series of features serving teachers during and after the learning session. Several monitors on the screen present updated information on the subject's navigation, e.g., position and objects reached. An additional function allows the teacher to record the subject's navigation path and replay it to analyze and evaluate the subject's performance (Figure 3).

Figure 2: The user interface
Figure 3: The teacher interface

The Case Study: A blind subject's performance within the force-feedback virtual environment and in the real environment

The pilot case study aimed to analyze a subject's performance with regard to five main aspects: (a) technical issues in using the virtual environment (e.g., use of the FFJ, response to FFE); (b) ability to identify the virtual environment's components (e.g., identification of objects, recognition of spatial features); (c) navigation and mobility within the virtual environment; (d) construction of a cognitive map of the simulated room; and (e) performance in the real environment.

Method

Subject

G. is twenty-five years old and late blind (he became blind at the age of twenty). He has been a computer user for more than three years, using voice output.
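The teacher interface's record-and-replay function amounts to a timestamped log of the subject's position that can be played back after the session. The paper does not describe its logging format, so the following is a minimal sketch under assumed names:

```python
import json
import time

# Minimal sketch of a record/replay facility like the one the teacher
# interface provides: position samples are appended during navigation,
# then replayed in order for analysis. All names are hypothetical.

class NavigationLog:
    def __init__(self):
        self.samples = []  # list of (timestamp, x, y) tuples

    def record(self, x, y, t=None):
        """Store one position sample from the joystick-driven avatar."""
        self.samples.append((t if t is not None else time.time(), x, y))

    def replay(self):
        """Yield recorded positions in order, for on-screen playback."""
        for t, x, y in self.samples:
            yield t, x, y

    def save(self, path):
        """Persist the session log, e.g. for later evaluation."""
        with open(path, "w") as f:
            json.dump(self.samples, f)

# Recording a short walk along a wall, then replaying it:
log = NavigationLog()
for step, x in enumerate([0.0, 0.5, 1.0, 1.5]):
    log.record(x, 0.0, t=step)

path = [(x, y) for _, x, y in log.replay()]
print(path)  # [(0.0, 0.0), (0.5, 0.0), (1.0, 0.0), (1.5, 0.0)]
```

Such a log also doubles as the "computer log recording mechanism" used as a data-collection instrument in the study's second stage.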
Procedure

The study consisted of three stages: familiarization with the virtual environment, navigation in the virtual environment, and navigation in the real environment.

At the beginning of the familiarization stage, the subject received a short explanation of the environment's features and of how to operate the FFJ. A series of tasks followed, concerning (a) FFE and audio feedback, and (b) mobility within the virtual environment (at varied levels of complexity). Data on the

subject's performance was collected by direct observation and by video recording. This first stage lasted about three hours.

The navigation in the virtual environment stage included three tasks: (a) exploration and recognition of the virtual environment; (b) a target-object task (e.g., walk from the starting point to the blackboard in the room); and (c) a perspective-taking task (e.g., walk from the cube, in a corner of the room, to the rightmost door, the usual starting point). Following the exploration task, the subject was asked to give a verbal description of the environment and to construct a scale model of it (selecting appropriate components from a large set of alternative objects and models of rooms). Several data-collection instruments served this stage: a computer log recording mechanism, which stored the subject's movements within the environment; video recording; recordings of the subject's verbal descriptions; and the physical model built by the subject. The second stage lasted about three hours.

The navigation in the real environment stage again included two tasks: (a) a target-object task (e.g., reach and identify an object on the rectangular box); and (b) a perspective-taking task (e.g., walk from the rightmost door to the cylinder). Data on the subject's performance was collected by video recording and direct observation. The third stage lasted about half an hour.

Results

Familiarization with the virtual environment components

G. learned to work freely with the force-feedback joystick within a short period of time, walking directly and decisively towards the objects. Regarding mobility, G. could identify when he bumped into an object or arrived at one of the room's corners. From the first tasks, G. could walk around an object's corners and along the walls, using the FFE and the audio feedback.

Navigation within the virtual room

Exploration task

G. navigated the environment with rapid and secure movement (Figure 4).
He first explored the room's perimeter, walking along the walls. After two circuits he returned to the starting point and began to explore the objects located in the room. Figure 4 shows the intricate walking paths in the exploration task. The exploration session lasted about 43 minutes.

Figure 4: Subject's navigation in the virtual environment

Target-object task

To get to the required object, G. navigated the environment applying an object-to-object strategy: from the door (the starting point) G. walked to the cube, and from the cube to the target, the blackboard (Figure 5). G.

reached the target rapidly (in 20 seconds) by choosing a direct route.

Perspective-taking task

Here once again G. applied the object-to-object strategy (Figure 6): he went from the cube (the starting point in this task) to the box, and then to the target, the door (which was the starting point in the previous tasks). G. chose a direct route and completed the task in 52 seconds.

Figure 5: Target-object task
Figure 6: Perspective-taking task

Cognitive map construction

After completing the virtual environment exploration task, G. was asked to construct a model of the environment. As shown in the picture of the model composed by G. (Figure 7), the subject acquired a highly accurate map of the simulated environment. All salient features of the room are correct (form, number of doors, windows and columns), as are the relative form and size of the objects and their locations in the room.

Figure 7: Subject's model of the virtual environment

Navigation in the real environment

The subject walked through the real environment, from his very first time in it, in a secure and decisive manner. In the first task (reaching a target object: the leftmost box), G. used the entrance door as an initial reference and walked along the walls directly to the box. He completed the task in 32 seconds. In the second task (perspective-taking), G. applied the object-to-object strategy and successfully completed the task in 49 seconds.

Discussion

The case study reported in this paper is part of a research effort aimed at revealing whether, and how, work with a haptic virtual environment supports blind people's construction of spatial cognitive maps and their navigation in real environments. The case study results are encouraging. The subject, G., mastered in a short time the ability to navigate the virtual environment. He developed a fairly precise map of the simulated environment, and the completeness and spatial accuracy of this map became evident in two revealing situations. The first was the physical model built by G. after navigating the virtual room, the simulation of a space he did not know. The second was his impressive performance in the real environment: he entered the real room, which he had not known until then and had not been given the opportunity to explore, and completed the different navigation tasks efficiently and in a very short time. Based on these and other preliminary results, a systematic empirical study (involving 30 subjects) of the effects of the haptic environment on blind people's navigation abilities is currently being conducted. The results have potential implications at varied levels, for supporting blind people's acquaintance with new environments, their acquisition of spatial knowledge and skills, and their learning of concepts and subjects for which spatial information is crucial.

Acknowledgement: The study presented here is partially supported by a grant from Microsoft Research Ltd.

References

Clark-Carter, D., Heyes, A., and Howarth, C. (1986). The effect of non-visual preview upon the walking speed of visually impaired people. Ergonomics, 29 (12), 1575-1581.

Colwell, C., Petrie, H., and Kornbrot, D. (1998). Haptic virtual reality for blind computer users. Proceedings of the ASSETS '98 Conference.

Fletcher, J. (1980). Spatial representation in blind children 1: development compared to sighted children. Journal of Visual Impairment and Blindness, 74 (10), 318-385.

Foulke, E.
(1971). The perceptual basis for mobility. Research Bulletin of the American Foundation for the Blind, 23, 1-8.

Fritz, J., Way, T., and Barner, K. (1996). Haptic representation of scientific data for visually impaired or blind persons. Proceedings of the Technology and Persons with Disabilities Conference.

Giess, C., Evers, H., and Meinzer, H. P. (1998). Haptic volume rendering in different scenarios of surgical planning. Proceedings of the Third PHANToM Users Group Workshop, M.I.T.

Golledge, R. G., Klatzky, R. L., and Loomis, J. M. (1996). Cognitive mapping and wayfinding by adults without vision. In J. Portugali (Ed.), The Construction of Cognitive Maps (pp. 215-246). Netherlands: Kluwer.

Gorman, P., Lieser, J., Murray, W., Haluck, S., and Krummel, T. (1998). Assessment and validation of a force feedback virtual reality based surgical simulator. Proceedings of the Third PHANToM Users Group Workshop, M.I.T.

Hill, E., Rieser, J., Hill, M., Halpin, J., and Halpin, R. (1993). How persons with visual impairments explore novel spaces: strategies of good and poor performers. Journal of Visual Impairment and Blindness, October, 295-301.

Jansson, G., Fanger, J., Konig, H., and Billberger, K. (1998). Visually impaired persons' use of the PHANToM for information about texture and 3D form of virtual objects. Proceedings of the Third PHANToM Users Group Workshop.

Kitchin, R., and Jacobson, R. (1997). Techniques to collect and analyze the cognitive map knowledge of persons with visual impairment or blindness: issues of validity. Journal of Visual Impairment and Blindness, 91 (4).

Lynch, K. (1960). The Image of the City. Cambridge, MA: MIT Press.

Ungar, S., Blades, M., and Spencer, S. (1996). The construction of cognitive maps by children with visual impairments. In J. Portugali (Ed.), The Construction of Cognitive Maps (pp. 247-273). Netherlands: Kluwer.