Interactive Exploration of City Maps with Auditory Torches


Wilko Heuten, OFFIS, Escherweg 2, Oldenburg, Germany, Wilko.Heuten@offis.de
Niels Henze, OFFIS, Escherweg 2, Oldenburg, Germany, Niels.Henze@offis.de
Susanne Boll, University of Oldenburg, Escherweg 2, Oldenburg, Germany, Susanne.Boll@informatik.uni-oldenburg.de

Copyright is held by the author/owner(s). CHI 2007, April 28 - May 3, 2007, San Jose, USA. ACM 1-xxxxxxxxxxxxxxxxxx.

Abstract

City maps are an important means to get an impression of the structure of cities. They represent a visual abstraction of urban areas with different geographic entities, their locations, and spatial relations. However, this information is not sufficiently accessible today to blind and visually impaired people. To provide non-visual access to map information, we developed an interactive auditory city map, which uses 3D non-speech sound to convey the position, shape, and type of geographic objects. For the interactive exploration of the auditory map, we designed a virtual walk-through. This allows the user to gain an overview of an area. To be able to focus on certain regions of the map, we equip the user with an auditory torch. With the auditory torch, users can change the number of displayed objects in a self-directed way. To further aid in getting a global idea of the displayed area, we additionally introduce a bird's eye view on the auditory map. Our evaluation shows that our approaches enable the user to gain an understanding of the explored environment.

Keywords

Sonification, auditory display, 3D sound, exploration, interaction techniques, city maps, orientation

ACM Classification Keywords

H.5.1 Multimedia Information Systems: Audio input/output; H.5.2 User Interfaces: Auditory (non-speech) feedback.

Introduction

Whether blind, visually impaired, or sighted, our quality of life greatly depends on our ability to make spatial decisions. Sighted people typically use visual maps to make themselves familiar with spatial relations. Maps are used to build a mental model of a spatial environment, which helps people to navigate and orient within this environment. Access to visual maps is very difficult, if not impossible, for blind and visually impaired people. However, a mental model of the environment is very important for them in order to find their way, as they are not able to perceive visual landmarks such as signs and buildings while maneuvering. As Jacobson stated in [1], an idea of the area enhances their wayfinding and orientation skills. Through an image of the environment and its phenomena, it is possible to improve the quality of life of visually impaired and blind people through increased mobility and independence. An information presentation is needed which allows a blind or visually impaired user to access the same map information as a sighted person, however with a different sense and channel.

Different approaches to provide blind and visually impaired people with geographic information have been developed in the past. The most common projects focus on tactile exploration of maps. With physical tactile maps, computer-based tactile approaches [2,3], and similar auditory approaches [4], the user moves a finger or a pointing device across the map. However, geographic objects are only perceivable if the user directly points to the object. Therefore, it is difficult to find certain objects, as the whole map has to be explored, e.g., from the top left to the bottom right.
Also, current approaches suffer from the inability to present more than one object at the same time, making it a challenge to understand spatial relations between geographic objects, such as distances and directions. We can overcome these drawbacks by presenting map entities through non-speech sound objects, which are played concurrently and also provide information about their location at the same time. We developed a system which enables the user to explore digital city maps using an auditory display [5]. With our system, each geographic feature, such as a lake or a park, is represented by a corresponding natural sound, like dabbling water or a singing bird. These sounds are placed on a horizontal plane within a virtual room. Their location, illustrated in Figure 1, is equivalent to the position of their visual peer on the map.

Figure 1. Illustration of our sonification of city maps. The highlighted areas represent different geographic objects. Each object type is sonified by a corresponding sound.
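The placement scheme can be sketched in a few lines; this is a minimal illustration, and the sound file names, object types, and coordinate convention (x/z spanning the horizontal plane, y as height) are assumptions for the example, not details taken from the actual system.

```python
# Natural, non-speech sounds chosen to resemble their real-life counterparts
# (file names are hypothetical).
SOUND_FOR_TYPE = {
    "lake": "dabbling_water.wav",
    "park": "singing_birds.wav",
}

def place_on_sound_plane(obj_type, map_x, map_y):
    """Place an object's sound on the horizontal plane of the 3D sound room,
    at the same position its visual peer occupies on the map (height = 0)."""
    return {"sound": SOUND_FOR_TYPE[obj_type], "position": (map_x, 0.0, map_y)}
```

For example, a lake drawn at map position (2, 3) would emit its water sound from (2, 0, 3) in the sound room.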

City maps typically contain many hundreds of geographic objects such as buildings, parks, and specific points of interest. To get a global idea of the city, it is sufficient to display only the most prominent features. Nevertheless, the number of objects needed to provide an appropriate overview exceeds the human ability to perceive parallel sounds. Recent research by Brazil and Fernström [6] showed that the identification accuracy of different sounds clearly decreases as the number of concurrently played sound objects increases. When playing fewer than six concurrent sound objects, almost 85% of the objects can be recognized correctly. When playing more than six objects, the identification accuracy decreases below 50%. Therefore, the number of simultaneously played objects must be reduced. We filter the objects according to their location. Interacting with the auditory display, the user chooses a region of interest on the map, which is then presented on the auditory map. By actively changing the region, the user interactively explores the map.

We developed different techniques to interact with the auditory map. With the first one, described in [5], the user virtually walks on the map and perceives all objects in the surrounding area by changing the position of a virtual listener. We enhanced this technique by equipping the user with a so-called auditory torch, which acoustically illuminates the user's surroundings. With the auditory torch, the user can change the size of the perceived region and focus on smaller details by using a small torch, or get a global impression using a larger one. To further aid in getting a more global idea of the map, we introduced a third technique, which raises the user's virtual position from the map to a bird's eye view. While all approaches focus on different aspects, the goal of all interaction techniques is to aid the user in exploring the map and gaining a cognitive model of the displayed area.
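The concurrency limit motivated by [6] can be sketched as a simple filter that keeps only the sounds closest to the user's region of interest. Selecting by distance to the region's center is one plausible policy for the example; the actual filtering criterion is the user's chosen region, and the data layout here is assumed.

```python
import math

MAX_CONCURRENT = 6  # identification accuracy drops sharply above six sounds [6]

def audible_objects(objects, region_center, limit=MAX_CONCURRENT):
    """Keep at most `limit` sounds playing concurrently by selecting the
    objects nearest to the center of the region of interest. Each object is
    a dict with a 2D "position" on the map plane (hypothetical layout)."""
    def distance(obj):
        ox, oy = obj["position"]
        cx, cy = region_center
        return math.hypot(ox - cx, oy - cy)
    return sorted(objects, key=distance)[:limit]
```

As the user moves the region of interest, re-running the filter swaps distant sounds out and nearby ones in, keeping the concurrent count within the perceptual limit.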
In the following sections, we describe in more detail the three interaction techniques for exploring the auditory map, followed by the results of our evaluation and an outline of our future work.

Virtual Walk through the City

To get an idea of the city's general spatial layout, it is necessary that the user can easily perceive the most prominent features of the city. In a first step, we analyzed city maps and identified parks, lakes, sights, squares, and public buildings as the main features. These objects are presented using an auditory display. According to its type, location, and shape, each geographic object is represented by an individual sound. Different object types are sonified using non-speech sounds that provide some correlation to the real-life object. For example, a lake is represented by the sound of dabbling water and parks by singing birds. To display the objects' shape and location, we place the sounds in a 3D sound room. All objects represented by a two-dimensional area are located on a plane within the sound room. Their position and shape on the plane is equivalent to their position and shape on the map.

If all main features of the map are sonified at the same time, the user cannot distinguish the objects and identify their properties, like type and position. Therefore, we make the objects accessible by placing their sounds relative to a virtual listener. The user can freely move this listener across the plane on which the objects are located, as shown in Figure 2. Thus, the user virtually walks through the city. As long as the listener

is outside an object, its sound is placed at the point on the object's border with the smallest distance between the border and the listener. Moving the listener around an object changes the position of the sound on the object's border. Thus, the user always hears the object's nearest point and can thereby construct the object's silhouette.

To further aid in understanding the map's global structure, the listener's position is controlled by an absolute input device. The displayed objects are mapped onto the surface of a digitizer tablet, and the listener can be moved with the tablet's stylus. Moving the stylus on the tablet accordingly moves the listener on the map. The user can feel the extent of the tablet. By feeling and controlling the stylus position, the user perceives the listener's position relative to the map's border. Knowing the listener's exact position on the map eases locating the surrounding objects.

Figure 2. The user explores the map by moving the listener across the map. Depending on the listener's position, the objects can be heard at different intensities and from different directions.

To ease the understanding of the objects' relative arrangement, for instance that a park is left of a lake, the user perceives all nearby objects simultaneously from the listener's position on the map. If the user points between a park on the left and a lake on the right, the user hears the park from the left and the lake from the right accordingly. Thus, the user can sense relative directions. Because nearby objects are louder than farther objects, the user can also perceive the distances between objects.

Illuminating the City

Displaying more and more objects raises the problem that it becomes difficult to distinguish objects of the same kind which are located close to each other, and to perceive the objects' shape and size. Therefore, we enable the user to dynamically concentrate on a certain region of the map.
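The border-following sound placement described above for the virtual walk-through can be sketched for polygonal objects: for each border segment, project the listener onto the segment and keep the closest of those projections. This is an illustrative geometric sketch under the assumption that objects are given as vertex lists; it is not the system's actual implementation.

```python
def closest_point_on_segment(p, a, b):
    """Project point p onto segment ab and clamp to the segment's endpoints."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return a  # degenerate segment
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))
    return (ax + t * dx, ay + t * dy)

def sound_position_for(listener, polygon):
    """Place an object's sound at the point on its border nearest to the
    listener. `polygon` is a list of (x, y) vertices in order."""
    best_d, best_q = None, None
    for i in range(len(polygon)):
        q = closest_point_on_segment(listener, polygon[i],
                                     polygon[(i + 1) % len(polygon)])
        d = (q[0] - listener[0]) ** 2 + (q[1] - listener[1]) ** 2
        if best_d is None or d < best_d:
            best_d, best_q = d, q
    return best_q
```

Moving the listener around a square lake, for instance, makes the returned point slide along the lake's border, which is what lets the user trace the silhouette by ear.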
Our solution is an auditory torch, which is moved with the listener and virtually illuminates its surroundings, as introduced by Donker et al. [7]. As shown in Figure 3, some objects on the map are shadowed, which means that they remain silent. Only illuminated objects are hearable. When the torch approaches an object, the object is illuminated and its sound smoothly gets louder. By changing the brightness of the torch, the user determines the area he or she is currently hearing. The brighter the torch, the larger the illuminated area, and the more distant objects are added to the auditory presentation. The user perceives a wider region, and it becomes easier to find more distant objects. In order to focus on a smaller region and get a more detailed description, the user can dim the brightness of the torch.
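The torch's distance-dependent gating can be sketched as a gain function per object: objects beyond the torch radius stay silent, and objects near the edge fade in smoothly rather than switching on abruptly. Treating brightness as a radius and fading linearly over a narrow band at the edge are assumed details for this sketch, not values from the paper.

```python
def torch_gain(distance, brightness, fade=0.1):
    """Gain for an object's sound under the auditory torch.

    `distance`   -- distance from the listener to the object's sound position
    `brightness` -- torch radius: a brighter torch illuminates a larger area
    `fade`       -- fraction of the radius over which the sound fades in
                    (assumed linear fade, an illustrative choice)"""
    if distance >= brightness:
        return 0.0                       # in the shade: remains silent
    edge = brightness * (1.0 - fade)
    if distance <= edge:
        return 1.0                       # fully illuminated
    return (brightness - distance) / (brightness - edge)  # smooth fade band
```

Dimming the torch then simply means calling the function with a smaller `brightness`, which silences farther objects and narrows the perceived region.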

Figure 3. The objects located in the shade remain silent, while the lighted objects play gently.

Listening like a Bird

The auditory torch enables the user to investigate either small or large regions, depending on personal preferences or tasks. The user's mental model is created from the listener's point of view on the map. The auditory localization of objects is always related to the listener's position and is therefore relative. Absolute localization can be performed only through the absolute pointing device. To enhance absolute localization also through the auditory sense, and thus the perception of the map's global layout, we developed a third technique, which presents the auditory map from a bird's eye view. To underline the objects' absolute positions on the map, we enable the user to step back and take a look at the map. The user's listening position is raised out of the map's plane, as shown in Figure 4. This means that the sound of an object in the upper left corner of the map is perceived as being in the upper left, no matter how the cursor is moved. To still enable the user to focus on parts of the map, we use the same torch metaphor as described above. By moving the cursor around the map, the user can select the perceived region and its size.

Figure 4. Interacting with torches: the listener walks through the map (left) or observes the map from a fixed and distant location, like a bird (right).

Evaluations

To test our interaction techniques, we conducted two evaluations using headphones and a 3D sound library. In the first evaluation, we investigated the virtual walk-through with eleven blind participants. The evaluation consisted of three experiments according to the following aspects: the auditory map should aid in building and reproducing a mental model of an unknown area, mediate spatial relations between objects, and show relative distances between objects.
We found that concrete tasks like "find a lake which is inside a park" or "find the lake which is nearest to a building" were managed easily. Even though our results are promising, we found two challenges. When presenting more than ten objects simultaneously, it was difficult to distinguish objects of the same type when they were located close to each other. In addition, most participants could not reproduce the objects' shapes precisely.

Our tests of the two torch-based interaction techniques focused on reproducing the map. Six untrained persons explored and sketched the auditory presentation of a map of Brussels with one of the two techniques. An example of a sketch is shown in Figure 5. The results of all three interaction techniques did not show significant differences in the quality of the reproduced maps. Further evaluations are planned to investigate the techniques specifically for certain user tasks, e.g., getting an overview, perceiving the shape of objects, measuring distances, and following paths.

Figure 5. The presented auditory map of Brussels is shown on the left and the drawn impression on the right.

Conclusion and Future Work

We presented Auditory Maps, a system that sonifies city maps using 3D non-speech sound. It enables blind users to build a mental model of a city by determining the type and location of geographic objects and their relations. Three interaction metaphors can be used to explore an auditory map: the virtual walk-through, the walk-through with an auditory torch, and the bird's eye view with an auditory torch. All techniques support the user in getting a non-visual overview of a city as a first step in the navigation process. The next step is the planning and exploration of routes. We plan to investigate these tasks and to apply haptic and tactile feedback where necessary. Our future work will also concentrate on using the proposed concepts for other map types, e.g., political, choropleth, and weather maps, and for applications supporting sighted users when their visual sense is already occupied by more critical tasks, e.g., the presentation of spatial information while driving a car.

Acknowledgments

This paper is supported by the European Community's Sixth Framework Programme (FP6-2003-IST-2-004778).

References

[1] Jacobson, R. D., Cognitive mapping without sight: Four preliminary studies of spatial learning. Journal of Environmental Psychology 18, 1998, 289-305.
[2] Gallagher, B., and Frasch, W., Tactile acoustic computer interaction system (TACIS): A new type of graphic access for the blind. In Proc. TIDE 1998.

[3] Iglesias, R., Casado, S., Gutierrez, T., Barbero, J., Avizzano, C., Marcheschi, S., and Bergamasco, M., Computer graphics access for blind people through a haptic and audio virtual environment. In Proc. HAVE 2004, 13-18.

[4] Zhao, H., Smith, B. K., Norman, K., Plaisant, C., and Shneiderman, B., Interactive sonification of choropleth maps. IEEE MultiMedia 12(2), 2005, 26-35.

[5] Heuten, W., Wichmann, D., and Boll, S., Interactive 3D sonification for the exploration of city maps. In Proc. NordiCHI 2006, 155-164.

[6] Brazil, E., and Fernström, M., Investigating concurrent auditory icon recognition. In Proc. ICAD 2006, 51-58.

[7] Donker, H., Klante, P., and Gorny, P., The design of auditory user interfaces for blind users. In Proc. NordiCHI 2002, 149-156.