MELODIOUS WALKABOUT: IMPLICIT NAVIGATION WITH CONTEXTUALIZED PERSONAL AUDIO CONTENTS

Richard Etter 1) and Marcus Specht 2)

Abstract

In this paper the design, development and evaluation of a GPS-based auditory navigation system is presented that implicitly guides a user through a contextualized rendering of personal audio files. The benefit of this navigation system is that the user can listen to his own audio contents while being navigated. Wearing headphones, the user listens to audio contents that are located in a virtual environment and simply walks in the direction from which the sound appears to originate. A formal evaluation under field conditions showed that navigation with contextualized audio contents is efficient and intuitive, and that users are highly satisfied with the navigation support given by the evaluated auditory display.

1. Introduction

To navigate a person from one place to another, two essential pieces of information have to be communicated: first, the distance to the destination (or an intermediate waypoint); second, the direction in which the destination lies, relative either to the current direction of movement or to the direction the user is facing [1]. Most auditory navigation systems use speech to communicate this information [3, 4]. The disadvantage of speech is that it requires a lot of attention. A way to communicate distances and directions without speech is spatial audio. An auditory display that uses spatial audio creates a sound that appears to emanate from a given location in the environment. Most navigation systems that use spatial audio employ it to add spatial information to spoken navigation cues [2]. A navigation system that does not use speech is AudioGPS [1]. It places a short, repeated tone in the direction of the destination, thereby indicating in which direction to go.

These traditional navigation systems have in common that they guide a user with explicit navigation instructions: they communicate instructions directly, using speech or non-speech navigation cues. Our approach is that explicit instructions are not necessary to navigate a person. In this paper we present the auditory navigation system Melodious Walkabout, which implicitly guides a user by providing awareness of where the destination is located.

1) Fraunhofer Institute of Applied Information Technology (FIT), email: richard.etter@ipsi.fraunhofer.de
2) Fraunhofer Institute of Applied Information Technology (FIT), email: marcus.specht@fit.fraunhofer.de

2. Melodious Walkabout

2.1. Scenario

Joy is in the middle of a huge, unfamiliar town. She faces the everyday problem of finding her way from one place to another. Usually she starts by asking people for directions, struggles with maps and finally panics. This time it is different: Joy puts on headphones and selects her destination and her favourite songs on her Personal Digital Assistant (PDA). The music begins to play and Joy follows it through the city. After listening to a couple of her favourite songs, Joy arrives relaxed at her destination.

2.2. Design of Melodious Walkabout

Melodious Walkabout is a PDA-based auditory navigation system. It provides a mobile user with awareness of where the destination is located by contextualizing the audio contents the user is listening to. The mobile user wears headphones and hears audio contents that reach him from a certain direction. The direction of the virtual sound source unobtrusively tells him in which direction to go. An important feature of Melodious Walkabout is that it can be used with any audio contents. This gives the user the opportunity to listen to the audio contents he likes while at the same time being aware of the direction and distance of the destination.

An auditory navigation system consists of three components:
1. A module for determining the traveller's position and movement.
2. A Geographical Information System (GIS) that defines geographical cues.
3. A user interface consisting of an input device to control the navigation system and an auditory display that audifies the geographical cues, so that the user is able to hear and understand the information.

Melodious Walkabout uses the Global Positioning System (GPS) to locate the user. A GPS receiver continuously determines the position of the user and his direction of movement. Based on this data, the GIS calculates the distance to the destination and determines the bearing of the destination relative to the user's direction of movement (a sketch of this step is given below). Melodious Walkabout does not take into account the direction in which the user is facing; the convention is that the user faces the direction of travel. However, the system could easily be augmented with a compass to track the line of sight.

2.3. An Auditory Display Model for Contextualizing Personal Audio Contents

The auditory display of the navigation system creates the illusion of a sound source reaching the user from the direction of the destination. The direction of the virtual sound source indicates in which direction the user has to move. Instead of using fixed audio contents, the auditory display of Melodious Walkabout takes the user's personal audio contents and places them in the direction of the destination. The personal audio contents are thus contextualized, since they are adapted to the geographical context of the user. The auditory display model was designed to be simple, unobtrusive, intuitively understandable and convenient. A challenge for an auditory display that uses personal audio contents is to preserve their aesthetic character. The goal was to design an auditory display model that plays audio contents in high quality and alters them in a clear but acoustically convenient way.
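To make the geometric step of Section 2.2 concrete, the following minimal sketch derives the bearing of the destination relative to the direction of movement, together with the remaining distance, from two consecutive GPS fixes. It is our own illustration rather than the system's implementation; the function names and the flat-earth approximation are assumptions.

```python
import math

def to_local_xy(lat_ref, lon_ref, lat, lon):
    """Project a latitude/longitude pair onto a local flat plane in metres
    (equirectangular approximation, adequate over a few hundred metres)."""
    r_earth = 6371000.0
    x = math.radians(lon - lon_ref) * r_earth * math.cos(math.radians(lat_ref))
    y = math.radians(lat - lat_ref) * r_earth
    return x, y

def relative_bearing_and_distance(prev_fix, curr_fix, destination):
    """Return (relative_bearing_deg, distance_m) of the destination.

    The relative bearing is measured from the user's direction of movement:
    0 = straight ahead, +90 = to the right, +/-180 = behind.  All arguments
    are (latitude, longitude) tuples in degrees.
    """
    # Direction of movement, estimated from the last two GPS fixes.
    hx, hy = to_local_xy(prev_fix[0], prev_fix[1], curr_fix[0], curr_fix[1])
    heading = math.atan2(hx, hy)            # 0 rad = north, clockwise positive

    # Bearing and distance from the current position to the destination.
    dx, dy = to_local_xy(curr_fix[0], curr_fix[1], destination[0], destination[1])
    bearing = math.atan2(dx, dy)
    distance = math.hypot(dx, dy)

    relative = math.degrees(bearing - heading)
    relative = (relative + 180.0) % 360.0 - 180.0   # normalise to [-180, 180)
    return relative, distance

# Example: walking roughly north, destination a few hundred metres north-east.
rel, dist = relative_bearing_and_distance(
    prev_fix=(50.7370, 7.0980),
    curr_fix=(50.7372, 7.0980),
    destination=(50.7390, 7.0995),
)
print(f"destination {dist:.0f} m away, {rel:+.0f} degrees from direction of travel")
```

The resulting relative bearing and distance are the two quantities that the auditory display then maps onto sound.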

To indicate the direction in which the destination is located, simple panning is used, representing the direction across the stereo sound stage. The benefit of using panning is that the user hears the music without any changes as long as he moves in the right direction. However, panning the sound source from left to right does not tell the user whether the destination is located in front of or behind him. This is achieved with a low-pass filter: the more the destination lies behind the user's direction of movement, the more strongly the sound is muffled. If the destination is located in front of the user, the low-pass filter is not used at all. In order to preserve the aesthetics of the sound or music, it was decided not to map the distance to the destination for most of the journey; only towards the end of the navigation is the distance indicated. The volume of the sound decreases towards zero as the user approaches the destination. This refocuses the user on his real acoustic environment and indicates the remaining distance to the destination.

3. Implementation

Melodious Walkabout consists of a GPS receiver, a Personal Digital Assistant (PDA) and headphones. Since users move outdoors, it is important not to shut out the user's real auditory environment completely. Therefore open headphones are used, which enable the user to hear both his acoustic surroundings and the played audio contents. The lightweight GPS receiver is mounted on top of the headphones, since reception is best there. A PDA was used because it is small enough to be carried on a belt or in a pocket; thus the system can be worn conveniently and the user can move freely. In addition, the technology used is widespread and available worldwide.

With regard to audio processing, special attention was paid to high sound quality. This is important since one can assume that users would not appreciate hearing, for example, their favourite songs distorted. However, digital signal processing (DSP) requires considerable computing power, and the audio filters have to be applied to the audio contents in real time. To implement the system on a PDA, fast, efficient and high-quality audio filters had to be developed (a code sketch of the resulting direction-to-sound mapping is given before Section 4.2).

4. Evaluation

A first evaluation focused on the following questions: Is implicit navigation with contextualized personal audio contents intuitively understandable? Is this type of navigation efficient? Does the use of familiar audio contents increase the effectiveness of the navigation?

4.1. Methodology

A total of 24 subjects were recruited. Each subject tested the prototype on two different but equivalent routes. On one route subjects were navigated with a song they did not know; on the other route they were navigated with their own music, having been asked to bring their favourite song. Both the order of the routes (route A and route B) and the music (own music vs. unknown music) were counterbalanced. The subjects had to find out by themselves how the navigation works; no further explanation was given to them. Melodious Walkabout automatically measured the time each subject needed to complete a route and the distance covered. After completing the two routes, each subject was asked to fill out a questionnaire. This method was chosen to get a comprehensive understanding of how the subjects perceived the navigation.
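Before turning to the results, the following sketch pulls together the mapping described in Section 2.3 and the real-time filtering mentioned in Section 3: one mono block of the user's music is rendered to stereo with constant-power panning, a one-pole low-pass filter for the front/back cue, and a volume fade near the destination. It is our own illustration, not the Melodious Walkabout DSP code, and all parameter values (cut-off frequencies, fade radius) are assumptions.

```python
import numpy as np

def render_block(samples, rel_bearing_deg, distance_m, state,
                 sample_rate=44100, fade_radius_m=20.0):
    """Render one mono float block of the user's music into a stereo block.

    rel_bearing_deg: bearing of the destination relative to the direction of
    movement (0 = ahead, +90 = right, +/-180 = behind); distance_m: remaining
    distance; state: dict carrying the low-pass filter memory between blocks.
    """
    theta = np.radians(rel_bearing_deg)

    # Constant-power panning: the destination's bearing is spread across the
    # stereo stage; with the destination straight ahead the music is centred
    # and unchanged.
    pan = np.sin(theta)                              # -1 = left .. +1 = right
    left_gain = np.cos((pan + 1.0) * np.pi / 4.0)
    right_gain = np.sin((pan + 1.0) * np.pi / 4.0)

    # Front/back cue: the farther the destination lies behind the user, the
    # lower the cut-off of a one-pole low-pass filter, i.e. the more muffled
    # the sound.  With the destination ahead the cut-off is high enough that
    # the filter is effectively transparent (assumed values).
    behind = 0.5 * (1.0 - np.cos(theta))             # 0 = ahead .. 1 = behind
    cutoff_hz = 18000.0 * (1.0 - behind) + 500.0 * behind
    alpha = 1.0 - np.exp(-2.0 * np.pi * cutoff_hz / sample_rate)
    y = state.get("lp", 0.0)
    filtered = np.empty_like(samples)
    for i, x in enumerate(samples):                  # simple one-pole IIR
        y += alpha * (x - y)
        filtered[i] = y
    state["lp"] = y

    # Distance is only mapped near the goal: the volume fades towards zero
    # inside the last fade_radius_m metres, handing the user back to the
    # real acoustic environment.
    volume = min(1.0, distance_m / fade_radius_m)

    return np.stack([filtered * left_gain * volume,
                     filtered * right_gain * volume], axis=1)

# Example: half a second of a placeholder mono signal, destination 40 m away
# and 60 degrees to the right of the direction of travel.
mono = np.sin(2 * np.pi * 440 * np.arange(22050) / 44100).astype(np.float32)
stereo = render_block(mono, rel_bearing_deg=60.0, distance_m=40.0, state={})
```

In a real-time implementation on a PDA the per-sample Python loop would of course be replaced by an optimised filter routine, but the mapping itself stays the same.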

4.2. Results

The measured data showed the following. The subjects completed the second route in significantly less time than the first: on average, they completed the first route in 4:40 minutes and the second in 3:30 minutes. The distance covered on the second route was also significantly shorter than on the first route; on average, subjects improved from 342 to 256 meters. The optimal length of both routes was 210 meters. Regarding the type of music, no significant influence was detectable: the subjects performed equally well with their own music and with unknown music. However, the questionnaire data indicated that subjects perceived the navigation cues as significantly clearer, and were significantly more satisfied with the navigation, when using their own music. Overall, the subjects were satisfied with the navigation; they stated that it worked well and required only minimal attention.

4.3. Discussion

It is important to note that the subjects had to find out by themselves how the navigation works. On average, subjects completed the second route 27% faster, and the distance they covered was 25% shorter. After spending an average of 4:40 minutes with the system, they were able to complete the second route close to the optimum. These results show that the subjects understood the navigation intuitively and quickly, and that a strong learning effect was present. Implicit navigation with contextualized audio contents is efficient.

The subjects used the navigation system with their own music and with unknown music. Regarding the objective data, the music did not have any impact on the time needed or the length of the path. However, the questionnaire results revealed that the subjects perceived the navigation cues as significantly clearer and were significantly more satisfied with the navigation when using their own music. This means that, from a subjective perspective, the navigation with their own music was more effective. Since the subjects were free to bring any song they liked, the system was tested with a variety of music styles, ranging from piano and electronic music to traditional Chinese music. Even songs with stereo and surround effects had no detectable effect on the navigation.

5. Conclusion

Melodious Walkabout has been shown to be an efficient alternative to speech-based navigation systems, and lifestyle-oriented audio applications in particular could make use of such an auditory display. The audio mapping implemented and evaluated was simple, yet intuitive and efficient for user navigation. Nevertheless, the evaluated auditory display model is just one possible audification of context parameters such as the position and direction of a user relative to the goal. The system allows dynamic adaptation of given audio contents and could be used for a variety of purposes to bring awareness to the user in a mobile context. Especially with regard to mobile devices, the possibilities of audio interfaces, and of conveying multiple pieces of information to the user through audification, are often underestimated. Melodious Walkabout shows how the encoding of context parameters, combining panning and audio filters, can be used to give navigational cues and other information. In the work presented we explicitly constrained the evaluation of the system to a single-user scenario.

The technology available can easily be extended to a multi-user system with auditory displays. We believe that multi-user auditory displays will have an important impact on user interface development for mobile and pervasive computing, as they allow information to be encoded in an unobtrusive yet effective and intuitively understandable manner.

References

[1] HOLLAND, S., MORSE, D.R., and GEDENRYD, H., AudioGPS: Spatial audio navigation with a minimal attention interface, in: Personal and Ubiquitous Computing, 6:253, 2002.
[2] LOOMIS, J.M., GOLLEDGE, R.G., and KLATZKY, R.L., Navigation system for the blind: Auditory display modes and guidance, in: Presence, Vol. 7, p. 199, 1998.
[3] MAKINO, H., OGATA, T., and ISHII, I., Basic study for a portable location information system for the blind using a global positioning system, Technical Report of the IEICE, 1992.
[4] STROTHOTTE, T., PETRIE, H., and REICHERT, L., Development of dialogue systems for a mobility aid for blind people: initial design and usability testing, in: Proceedings of the Second Annual ACM Conference on Assistive Technologies, pp. 139-144, ACM Press, 1996.