Buddy Bearings: A Person-To-Person Navigation System


George T Hayes, School of Information, University of California, Berkeley, 102 South Hall, Berkeley, CA 94720-4600, ghayes@ischool.berkeley.edu

Dhawal Mujumdar, School of Information, University of California, Berkeley, 102 South Hall, Berkeley, CA 94720-4600, dhawal@ischool.berkeley.edu

Thomas Schluchter, School of Information, University of California, Berkeley, 102 South Hall, Berkeley, CA 94720-4600, thomas@ischool.berkeley.edu

Copyright is held by the author/owner(s). CHI 2010, April 10-15, 2010, Atlanta, Georgia, USA. ACM 978-1-60558-930-5/10/04.

Abstract
This paper proposes a mobile application that facilitates meetings between people in unmarked spaces. We report on the concept, aspects of the work-in-progress implementation, and future steps.

Keywords
3D sound, navigation

ACM Classification Keywords
H.5.1 [Multimedia Information Systems]: Audio input/output; H.5.2 [User Interfaces]: Input devices and strategies, Haptic I/O

General Terms
Design, Experimentation, Human Factors

Introduction
Buddy Bearings is a mobile application, currently at the concept stage, that enables people outdoors to find each other without having to negotiate a meeting place. This approach is useful where locations are not readily addressable by all participants: places with no marked or identifiable features, such as natural spaces, or unfamiliar locations that users have no prior experience navigating. The goal in designing Buddy Bearings is to promote an exploratory and playful method of navigation that lets users learn about and make decisions within their environment. Current navigation systems assume that users want one predetermined set of directions to a fixed location; Buddy Bearings' innovation is its flexibility in wayfinding to a potentially moving target. Although using Buddy Bearings is a goal-driven activity, it makes the actual navigation a background task. We focused on providing guidance through sound cues rather than visual means, eliminating the need to hold a device that demands the user's constant attention. Buddy Bearings differs significantly from traditional navigation systems and from previous research in ways we describe below.

Previous Work
Research on orientation techniques and wayfinding devices that offer alternatives to purely visual interfaces dates back to the mid-1980s. The efforts most relevant to our project fall into two main areas:

- Orientation in virtual worlds/cyber-environments
- Orientation in the real world for people with visual impairments

Orientation in Virtual Worlds/Cyber-Environments
Darken and Sibert [1] present a "Toolset for Navigation in Virtual Environments". Based on research that maps real-world navigation and orientation behavior to interactions with virtual worlds, the paper explores ways to maintain knowledge of position and orientation. Among the tools used in the virtual worlds are spatialized auditory cues that effectively enlarge the user's perception of objects in sight. As the authors note, the findings from this study have limited applicability to real three-dimensional navigation. Nevertheless, the study supports the basic hypothesis that auditory cues can augment the exploration of a (cognitively) unmapped space by providing orientation. In the same vein, but with greater theoretical depth, Begault [2] systematizes the use of 3D audio for virtual reality systems. For orientation purposes, sounds placed outside the stereo basis using spatial modeling techniques yield excellent results under stable conditions. Most listeners can locate the simulated source of a sound in all three dimensions with relative precision, a result corroborated by similar experiments in a non-virtual spatial setup [3]. Reproducing the stable conditions these results require, however, is infeasible in the setting Buddy Bearings targets. In particular, spatializing an audio signal played back through headphones is unreliable unless it is customized to the individual physiological features of the listener.

Orientation in the Real World for People with Visual Impairments
The second relevant strand of research explores ways to provide spatial orientation to people with partial or complete loss of vision in everyday situations. Ross and Blasch [4] evaluate the effectiveness of wearable devices that employ a mixture of auditory and haptic feedback to guide a user from waypoint to waypoint. According to their findings, haptic feedback yields the most reliable routing results, while Marston et al. [5] report equally successful experiments with both haptic and auditory feedback. The latter study also suggests that a relatively simple, binary sound feedback is sufficient for waypoint navigation.

The studies most similar to our project were conducted by the Sonification Lab at the Georgia Institute of Technology. The SWAN project (System for Wearable Audio Navigation) [6] uses 'sound beacons' to represent locations that a visually impaired user is supposed to reach along a predefined path. Sound beacons replace the visual cues a sighted person would use for navigation with sound signals that are manipulated according to the listener's distance and heading. While Buddy Bearings builds on a similar idea, namely directing users to a location represented by a virtual sound source, two key aspects differentiate it from previous work.

1. The domains in which the aforementioned projects operate constrain the form of guidance these systems provide. Since it is vitally important that a visually impaired person not stray from a predefined path that can be considered reasonably safe, all of these systems take a traditional waypoint approach: the route consists of a finite number of positions that must be reached in order to proceed. In that sense, these projects emulate car navigation systems and mainly optimize their output for low cognitive load. Buddy Bearings is targeted at users who can orient themselves visually, and aims to augment their normal wayfinding experience with generalized directional guidance.

2. Buddy Bearings has no concept of predefined locations. The users cannot know in advance what their meeting point will be. Instead, Buddy Bearings aims for a fair distribution of effort by constantly adapting the meeting point to the relative positions of the users. As a consequence, the exploratory aspect of the navigation process gains importance. Buddy Bearings tries to bring an element of playfulness to the task of finding another person, and relies on the participants' ability to decide autonomously which route they want to take towards a flexible goal.

Use Case: Two People in an Environment That Is Not Familiar to One of Them
Audrey and Jim haven't seen each other in a while. When Audrey comes to the Bay Area on a business trip, they decide to meet. Jim, a graduate student in the Music Department at UC Berkeley, knows his way around campus very well. Audrey has never visited UC Berkeley before. Following Jim's advice to find parking close to campus, Audrey drives her car to North Gate; from there, it's either biking or walking.

After parking her car, Audrey opens the Buddy Bearings application on her smartphone and issues a Buddy Bearings request to Jim. A text message containing a session link is sent to Jim, who is already waiting for Audrey's request. He accepts and agrees that his location information may be used during the session. Both are now prompted to follow the orientation process. On their phone screens, they see a representation of their own position in relation to the other, as well as the distance between them. Both start walking in this general direction as far as their surroundings allow. Because of the layout of their surroundings, they can't always follow a straight path.

Buddy Bearings constantly picks up their location and bearing through the built-in GPS and digital compass modules of their phones, and the meeting point between them is recalculated as they move. At regular intervals, they are alerted to the position of the meeting point through auditory signals. The intensity of the signals indicates how far off to the left or right their bearing is from the position of the meeting point: if they turn towards the meeting point while the signal is sounding, it fades in intensity; if they turn away, it rises. Whenever their bearing falls within a certain range of the meeting point's, they hear an affirmative sound distinct from the directional cues. When either party feels the need, they can request a re-orientation through their phone; again, they are shown the relative direction of their Buddy as well as the total distance. The process continues until the Buddies are close enough to start looking for each other. Once reunited, they quit the application.
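The fade/rise behavior described above hinges on a small but easy-to-get-wrong computation: the signed difference between the user's compass heading and the bearing to the meeting point must wrap correctly around north, so that headings of 350 and 10 degrees read as 20 degrees apart, not 340. A minimal sketch of that helper (the function name and test values are ours for illustration; the paper does not describe its implementation):

```python
def signed_heading_error(heading_deg: float, target_bearing_deg: float) -> float:
    """Signed offset from the user's heading to the target bearing.

    Negative values mean the target lies to the left, positive to the
    right; the result is normalized to [-180, 180) so that headings on
    either side of north compare correctly.
    """
    return (target_bearing_deg - heading_deg + 180.0) % 360.0 - 180.0


# Turning toward the target shrinks the magnitude (signal fades);
# turning away grows it (signal rises).
assert signed_heading_error(350.0, 10.0) == 20.0   # target 20 degrees to the right
assert signed_heading_error(10.0, 350.0) == -20.0  # target 20 degrees to the left
```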

System Overview
Ideally, Buddy Bearings will work as a smartphone application that uses the digital compass and GPS capabilities found in the current model of Apple's iPhone. It will allow users to connect to each other and then use an initial visual cue, followed by a series of dynamically updated auditory cues, to find the central point between the two users. The GPS system constantly reports positional data for both users, and the digital compass provides information about each user's heading. Using these inputs, the application calculates the common midpoint between the users at regular intervals. The application uses directional audio output to indicate whether the user is currently headed in the right direction; the auditory cues vary in intensity according to the user's position relative to the midpoint.
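To make the geometry concrete, the sketch below computes the geographic midpoint of two GPS fixes and the initial bearing from one user to that midpoint, using the standard spherical-earth formulas. This is our illustration of the calculation the overview describes; the function names and the sample coordinates (near the UC Berkeley campus) are assumptions, and the paper does not specify how the application itself performs this computation.

```python
import math

def midpoint(lat1, lon1, lat2, lon2):
    """Geographic midpoint of two positions on a spherical earth.

    All angles in degrees; returns (lat, lon) in degrees.
    """
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    bx = math.cos(p2) * math.cos(dl)
    by = math.cos(p2) * math.sin(dl)
    pm = math.atan2(math.sin(p1) + math.sin(p2),
                    math.hypot(math.cos(p1) + bx, by))
    lm = math.radians(lon1) + math.atan2(by, math.cos(p1) + bx)
    return math.degrees(pm), math.degrees(lm)

def initial_bearing(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2,
    in degrees clockwise from true north, range [0, 360)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    x = math.sin(dl) * math.cos(p2)
    y = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return (math.degrees(math.atan2(x, y)) + 360.0) % 360.0

# Each update cycle: recompute the midpoint from both users' latest
# GPS fixes, then compare the compass heading against the bearing to
# it (illustrative coordinates only).
mid_lat, mid_lon = midpoint(37.8735, -122.2600, 37.8701, -122.2590)
bearing = initial_bearing(37.8735, -122.2600, mid_lat, mid_lon)
```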

Visual Interface & Interactions
Alongside the auditory guidance system, a visual interface serves four purposes: initiating a session, connecting with another user, providing a directional visual orientation, and presenting a trip record. The visual interface is deliberately minimal, focusing on just these four areas so that users are not encouraged to routinely look down at the screen while navigating. The primary experience should be auditory; more complex tasks are reserved for before the session begins and after it has been terminated, once the user has found their Buddy.

The Main Menu is the first screen that loads when the application launches. Its primary call to action is Connect with a Buddy, which lets a user connect with another user by adding a number from their phone's address book, typing in a number, or scrolling through their Buddies list. Once the user has selected how they want to connect, the system generates a text message to the Buddy with a link to either download and sign up for Buddy Bearings or launch the application. Alternatively, a user can scroll through their entire Buddies list and connect directly to anyone they've initiated a session with in the past. The main menu also shows notices when another user has invited you to a Buddy Bearings session, which you can select to confirm and begin using the service.

Figure 1. (Re)-orientation screen that displays the Buddy's position in relation to the user.

Once a user confirms and begins a session, they are initially given a visual cue showing where they are facing in comparison to their Buddy (Fig. 1). A central figure represents the user through an icon they have chosen for their profile. A circle drawn around the user shows their heading with an arrow, along with the direction of their Buddy relative to them. The circle is red while the user is misaligned with their Buddy; once the user adjusts to match their Buddy's direction, it turns green. When the user is aligned with their Buddy, they receive their first audio cue for direction, and the device can be stowed for the remainder of the session. Users can request an updated visual (re)-orientation at any point during the session.

When the trip has been terminated and a user has found their Buddy, a record of the trip is generated on a traditional digital map (Fig. 2). It indicates where the user and their Buddy each started walking, the paths they took, and where they ended up meeting. It might also give statistics on how far each of them travelled, how many turns or re-orientation points they incurred, and how long the session lasted.

Figure 2. Recapitulation of the process after the session has been terminated.

Prototype Development and Evaluation
To model the required interactions, we carried out a use case simulation. Two people equipped with video cameras recorded a self-directed walk towards a location that had been agreed on earlier. The setup emulated a situation where two people independently choose a path to a shared point. Based on timestamps, we plotted the movements on a map, reconstructing the route and the approximate headings of the participants at any given time (Fig. 3). Using this data, we could identify the most common events and constellations the system needs to cover. After repeating the simulation several times at different locations, it became obvious that the dominant case involves both users being slightly off target (typically within a range of +/- 30 degrees) while generally proceeding in a direction that makes a meeting reasonably probable.

Figure 3. The transcription shows the approximate position of the user at a certain timestamp and displays her bearing as a vector.

The first physical prototype aimed to translate these results into distinctive patterns of audio feedback. With this prototype, we wanted to test whether people could distinguish various dynamic auditory cues to establish a direction. A digital compass unit was connected to a laptop computer that parsed the incoming data. A program developed for this purpose used the directional data from the digital compass to generate dynamic auditory cues based on a comparison of the user's heading and the actual direction of the destination (due north for testing purposes). To allow for free movement, the laptop was placed in a backpack, and the compass unit was affixed to one of the backpack's straps to ensure that the readings were not influenced by axis rotation (Fig. 4).

Figure 4. First iteration of the physical prototype with the digital compass unit attached to the left strap of the backpack.

We provided three different localizable auditory cues on each side, varying with the user's bearing. The most effective cues took the form of clicks at variable rates: the more the user was misaligned with the target, the faster the click rate he or she heard. These cues were played in groups of three ("bursts"), so that there would be a longer interval of silence between them. The reasoning behind this approach was to achieve regularity of feedback without creating an overbearing system that bombarded the user with constant artificial sounds. To reassure the user when turning in the right direction, a different sound was played as soon as the bearing fell within a range of +/- 10 degrees. In-ear headphones were used to deliver the auditory cues.
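This feedback scheme amounts to a small decision rule mapping angular error to a cue. The sketch below is our reconstruction: the three cue levels per side, the bursts of three clicks, the affirmative band of +/- 10 degrees, and faster clicks for larger misalignment come from the description above, while the specific band edges and click rates are invented for illustration.

```python
def feedback_for(error_deg: float):
    """Map signed heading error (degrees; negative = target to the
    left) to the prototype's audio feedback, as reconstructed from
    the text.

    Within +/- 10 degrees an affirmative sound plays (per the paper);
    otherwise one of three click cues per side fires, faster for
    larger misalignment. Band edges and rates are illustrative
    guesses, not values reported in the paper.
    """
    side = "left" if error_deg < 0 else "right"
    err = abs(error_deg)
    if err <= 10.0:
        return ("affirmative", None, None)
    if err <= 45.0:
        clicks_per_s = 2.0   # mild misalignment: slow clicks
    elif err <= 90.0:
        clicks_per_s = 4.0   # moderate misalignment: faster clicks
    else:
        clicks_per_s = 8.0   # severe misalignment: fastest clicks
    # Cues play as bursts of three clicks separated by a longer
    # silence, keeping the feedback regular without being constant.
    return ("clicks", side, clicks_per_s)
```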

After implementing the prototype, we conducted user testing. We asked our test users to wear the backpack and try to follow the audio cues. The specific task was to move through a room and find the direction the virtual sound source seemed to be coming from. In general, users were successful in identifying the reference direction.

Future Work
The next immediate step is to extend the initial prototype with a GPS device for live field tests. Once the initial prototype is complete, we will begin development of a mobile application. We also need to experiment with ways the user could wear the device so that the digital compass can operate throughout the session. Further user testing is needed to evaluate which visual interface best balances simplicity and usability. Once we have completed one version of our application, we will consider other ways to use sound and navigation to connect people. Some of our initial thoughts are: using sound icons as a means of identifying individuals or groups of people; allowing users walking around to hear what might be happening in a room or building through public microphones; and using sound to improve live-action games through augmented reality.

Acknowledgements
We would like to thank our faculty advisor, Prof. Kimiko Ryokai, for her help and guidance, as well as all the users who participated in prototype testing.

References
[1] Darken, R.P., Sibert, J. A Toolset for Navigation in Virtual Environments. Proc. UIST '93, ACM Press (1993), 157-165.
[2] Begault, D. 3-D Sound for Virtual Reality and Multimedia. NASA Technical Information Service, 2000.
[3] Gröhn, M., Lokki, T., Takala, T. Localizing Sound Sources in a CAVE-Like Virtual Environment with Loudspeaker Array Reproduction. Presence, Vol. 16, No. 2 (2007), 157-171.
[4] Ross, D.A., Blasch, B.B. Wearable Interfaces for Orientation and Wayfinding. Proc. ASSETS '00, ACM Press (2000), 193-200.
[5] Marston, J.R., et al. Nonvisual Route Following with Guidance from a Simple Haptic or Auditory Display. Journal of Visual Impairment & Blindness (2007), 203-211.
[6] Walker, B.N., Lindsay, J. Navigation Performance with a Virtual Auditory Display: Effects of Beacon Sound, Capture Radius, and Practice. Human Factors, Vol. 48, No. 2 (2006), 265-278.