
Cognitive Evaluation of Haptic and Audio Feedback in Short Range Navigation Tasks

Manuel Martinez 1, Angela Constantinescu 2, Boris Schauerte 1, Daniel Koester 1, and Rainer Stiefelhagen 1,2

1 Karlsruhe Institute of Technology, Institute for Anthropomatics and Robotics
2 Karlsruhe Institute of Technology, Study Centre for the Visually Impaired Students

Abstract. Assistive systems for short range navigation can convey obstacle information to blind users through either audio or haptic interfaces. We compared state-of-the-art interfaces of both modalities on the same outdoor obstacle course. Because objective metrics such as speed and success rate proved uninformative for this task, we evaluated the perceived cognitive workload of each modality using the NASA-TLX protocol. Blind participants strongly favored the haptic interface, whereas blindfolded participants slightly favored the audio interface and found the haptic one confusing. These results suggest that blindfolded users are not reliable proxies for visually impaired users when evaluating short range navigation systems.

1 Introduction

Visually impaired people face a wide range of challenges when navigating outdoors without the assistance of a sighted person. Current standards suggest walking only along predefined and previously known routes, while using a white cane for short range obstacle avoidance. Although guide dogs are a popular alternative, their availability is limited and their cost is very high. Recently, the widespread use of mobile phones with GPS has revolutionized the field, allowing blind people to reach new places and thus providing an increased feeling of freedom. But the white cane is still necessary, as there are problems that GPS-based systems cannot solve, such as detecting obstacles in real time or finding crosswalks. Several systems have attempted to replace or enhance the white cane (e.g. [1-4]). However, the perception challenges of those systems have overshadowed the task of conveying real-time obstacle information to the visually impaired user. Most GPS systems use speech to convey directions to the user, but this approach is not suitable for real-time tasks, so more fundamental audio and haptic interfaces are required.

In this work, we performed an in-depth analysis of the task of conveying short range navigation information to blind users. In particular, we compared haptic against audio interfaces in a comparable navigation scenario. Performing a fair comparison between haptic and audio modalities is complex due to the large variety of possible interfaces. In our lab we have developed several interfaces for a wide variety of tasks, so we chose our state-of-the-art audio and haptic interfaces as representatives of their respective modalities. Our audio-based system used open headphones to play 20 ms beeps at 800 Hz, while the haptic system used a versatile Bluetooth module to drive two linear vibration motors with 25 ms pulses at 190 Hz.

In our preliminary tests, we found that two of the most common objective metrics used to evaluate the performance of user interfaces, speed and success rate, were of little use for this task. In particular, in object finding tasks users spent more time on a task when they enjoyed the interface, so a quick success does not necessarily imply a better interface. Success rate was also uninformative: when users were focused enough, they were able to achieve their goal using almost any kind of feedback. Therefore we based our evaluation on the well-known NASA-TLX protocol, which rates the perceived workload of each modality. The NASA-TLX (Task Load Index) protocol is a subjective test developed by the Human Performance Group at NASA. It measures the perceived workload of a task over six categories: Mental Demands, Physical Demands, Temporal Demands, Own Performance, Effort and Frustration, and also weights the relevance of each category.

Our results suggest that blind participants strongly favor haptics over audio. White cane users were accustomed to perceiving short range navigation information through the haptic channel, and therefore found the haptic system intuitive to use. On the other hand, they use the auditory channel for other tasks (e.g. orientation, communication, alerts), and its use for navigation was linked to an increase in stress. Blindfolded people, however, reacted differently: people used to navigating with their eyes slightly favored the audio interface and found the haptic interface confusing. These results are important because, in studies researching interfaces for blind persons, blindfolded people are often used as proxies for visually impaired people in order to reach a significant number of participants. Our results stress the importance of keeping visually impaired users in the loop while researching user interfaces for the blind, instead of relying only on blindfolded people. Furthermore, we suggest that cognitive evaluation of navigation systems can reveal important cues that are not evident under objective measures such as speed or success rate.

2 Related Work

Short range navigation for the visually impaired has received a lot of attention both in indoor and outdoor scenarios [1-8].

Fig. 1: The 20 m x 5 m obstacle course used in our experiments. Eight chairs were used as obstacles. Each chair was labeled with orange paper, as we simulated the obstacle detection part of the experiment using a wearable, camera-based color recognition system. Three obstacle configurations were used: one for the preliminary exploration, one for the audio round, and one for the haptic round.

The problem is divided into two major components: perceiving the spatial information and conveying directions to the user. The perception problem was traditionally approached using ultrasonic distance sensors [1, 3, 4]; however, several computer vision systems are currently being researched [2, 5, 7], as they have the potential to provide better guidance from a richer representation of the environment. In contrast, a wide variety of methodologies are used to convey directions to the user. Sonification systems are common in spatial localization tasks (e.g. [9]) and have been used for short range navigation [3]. Haptic actuators are very popular: the UltraCane [1] places them on the handle of a smart cane, while Cardin et al. [4] place them on a vest. Belts, gloves and bracelets are other common placements, but there is no clear winner. Some user interfaces have been designed specifically for navigation tasks: the GuideCane [3] pulls the user's hand toward the right direction, while the tactile map presented by Velazquez et al. [2] conveys directions using an 8x8 binary dot matrix.

3 Experimental Setup

3.1 Test Methodology

The evaluation was performed outdoors, albeit in a quiet neighborhood. We set up an obstacle course 20 meters long and 5 meters wide. The obstacles were eight chairs, each labeled in orange (Fig. 1).

The test started with a briefing in which users were allowed to familiarize themselves with the maze and the test (blind users used their white cane to explore the maze). Then the audio system was introduced, the obstacles were rearranged, and the users traveled several times through the maze until they were familiar with the system. At that point, the experience was evaluated using the NASA-TLX protocol, and the users' opinions were also recorded. The haptic test followed the same procedure. One hour was required per person, in order to allow enough time to become familiar with the interfaces and adjust them to their needs. None of the users had previously evaluated any of our systems.

3.2 Object Detection System

To localize the obstacles, we labeled them with orange paper and detected them using our color recognition software running on camera glasses. This color recognition software is an evolved subset of our object localization system [9]. The original system beeped every time a frame was processed (i.e. between 1 and 10 times per second, depending on the mode of operation). Users were satisfied with the system, but reported that feedback lag made the system difficult to use in dynamic tasks. We therefore upgraded the system with a simpler and faster image processing algorithm. It runs at 30 Hz and achieves a very small delay between the video input and the derived audio output (between 5 ms and 20 ms). The increased speed slightly reduced precision and slightly increased the false detection rate, but our users preferred the faster feedback. However, during the test there were a few occasions where the color recognition software could not be used, and a Wizard-of-Oz approach was used instead: the test operator manually signaled the obstacles via Bluetooth from an Android device.
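As an illustration, a minimal sketch of such an orange-marker detector (assuming OpenCV; the HSV thresholds, minimum blob area and camera index are hypothetical, not taken from the paper) could look as follows:

    # Minimal sketch of a fast orange-marker detector (not the authors' code).
    import cv2

    ORANGE_LO = (5, 120, 120)    # hypothetical lower HSV bound for orange
    ORANGE_HI = (20, 255, 255)   # hypothetical upper HSV bound for orange

    def detect_orange_blobs(frame, min_area=200):
        """Return (cx, cy, area) for each orange blob above min_area."""
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, ORANGE_LO, ORANGE_HI)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        blobs = []
        for c in contours:
            area = cv2.contourArea(c)
            if area >= min_area:
                m = cv2.moments(c)  # m["m00"] > 0 since area >= min_area
                blobs.append((m["m10"] / m["m00"], m["m01"] / m["m00"], area))
        return blobs

    cap = cv2.VideoCapture(0)  # camera glasses exposed as a webcam
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        for cx, cy, area in detect_orange_blobs(frame):
            print(f"blob at x={cx:.0f}, y={cy:.0f}, area={area:.0f}")

Keeping the per-frame work to a color threshold plus contour extraction is what makes a 30 Hz rate with millisecond-level latency plausible on wearable hardware.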

3.3 Audio Feedback System

Our audio feedback system was originally developed in 2011 together with our object localization software [9]. The original system beeped every time a frame was processed, mapping the horizontal coordinate to the stereo panorama (left-right) and the vertical coordinate to pitch. The system was upgraded to reduce the lag between the capture of the image and the signaling of the information. This fast feedback allowed us to drop the vertical axis mapping, as users found that performing a beam scan with the camera was faster than processing the frequency information. Unexpectedly, we found that most test users were able to identify up to four items sonified simultaneously if they were focused enough. To further reduce latency, we use a very lightweight interface based on OpenAL [12].

Each time a frame is processed, the information about the color blobs, their size, and a confidence value between 0 and 1 is mapped into audio. All detected blobs are sonified simultaneously. The frequency was fixed to 800 Hz and the pulse duration to 20 ms. The volume is mapped to the product of the detection confidence and the blob area (bigger and clearer color areas sound louder). Although the camera has a field of view of approximately 60 degrees, the output sound is mapped between -90 and 90 degrees (i.e. the angle is magnified 3x).

This interface achieved very positive results from our blind colleagues, who are accustomed to testing our systems. In the navigation scenario, however, most users found the simultaneous sonification of multiple obstacles confusing, so the camera was worn pointing to the ground, with the result that usually only one obstacle was inside the field of view.
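A minimal sketch of the blob-to-audio mapping described above (the original implementation used OpenAL [12]; here the stereo buffer is synthesized directly with NumPy, and the area normalization is an assumption) could look as follows:

    # Sketch of the sonification mapping; clipping of the summed mix is ignored.
    import numpy as np

    RATE = 44100   # samples per second
    FREQ = 800.0   # beep frequency in Hz, as in the paper
    PULSE = 0.020  # 20 ms pulse duration, as in the paper
    FOV = 60.0     # approximate camera field of view in degrees

    def beep_for_blob(x_norm, area_norm, confidence):
        """Stereo 20 ms beep for one blob.

        x_norm: horizontal position in [-1, 1] within the image.
        area_norm: blob area normalized to [0, 1] (normalization is a guess).
        confidence: detector confidence in [0, 1].
        """
        # 3x angle magnification: the ~60 degree FOV spans +-90 degrees.
        angle = np.clip(x_norm * (FOV / 2) * 3.0, -90.0, 90.0)
        # Volume is the product of confidence and area (bigger, clearer = louder).
        volume = confidence * area_norm
        t = np.arange(int(RATE * PULSE)) / RATE
        tone = volume * np.sin(2 * np.pi * FREQ * t)
        # Constant-power pan: angle -90 is full left, +90 is full right.
        pan = np.radians(angle + 90.0) / 2.0
        return np.stack([np.cos(pan) * tone, np.sin(pan) * tone], axis=1)

    # All detected blobs are sonified simultaneously by summing their buffers.
    mix = beep_for_blob(-0.5, 0.3, 0.9) + beep_for_blob(0.2, 0.6, 0.7)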

Fig. 2: Left: our haptic module with a battery, Arduino processor, Bluetooth communication, charger, motor driver, and two lentil-sized linear vibration motors. Total weight: 16 g. Center and right: the haptic module installed on a white cane with the motors attached to the handle. The placement of the motors was customizable for each user.

3.4 Haptic Feedback System

To develop and evaluate haptic systems, we designed a tiny module capable of driving a wide array of different vibration motors (see Fig. 2). Each module is managed by an Arduino processor and includes a battery, a Bluetooth communication module, and a motor driver capable of driving two motors. It weighs 16 g. Bluetooth connectivity allows us to control the vibration modality from either a laptop or an Android phone, and interfaces easily with our computer vision systems. Each module can control up to two vibration motors independently, and there is no limit on the number of modules that can be controlled simultaneously. We have been using this platform since 2012 to evaluate a variety of haptic configurations, with motors placed on gloves, belts and white cane handles.

Although we tested several different vibration configurations, for this evaluation we fixed the frequency to 190 Hz and the pulse duration to 25 ms. The placement configuration we evaluated in this paper was the most promising one: two motors on the handle of a white cane. We used linear haptic motors, which provide finer tuning and faster response times than conventional eccentric-weight motors. Vibration bursts were used to signal obstacles, with one motor signaling left and the other signaling right; simultaneous vibration of both motors signaled front. Only one obstacle was signaled at a time. The haptic system required more customization than the audio system: some users were not able to distinguish between left and right, and in those cases both motors were activated only when the path was blocked by an obstacle.
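A host-side sketch of this burst signaling (the module's actual Bluetooth protocol is not described in the paper; the one-byte commands, serial port name and the +-10 degree dead band below are hypothetical) could look as follows:

    # Sketch of host-side obstacle signaling over the Bluetooth serial link.
    import serial  # pyserial; the RFCOMM port name is an assumption

    LEFT, RIGHT, BOTH = b"L", b"R", b"B"  # hypothetical burst commands

    def signal_obstacle(port, angle_deg):
        """Send one vibration burst for the single signaled obstacle."""
        if angle_deg < -10:      # obstacle clearly on the left
            cmd = LEFT
        elif angle_deg > 10:     # obstacle clearly on the right
            cmd = RIGHT
        else:                    # both motors together signal "front"
            cmd = BOTH
        port.write(cmd)

    with serial.Serial("/dev/rfcomm0", 115200, timeout=1) as bt:
        signal_obstacle(bt, -25.0)  # burst on the left motor

For the users who could not distinguish left from right, the same loop would simply send BOTH whenever the path ahead was blocked.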

4 Evaluation

4.1 NASA Task Load Index

The NASA-TLX protocol [10] was developed in 1986 by the Human Performance Group at NASA to evaluate the sources of workload of a particular task. It has become a widely accepted tool for evaluating cognitive aspects of a task in a multidimensional way. The six dimensions measured are: Mental Demands, Physical Demands, Temporal Demands, Own Performance, Effort and Frustration. Three of them relate to the demands imposed on the subject (Mental, Physical and Temporal Demands), while the other three evaluate the interaction of the subject with the task (Effort, Frustration and Performance).

The test is meant to be straightforward to apply and consists of two steps. First, the 15 possible pairwise comparisons between the six dimensions are presented, and the subject selects the member of each pair that contributed more to the workload of the task. The number of times a dimension is selected establishes its relevance (0-5). The second step is to obtain a numerical rating (between 0 and 100) for each dimension, reflecting the magnitude of that factor in the given task. The workload value for each category is the product of its rating and its relevance. The maximum value for a single category is 500 (rating 100 x relevance 5), but the maximum overall workload is 1500, as the relevance values sum to 15. Therefore, dividing the sum of all workload values by 15 yields the percentage of total workload. In our case we administered the paper and pencil version [11].

4.2 Results

Due to the extensive test procedure, only six persons with different levels of visual impairment were evaluated. Half of them were white cane users, while the other half took the test blindfolded. In the blindfolded group, the overall cognitive workload was 32.6% for the audio system against 56.6% for the haptic system. In the blind group, however, the cognitive workload of the audio system was 74.7% against a mere 3.3% for the haptic system. For the complete results see Fig. 3.

Fig. 3: NASA Task Load Index: sources of workload for our short range navigation experiment. The workload of the audio system was 74.7% for white cane users, of which 23.3% came from frustration and only 6.6% from their own performance. The workload of the haptic system was 3.3%. For blindfolded users the results were inverted: the workload of the audio system was 32.6% with no frustration, while the workload of the haptic system was 56%, of which 26% came from frustration.

In general, physical demand was the lightest of the six categories evaluated by the NASA-TLX test, followed closely by the performance category. This is because both systems were able to adequately guide the users through the obstacle course and were rated as useful for the task. In the open questionnaire taken after the test, blindfolded users commented that the haptic system felt more limited than the audio system, as it was more difficult to discern between left and right signals. White cane users, on the other hand, were not comfortable using audio feedback, since the auditory channel is usually needed for safety purposes (such as detecting cars and other people, and generally making sense of the environment).
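As a worked example of the scoring procedure from Section 4.1 (the weights and ratings below are invented for illustration and are not measured data):

    # Worked example of the NASA-TLX arithmetic described in Section 4.1.
    DIMENSIONS = ["Mental", "Physical", "Temporal",
                  "Performance", "Effort", "Frustration"]

    weights = {"Mental": 4, "Physical": 1, "Temporal": 2,
               "Performance": 2, "Effort": 3, "Frustration": 3}   # relevance 0-5
    ratings = {"Mental": 60, "Physical": 10, "Temporal": 30,
               "Performance": 20, "Effort": 55, "Frustration": 70}  # 0-100

    assert sum(weights.values()) == 15  # one point per pairwise comparison

    # Per-dimension workload is rating x relevance (at most 100 x 5 = 500).
    workload = {d: ratings[d] * weights[d] for d in DIMENSIONS}

    # Overall workload: sum over dimensions divided by 15, as a percentage.
    overall = sum(workload.values()) / 15
    print(f"overall workload: {overall:.1f}%")  # 48.3% for these numbers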

5 Conclusions

We have evaluated two state-of-the-art interfaces for blind users, one based on audio and the other on haptic feedback, for the task of obstacle avoidance in short range navigation. Although both systems were rated as satisfactory by the users, the cognitive load of the audio system was rated by the blind users more than 22 times higher than the load of the haptic system. This is because haptics are very intuitive for white cane users, while the auditory channel is heavily used for other important tasks. This bias was not present when both systems were evaluated by blindfolded users. These results suggest that the common practice of using blindfolded test users to evaluate user interfaces for the blind should be avoided in short range navigation tasks.

Acknowledgments

This research has been partially funded by Google through a Google Faculty Research Award.