Comparison of Haptic and Non-Speech Audio Feedback


Cagatay Goncu and Kim Marriott
Monash University, Melbourne, Australia
cagatay.goncu@monash.edu, kim.marriott@monash.edu

Abstract. We report a usability study which investigated the use of a haptic versus a non-speech audio interface for identifying geometric shapes. The study used simple graphics containing one to three geometric shapes (line, triangle, rectangle and circle). We presented the graphics to 11 participants in two different modes, audio and haptic, in a counterbalanced design. The participants were asked to identify the number and the types of the shapes. Error rates with audio and haptic feedback were very similar. The time to answer was generally faster with audio feedback for the overview task, but generally faster with haptic feedback for the detailed view task. These results need to be treated with some care because, owing to the small number of participants, they were not statistically significant.

Keywords: graphics, usability, accessibility, haptic, audio, multi-touch

1 Introduction

There have been many assistive technologies that use different human sensory systems. Among them, haptic and aural systems are the most widely used for presenting graphical information because of their characteristics. The haptic subsystem is specialised to process tactual and kinesthetic stimuli. It has sensors that receive stimuli about touch, temperature and motion [1], so it can provide information about the shape, size, texture and position of an object [2]. The aural subsystem has sensors that receive aural information such as speech and non-speech audio [1]. It is more effective than the haptic subsystem at acquiring sequential stimuli [2]. The aural subsystem also provides binaural hearing, in which time and level differences in the sound reaching the two ears, arising from the natural spacing of the head and the ears, enable a person to locate the source of a stimulus [5].
As a result of these characteristics, both sensory systems have been used in many assistive technologies, such as [3, 4], which use haptic feedback, and [6, 7], which use aural feedback. Both approaches have been successfully demonstrated on some kinds of graphics. However, it was not clear which one we should use in the GraVVITAS system [8] that we have been developing. We therefore wanted to compare the two approaches.
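As a rough quantitative illustration of the binaural cue mentioned above, the classic Woodworth approximation estimates the interaural time difference for a distant source from simple head geometry. This is background material rather than part of the study; the head radius and function name below are illustrative defaults.

```python
import math

def woodworth_itd(azimuth_deg, head_radius_m=0.0875, speed_of_sound=343.0):
    """Approximate interaural time difference (in seconds) for a distant
    source at the given azimuth (0 = straight ahead, 90 = directly to
    one side), using the Woodworth model ITD = (a/c) * (theta + sin theta)."""
    theta = math.radians(azimuth_deg)
    return (head_radius_m / speed_of_sound) * (theta + math.sin(theta))

# A source directly to the side arrives roughly 0.66 ms earlier
# at the nearer ear; a frontal source produces no time difference.
print(round(woodworth_itd(90.0) * 1000, 2))
```

Differences of this size, together with level differences, are what let listeners lateralise the stereo tones used in the audio mode described below.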

This paper reports a usability study which investigates the use of haptic versus non-speech audio interface modes to identify different geometric shapes. We report the preference, time and number of errors for the participants in each of the modes, as well as the strategies that they used.

2 Comparison of Haptic and Non-Speech Audio Feedback

In our first trials we experimented with the number of fingers to which we attached the vibrating motors. We tried (i) only the right index finger, (ii) the left and right index fingers, and (iii) the left and right index and middle fingers. Our experience, corroborated by feedback from blind participants in pilot studies, was that it was beneficial to use fingers on both hands, but that it was difficult to distinguish between vibration of the index and middle finger on the same hand. We first tried attaching the vibrating devices to the underside of the finger and then to the top, but this made little difference. Our experience is that, with sufficient practice, one can distinguish between vibration on all four fingers, but this takes many hours of use. We therefore decided to use the tool with two fingers, the left and right index fingers, as we would not be able to give the participants the necessary time to learn to use four fingers before conducting the user study.

Given that we decided to provide haptic feedback only for the left and right index fingers, a natural question to investigate was whether stereo audio feedback might be better. To determine this we implemented an audio feedback mode as an alternative to haptic feedback. This mode was restricted to the use of one finger, or of two fingers on different hands. In audio mode, if the user touches an object on the screen they hear a sound from the headphones. If they use one finger the sound comes from both headphones, while if they use two fingers they hear a sound on the left/right headphone when their left/right finger is on an element.
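The channel routing just described can be sketched as follows. This is a minimal sketch under our own naming, not the authors' implementation; the boolean inputs and gain pairs are hypothetical.

```python
def channel_gains(left_on_element, right_on_element, two_fingers):
    """Return (left_gain, right_gain) for the looping tone, following the
    audio mode described above: with one finger any hit sounds in both
    headphone channels; with two fingers each hand drives its own channel."""
    if not two_fingers:
        # One-finger mode: a hit by the single finger plays in both ears.
        hit = left_on_element or right_on_element
        return (1.0, 1.0) if hit else (0.0, 0.0)
    # Two-finger mode: the left/right finger drives the left/right headphone.
    return (1.0 if left_on_element else 0.0,
            1.0 if right_on_element else 0.0)

# Example: in two-finger mode, only the left index finger is on an element,
# so only the left channel sounds.
print(channel_gains(True, False, two_fingers=True))
```

Routing by hand rather than by on-screen position keeps the mapping simple: each ear reports on exactly one finger.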
The sounds associated with objects were short tones from different instruments played in a loop.

We conducted a usability study to investigate whether audio or haptic feedback was better for determining the geometric properties (specifically position and shape) of graphic elements. The study used simple graphics containing one to three geometric shapes (line, triangle, rectangle and circle). Each shape had a low-intensity interior colour and a thick black boundary, so the intensity of the haptic or audio feedback was greater when the finger was on the boundary. We used 5 training graphics in total: 4 of them each contained a different type of shape (line, triangle, rectangle and circle), see Figure 1, and the last contained all of the shapes. For each shape we used a different audio tone, and we varied the assignment of audio files so that no type of shape was always associated with the same audio. We used 6 graphics for the experiment, whose complexity varied in the number of shapes: easy (1 shape), medium (2 shapes), and hard (3 shapes); see Figures 2 to 4.
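The intensity scheme above can be illustrated with a minimal hit-testing sketch for one shape type. The function name, boundary width and intensity values are hypothetical; they only suggest how boundary, interior and off-shape touches could map to different feedback levels.

```python
import math

def circle_feedback(px, py, cx, cy, r, boundary_width=10.0):
    """Classify a touch point against a circle of centre (cx, cy) and
    radius r, returning a label and a feedback intensity in [0, 1].
    The thick black boundary gives stronger feedback than the
    low-intensity interior, as in the study's graphics."""
    d = math.hypot(px - cx, py - cy)
    if abs(d - r) <= boundary_width / 2:
        return "boundary", 1.0   # strong vibration / loud tone on the edge
    if d < r:
        return "interior", 0.3   # weaker feedback inside the shape
    return "outside", 0.0        # no feedback off the shape

# Example: a finger moving outward across a circle of radius 50
# centred at the origin passes from interior to boundary to outside.
print(circle_feedback(0, 0, 0, 0, 50))    # ('interior', 0.3)
print(circle_feedback(50, 0, 0, 0, 50))   # ('boundary', 1.0)
print(circle_feedback(100, 0, 0, 0, 50))  # ('outside', 0.0)
```

The same classification would drive either modality: the label selects the tone or vibration pattern, and the intensity scales its strength.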

Fig. 1: Training graphics used in the comparison of audio and haptic feedback.

We presented the graphics to each participant in the two different modes, audio and haptic, in a counterbalanced design. For each mode the following two-step procedure was carried out. First, we presented the participant with one training graphic that contained all of the different shapes. In this step we told them what shapes were on the screen and helped them to trace the boundaries, suggesting techniques for doing so and then letting them explore the graphic by themselves. Second, the participant was shown three graphics, one at a time, and asked to explore each graphic and let us know when they were ready to answer the questions. They were then asked to answer two questions about the objects in the graphic:

1. How many objects are there in the graphic?
2. What kind of geometric shape is each object?

The times taken to explore the graphic and then answer each question were recorded, as well as the answers. After viewing and answering questions about the graphics presented in the audio and haptic interaction modes, the participants were asked which interaction they preferred and were invited to give comments and explain the features that influenced their preference.

Fig. 2: Simple graphics used in the comparison of audio and haptic feedback.

Fig. 3: Medium graphics used in the comparison of audio and haptic feedback.

Fig. 4: Hard graphics used in the comparison of audio and haptic feedback.

A caveat is that we slightly modified the presentation midway through the usability study. This was because the first three participants had difficulty identifying the geometric shapes: they found it difficult to determine the position and number of vertices of a shape. To overcome this, in subsequent experiments object vertices were given a different colour so that the audio and haptic feedback when touching a vertex differed from that for the boundary and the interior of the shape. This reduced the error count to almost zero for the subsequent participants. Another source of annoyance to the first three participants was a delay in response from the haptic feedback, due to latencies in the touch screen and the Arduino circuit board and to the inertia of the vibrating motor. It was at this point that we added a predictive component to the tool, which provided haptic feedback based on the expected position of the finger.

3 Data analysis and results

We recruited 11 participants for the study, 6 born blind and 5 late blind, aged between 17 and 63. All of them had previously read a tactile graphic. 3 of the participants could not complete the experiment because of hearing and sensing problems; the remaining 8 participants completed the usability study.

We found that 6 out of 8 participants preferred haptic feedback, and 2 of the 3 excluded participants also preferred haptic feedback. Error rates with audio and haptic feedback were very similar. The time to answer question 1 (the overview task) was generally faster with audio feedback, whereas question 2 (the detailed view task) was generally faster with haptic feedback. These results need to be treated with some care because, owing to the small number of participants, they were not statistically significant. In Table 1 and Figure 5 we give the preference, time and number of errors for each participant in each of the two modes. There were 14 errors over the 48 diagrams, which together contained 96 shapes.
However, as discussed earlier, the first 3 participants had difficulty identifying the geometric shapes. To overcome this we added differently coloured vertices to the shapes, so that the intensity of the audio and haptic feedback for a vertex differed from that for the boundary and the interior of the shape. This reduced the error count significantly for the remaining 5 participants.

Fig. 5: Experiment 1 time results, and first and third quartiles for the median time.

Table 1: Experiment 1 results, showing the preferences, times (in seconds) and error counts for the haptic versus audio interface comparison. Since each graphic has multiple shapes, the times given are the average times for one shape.

                          Audio                    Haptic
Participant  Preference   Q1          Q2           Q1          Q2
                          Err  Time   Err  Time    Err  Time   Err  Time
P1           Audio         0  10.00    3   98.50    0   10.00   1  121.33
P2           Haptic        0  10.00    2  286.83    0   10.00   2  191.00
P3           Haptic        0  41.00    2  450.17    0  102.67   3  256.00
P4           Audio         0  10.00    0   27.17    0   10.00   0   31.33
P5           Haptic        0  31.33    0  230.17    0  134.33   1   98.83
P6           Haptic        0  30.33    0  142.17    0  197.33   1  188.83
P7           Haptic        0  45.67    0  105.83    0   56.33   0   41.17
P8           Haptic        0  41.33    1   43.83    0   42.33   0   30.50
Median                     0  30.83    1  124.00    0   49.33   1  110.08

We observed that participants used two quite different strategies to identify shapes. The first strategy was to find the corners of a shape and then carefully trace the boundary of the object using one or two fingers. This was the strategy we had expected. The second strategy was to use a single finger to repeatedly perform a quick horizontal and/or vertical scan across the shape, moving the starting point of the finger slightly between scans in the direction orthogonal to the scan. Scanning like this gives rise to a different audio or haptic pattern for different shapes. For instance, when scanning a rectangle, the duration of a loud sound on an edge, a soft sound inside the shape, and another loud sound on the other edge are all equal as you move down the shape, whereas for a triangle the duration of the soft sound will either increase or decrease as you scan down the shape. Moreover, users could increase the speed of scanning so that they finished the whole process more quickly. This was harder with tracing, because the users had to adjust the direction of movement using the audio and haptic feedback. With the scan strategy it was important to use the same speed for each scan, as varying the speed could be confusing. We thought that this might be a problem, but surprisingly all the participants used this strategy without any difficulty. The scan strategy was quite effective, and those participants who used it were faster than those using the boundary tracing strategy.

4 Conclusion

As a result of this usability study we decided to provide haptic feedback (through the vibrating motors) rather than audio feedback to indicate when the user is touching a graphic element. Our study showed that this was quite effective, allowing users to determine the geometric properties of graphic elements (position and shape). The decision was based on user preferences, the slight performance advantage of haptic feedback in the detailed view task, haptic feedback being more readily generalised to more than two fingers, and the fact that it leaves audio feedback free for other purposes.

References

1. Coren, S., Ward, L., Enns, J.: Sensation and Perception (2004)
2. Hatwell, Y.: Images and non-visual spatial representations in the blind. John Libbey Eurotext (1993) 13-35
3. Bliss, J., Katcher, M., Rogers, C., Shepard, R.: Optical-to-tactile image conversion for the blind. IEEE Transactions on Man-Machine Systems 11(1) (March 1970) 58-65
4. McGookin, D., Brewster, S.: MultiVis: improving access to visualisations for visually impaired people. In: CHI '06 Extended Abstracts on Human Factors in Computing Systems, ACM (2006) 267-270
5. Gibson, J.: The Senses Considered as Perceptual Systems. Greenwood Pub Group (1966)
6. Kildal, J., Brewster, S.: Exploratory strategies and procedures to obtain non-visual overviews using TableVis. In: 6th Intl Conf. on Disability, Virtual Reality & Assoc. Tech. (2006)
7. Kennel, A.: Audiograf: a diagram-reader for the blind. In: Proceedings of the Second Annual ACM Conference on Assistive Technologies (1996) 51-56
8. Goncu, C., Marriott, K.: GraVVITAS: generic multi-touch presentation of accessible graphics. In: Human-Computer Interaction, INTERACT 2011 (2011) 30-48