Hoggan, E.E. and Brewster, S.A. (2006) Crossmodal icons for information display. In: Conference on Human Factors in Computing Systems, 22-27 April 2006, Montréal, Québec, Canada, pp. 857-862.

http://eprints.gla.ac.uk/3269/

Glasgow eprints Service http://eprints.gla.ac.uk

Crossmodal Icons for Information Display

Eve E. Hoggan and Stephen A. Brewster
Glasgow Interactive Systems Group, Department of Computing Science
University of Glasgow, Glasgow G12 8QQ, UK
{eve, stephen}@dcs.gla.ac.uk
www.tactons.org

Abstract
This paper describes a novel form of display using crossmodal output. A crossmodal icon is an abstract icon that can be instantiated in one of two equivalent forms (auditory or tactile). These can be used in interfaces as a means of non-visual output. This paper discusses how crossmodal icons can be constructed and the potential benefits they bring to mobile human-computer interfaces.

Keywords
Crossmodal interaction, non-visual interaction, tactile icons, audio icons.

ACM Classification Keywords
H5.2. User Interfaces: Auditory (non-speech) Feedback, Haptic I/O, Interaction Styles.

Copyright is held by the author/owner(s). CHI 2006, April 22-27, 2006, Montréal, Québec, Canada. ACM 1-59593-298-4/06/0004.

Introduction
Most interface designs used in wearable/mobile computers today draw from notions in desktop computing such as small pointers, graphical user interfaces, keyboards, and pen-based devices. If these devices are to become a natural part of our everyday attire, it may be necessary to move away from the mobility constraints imposed by such interface designs.

In this paper we consider the role of crossmodal interaction with mobile computers. Given the ever-decreasing size of mobile devices, their input and output capabilities are often restricted. Due to the lack of screen space, both the graphical user interface and the amount of information that can be presented are limited. This has resulted in displays with small text which is difficult to read, cramped graphics and little contextual information. Such output can place heavy demands on the user. These screens require the user's attention to be diverted from the rest of the physical world. There are many activities, such as walking, in which the user's eyes may be busy although they are otherwise able to attend to information from the mobile computer.

Moreover, being predominantly reliant on a single sense is unnatural because, in the real world, we receive information from several modalities, as when we both hear and see someone speaking. Humans use speech, gestures, and writing tools, either alone or in combination, to communicate with other humans every day. For example, in noisy conditions, combining audio speech signals with the visible evidence of articulation can improve our comprehension [6]. It is proposed that these crossmodal interactions can be used to influence the design of mobile device interaction. By offering multiple paths through which information may be transferred between the device and user, crossmodal interfaces have the potential to significantly augment the scope and flexibility of interaction.

Interaction through modalities other than vision is now becoming an option in mobile devices. For example, mobile phones, PDAs, and pagers all feature audio and vibrotactile output. However, the vibrations and audio alerts used in these devices usually carry limited amounts of information. So the time is right to start thinking about ways in which crossmodal use of these features may improve interaction by exploiting the potential of both audio and vibration as methods of informative feedback.

This paper will introduce the concept of the crossmodal icon: an abstract icon that can be instantiated in one of two equivalent forms (auditory or tactile) and used in mobile interfaces as a means of output. The paper will begin by providing some background on crossmodal and multimodal interaction; crossmodal icons will then be described, and finally their potential uses will be outlined.

Background and Previous Work
Much of the attention in tactile and audio research focuses on unimodal interaction. Earcons are a common type of non-speech auditory display, which Blattner defines as "non-verbal audio messages that are used in the computer/user interface to provide information to the user about some computer object, operation or interaction" [2]. Brewster et al. have conducted detailed investigations of Earcons [4], which have shown that they are an effective means of communicating information in sound. Brown et al. have investigated tactile icon design by developing Tactons [3]: structured vibrotactile messages which can be used to communicate information non-visually.

Despite the fact that research has shown both audio and tactile icons to be effective means of communication, the area of crossmodal auditory/tactile displays has been studied less. Van Erp and van Veen transformed a set of audio melodies to the tactile domain using a low-pass filter [8]. However, they established only two parameters for the tactile versions of the melodies (tempo and intrusiveness). More recently, Immersion Corp. created Vibetonz, which can provide cues when messaging or browsing on a mobile phone and includes controllable ringtones accompanied by vibration [7]. However, no experiments have been conducted to investigate how much information can be encoded in these cues.

The research discussed here will build on this work by developing crossmodal audio/tactile icons. These may be advantageous to users because different modalities may be more or less appropriate depending on the user and their environment. For example, when a mobile phone user is travelling in a vehicle with the phone placed on the seat beside them, audio cues would be more appropriate because tactile cues often go unnoticed unless the device is in contact with the user's skin. However, once the user has entered a meeting, audio is no longer the most appropriate modality as it can be intrusive and may disrupt others.

Crossmodal Icons
Crossmodal icons are abstract icons which can be automatically instantiated as either an Earcon or a Tacton (figure 1), such that the resultant Earcons or Tactons are intuitively equivalent and can be compared as such.

figure 1: the relationship between crossmodal icons and Earcons/Tactons.

Crossmodal icons enable the same information to be presented interchangeably via different modalities. To develop a set of Earcons/Tactons as crossmodal icons, the information represented must be able to be encoded in both modalities. For example, to construct a cue representing a message as a crossmodal icon, an equivalent Earcon and Tacton must be created. For instance, an Earcon representing a message could use a rhythm with an increase in volume over time, while the equivalent Tacton could use the same rhythm with an increase in amplitude over time [3] (figure 2). This would mean, for example, that users could move from an audio to a tactile presentation of the same message.

figure 2: depiction of output from an Earcon and Tacton using increasing intensity as a parameter.
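To make this equivalence concrete, the sketch below (ours, not code from the paper) models a crossmodal icon as a single modality-independent description from which either form is generated. All names (CrossmodalIcon, to_earcon, to_tacton) and the event representations are illustrative assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class CrossmodalIcon:
    """Modality-independent icon description (parameter names assumed)."""
    rhythm: list = field(default_factory=list)  # (duration_s, is_on) pairs
    intensity: float = 0.5                      # perceived strength, 0..1

def to_earcon(icon):
    """Instantiate as audio: MIDI-style note on/off events at a velocity."""
    velocity, t, events = int(icon.intensity * 127), 0.0, []
    for duration, is_on in icon.rhythm:
        if is_on:
            events += [("note_on", round(t, 3), velocity),
                       ("note_off", round(t + duration, 3), velocity)]
        t += duration
    return events

def to_tacton(icon):
    """Instantiate as vibration: (start_s, duration_s, amplitude) pulses."""
    t, pulses = 0.0, []
    for duration, is_on in icon.rhythm:
        if is_on:
            pulses.append((round(t, 3), duration, icon.intensity))
        t += duration
    return pulses

# The same "new message" description rendered in either modality.
message = CrossmodalIcon(rhythm=[(0.2, True), (0.1, False), (0.4, True)],
                         intensity=0.8)
print(to_earcon(message))   # audio form (Earcon)
print(to_tacton(message))   # tactile form (Tacton)
```

The point of the design is that only the rendering functions differ; the rhythm and intensity values, and hence the information carried, are shared between the two forms.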

Designing Crossmodal Icons
Auditory and tactile displays were chosen because they are ideal candidates for crossmodal combination: both modalities share a temporal property. It has been suggested that the more properties shared between two modalities, the stronger will be the observer's unity assumption that information from different sensory channels can be attributed to the same distal event or object [1].

Unlike icons using the auditory and tactile modalities, visual icons are usually static and only use the temporal dimension in a limited manner (more often than not, changing between static states). Thus, auditory/tactile properties like rhythm and tempo cannot be directly transferred to the visual domain. In the future, visual icons could also be included as crossmodal icons after further investigation into the properties shared between the audio, tactile and visual modalities. It may be possible, for example, to use intensity, location or texture in all three modalities.

One important area of study is the set of parameters that can be used to create auditory/tactile cues where the same information can be easily mapped between the two modalities. This is difficult because many of the parameters available in the audio domain do not have direct mappings to the tactile domain and vice versa. For example, both timbre and pitch are recommended parameters for use when creating Earcons [4]. However, timbre and pitch cannot be directly transferred to the tactile domain because of the limited capabilities of current actuators. If there is no direct mapping between modalities, an abstract mapping must be developed where the cues may still be perceived as equivalent representations of information.

The current parameters under investigation have been derived from a survey of related work on the parameters available in the audio and tactile domains [4,5], which, in turn, have been derived from psychoacoustics and psychophysics. The encoding of information is similar to that of both Earcons and Tactons, where each of their shared parameters (e.g. rhythm, texture, intensity) is manipulated to develop equivalent cues. The basic parameters available for auditory/tactile crossmodal icons are:

Rhythm: as outlined by Brewster, short motifs can be used to represent objects or actions [4]. Such motifs can be both audio and tactile due to their shared temporal properties. Rhythm could, for instance, be used to encode information about the type of an alert [5]. For example, in a mobile phone, an appointment reminder could be represented by the rhythm in figure 3. The audio icon would play this rhythm from a MIDI file via a loudspeaker. The tactile icon would transmit this same rhythm via a series of pulses [3] through a vibrotactile device like the EAI C2 (figure 4).

figure 3: appointment reminder rhythm used in crossmodal Earcon and Tacton.

figure 4: Engineering Acoustics Inc (EAI) C2 vibrotactile actuator.

Roughness: this has been used as an effective parameter for Tactons [5]. Modulating the amplitude of a tactile pulse creates differing levels of roughness. Many versions of audio roughness are documented, such as audio amplitude modulation and dissonance [9]. It may be possible for users to perceive an auditory equivalent of tactile roughness. Then, for example, an important appointment could be represented as a rhythm with a rough texture. This would be achieved in a Tacton by using pulses made up of a modulated sine wave, while an Earcon could create the same effect by playing a rhythm enveloped with amplitude modulation (figure 5).

figure 5: rhythm from figure 3 using a 250 Hz sinusoid modulated by a 30 Hz sinusoid (Tacton), and the same rhythm played by a piano with an amplitude envelope (Earcon).

A preliminary experiment was conducted to determine which version of audio roughness is perceived as equivalent, and maps most effectively, to tactile roughness (amplitude modulation). Initial results show that participants preferred the use of differing timbres (e.g. smooth = flute, rough = tremolo strings) in audio to represent the different levels of roughness used in the tactile domain. However, the results also show that there is no significant difference in performance between timbre and audio amplitude modulation. Therefore, either amplitude modulation or timbre in audio can be perceived as equivalent to, and map effectively onto, tactile roughness, but timbre is the preferred choice.
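A rough pulse of the kind figure 5 describes can be sketched as classic amplitude modulation: a 250 Hz carrier enveloped by a 30 Hz sinusoid. The sketch below is ours; the sample rate, modulation depth, and the 440 Hz audio carrier are assumed values, not taken from the paper:

```python
import numpy as np

def modulated_pulse(duration_s=0.5, carrier_hz=250.0, mod_hz=30.0,
                    depth=1.0, rate=44100):
    """Amplitude-modulated sinusoid: rough for depth > 0, smooth at depth = 0."""
    t = np.arange(int(duration_s * rate)) / rate
    carrier = np.sin(2 * np.pi * carrier_hz * t)
    # Envelope oscillates at mod_hz; normalised so peak amplitude stays at 1.
    envelope = (1.0 + depth * np.sin(2 * np.pi * mod_hz * t)) / (1.0 + depth)
    return carrier * envelope

tacton_pulse = modulated_pulse()                 # 250 Hz/30 Hz pulse for the actuator
earcon_tone = modulated_pulse(carrier_hz=440.0)  # assumed audio analogue of the same envelope
```

Setting depth to 0 recovers a smooth pulse, so a single shared parameter moves the cue along the smooth-rough dimension in either modality.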

Intensity: this is another directly transferable parameter between Earcons and Tactons. High/low intensity could be achieved by increasing/decreasing the amplitude of the Earcon or Tacton. However, Earcon guidelines suggest that it should be used very carefully as it can cause annoyance and has few absolutely discriminable values [4]. Further studies are required to determine the usefulness of this parameter.

Spatial Location: unlike rhythm and intensity, spatial location cannot be directly transferred from the tactile domain to the audio domain. The spatial location of transducers placed on the body is concrete, while spatial location in audio environments is an abstract concept. Research is needed to investigate how to map from a tactile location on the body to an audio location in a soundscape. For instance, one possibility is to place the tactile transducers around the waist whilst using an audio display presented through headphones. Then, a navigational cue such as "turn right" could be presented via an Earcon by panning the audio to the right of the soundscape. Tactons could give the same cue by activating a transducer placed on the right-hand side of the waist (figure 6).

figure 6: turn right cue indicated by audio panned to the right (Earcon) and by activation of a tactor on the right-hand side of the waist (Tacton).
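One way to drive both displays from a single direction value is sketched below; the eight-tactor waist layout and the [-1, +1] stereo pan range are our assumptions, not a setup described in the paper:

```python
import math

def direction_to_cues(bearing_deg, n_tactors=8):
    """Map a bearing (0 = ahead, 90 = right) to an audio pan and a tactor index.

    Assumed setup: n_tactors spaced evenly round the waist (index 0 at the
    front, increasing clockwise) and stereo pan in [-1 (left), +1 (right)].
    """
    pan = math.sin(math.radians(bearing_deg))               # Earcon: pan in soundscape
    step = 360.0 / n_tactors
    tactor = round((bearing_deg % 360) / step) % n_tactors  # Tacton: nearest tactor
    return pan, tactor

print(direction_to_cues(90))  # "turn right" -> pan +1.0, tactor 2 (right hip)
```

Note that a simple left/right pan loses the front/back distinction that the waist-mounted tactors preserve, which is one instance of the mapping problem described above.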

Potential Uses
There are many potential uses for crossmodal displays. This research will concentrate on possible uses in mobile/wearable devices.

Context-Aware Mobile Devices: tactile perception may be reduced when the user is engaged in another activity, while audio cues may be blocked out by environmental noise. Also, given the personal nature of mobile computers, the interface must be adaptable, adjusting to the individual preferences and habits of its user. An interface incorporating crossmodal icons could present audio and/or tactile cues depending on the situation and adapt by permitting easy movement between modalities.

Widgets: as mentioned earlier, mobile devices often have cluttered displays due to the lack of screen space. Crossmodal features could be added to buttons, scrollbars, and menus so that information about those widgets can be presented non-visually. This would allow the widget size to be reduced and more information to be presented on the display.

Displays for Visually Impaired People: since mobile devices primarily provide output via the visual modality, visually impaired people have limited access to this information. Crossmodal icons could improve the interaction between visually impaired users and mobile devices by providing alternative channels through which this information may be displayed.

Conclusions and Future Work
This paper outlines some of the potential parameters that could be used to create crossmodal audio and tactile icons. This is just a first step, as this set of parameters is small, making it difficult to create a range of different messages. Experiments investigating different possible parameters and mappings are needed to help inform designers as to how information can be encoded in the displays. Once a set of crossmodal parameters has been established, it will be possible to include crossmodal icons in various mobile applications.

Acknowledgements
This work was funded by EPSRC grant GR/S53244.

References
[1] Adelstein, B.D., Begault, D.R., Anderson, M.R., and Wenzel, E.M. Sensitivity to haptic-audio asynchrony. Proc. 5th International Conference on Multimodal Interfaces, ACM Press (2003), 73-76.
[2] Blattner, M.M., Sumikawa, D.A., and Greenberg, R.M. Earcons and icons: their structure and common design principles. Human-Computer Interaction 4(1) (1989), 11-44.
[3] Brewster, S.A. and Brown, L.M. Tactons: structured tactile messages for non-visual information display. Proc. Australasian User Interface Conference, Australian Computer Society (2004), 15-23.
[4] Brewster, S.A., Wright, P.C. and Edwards, A.D.N. An evaluation of earcons for use in auditory human-computer interfaces. Proc. InterCHI'93, ACM Press (1993), 222-227.
[5] Brown, L.M., Brewster, S.A., and Purchase, H.C. A first investigation into the effectiveness of Tactons. Proc. World Haptics 2005, IEEE Press (2005), 167-176.
[6] Calvert, G.A., Brammer, M.J., and Iversen, S.D. Crossmodal identification. Trends in Cognitive Sciences 2 (1998), 247-253.
[7] Immersion: http://www.immersion.com/mobility/docs/vibetonz_mobile_player_0305_v1.pdf
[8] van Veen, H.A.H.C. and van Erp, J.B.F. Tactile information presentation in the cockpit. Haptic Human-Computer Interaction, Lecture Notes in Computer Science Vol. 2058, Springer Verlag (2001), 174-181.
[9] Wendahl, R.W. Some parameters of auditory roughness. Folia Phoniatrica 18 (1966), 26-32.