Haptic Cueing of a Visual Change-Detection Task: Implications for Multimodal Interfaces


In Usability Evaluation and Interface Design: Cognitive Engineering, Intelligent Agents and Virtual Reality (Vol. 1 of the Proceedings of the 9th International Conference on Human-Computer Interaction), M. J. Smith, G. Salvendy, D. Harris, and R. J. Koubek (Eds.), Mahwah, NJ: Lawrence Erlbaum Associates, 2001, pp. 678-682.

Haptic Cueing of a Visual Change-Detection Task: Implications for Multimodal Interfaces

Hong Z. Tan, Robert Gray, J. Jay Young, and Piti Irawan

Haptic Interface Research Laboratory, Purdue University, 1285 Electrical Engineering Building, West Lafayette, IN 47907
Nissan Cambridge Basic Research, 4 Cambridge Center, Cambridge, MA 02142

ABSTRACT

This study is part of an ongoing program designed to investigate the integration of visual and haptic information in the context of multimodal interfaces. With the current experiments, we study whether haptic cues can be used to redirect spatial attention in a visual task where an observer is asked to detect a change between two scenes. Subjects were asked to look at visual scenes consisting of horizontal and vertical rectangular elements of equal size. Their task was to detect an orientation change in one of the elements. Prior to this visual task, the subject was tapped on the back at one of four locations by a vibrotactile stimulator. Reaction time to detect a visual change decreased significantly when the location of the tactor coincided with the quadrant of the visual scene where the changing element occurred, and increased when the location of the tactile stimulation did not coincide with the visual quadrant where the change occurred. These results have implications for designers of multimodal interfaces in which a user can benefit from haptic attentional cues to detect and process information in a small area of a large and complex visual display.

1. INTRODUCTION

The growing trend in interface research is toward multimodal human-computer interfaces. This is motivated by the facts that humans naturally employ multiple sensory channels for communication, and that multimodal interfaces have been demonstrated to be effective (Oviatt, 1999). Cognitive research has shown that multimodal communication increases the amount of transmitted information (Miller, 1956). It is well known that a signal with a single varying attribute can transmit at most 2-3 bits to a human observer; for example, we can only identify about 5-7 loudness levels of a fixed-frequency pure tone (log2 5 ≈ 2.3 to log2 7 ≈ 2.8 bits). Greater information transmission can be achieved by employing signals with multiple attributes; for example, one can identify hundreds of faces at a glance of a person or a photograph, because many facial features contribute to the appearance of a face. This increase in transmitted information can be achieved whether the multiple modalities convey different information or encode the same information redundantly (Miller, 1956). Multimodal interfaces therefore facilitate more natural and efficient human-computer interactions. One challenge in multimodal interface research is the lack of multimodal interface systems. Robust systems for applications such as speech recognition or gesture interpretation require long-term research and development efforts from multidisciplinary teams of investigators. True multimodal interactions cannot take place until problems in each of these application domains are solved.
Compared with visual and auditory interfaces, haptic interface research is a less developed yet fast-growing and promising area. For the past several years, we have been developing a tactor array for the back of a user, and studying its effectiveness in conveying directional information for applications such as haptic navigation guidance for drivers and blind travelers (Ertan, Lee, Willets, Tan, & Pentland, 1998; Tan, Lim, & Traylor, 2000; Tan, Lu, & Pentland, 1997; Tan & Pentland, 1997, 2001). Recently, we studied the integration of visual and tactile information about moving objects (Gray & Tan, 2000). In one experiment, we found that tactile pulses simulating motion along the forearm facilitated the speed and accuracy with which subjects discriminated visual targets on the same forearm. In another experiment, we concluded that an approaching visual target's time to contact with the forearm influenced subjects' ability to perform tactile discrimination on that forearm. These results demonstrate dynamic links in the spatial mapping between vision and touch. In the current study, we explore this issue further with a paradigm that examines how haptic cueing might affect an observer's visual spatial attention.

The long-term objective of our research is to investigate the integration of visual and haptic information in the context of cross-modal priming. In the current study, we investigate whether haptic cues (taps on the back) can be used to redirect spatial attention in a visual task where an observer is asked to detect a change between two scenes. Recent research has shown that attention is required to perceive even large changes in a visual scene. This phenomenon, termed change blindness, occurs under both laboratory (Rensink, O'Regan, & Clark, 1997) and real-world (Simons & Levin, 1998) conditions. The proposed explanation for change blindness is that we do not form a complete, detailed representation of our surroundings; such a representation exists only for the small part of the visual field that we are attending to. In the typical experimental setup for studying change blindness, termed the flicker paradigm (Rensink, 2000), two scenes are alternately displayed with a blank inserted between them (to mask motion cues). An observer is asked to respond as soon as a difference between the two scenes is detected. Using scenes consisting of photographs, it has been found that reaction time in such tasks depends on the degree to which the changing element is of interest (i.e., captures the viewer's attention). If attention is the key factor affecting reaction time, then any means of manipulating an observer's attention should affect the reaction time associated with detecting scene changes. Our experiments were therefore designed to investigate whether such effects can be elicited by drawing an observer's attention to a spatial location via haptic stimulation.

2. METHODOLOGY

2.1 Stimulus

The visual stimuli used in these experiments were based primarily on the flicker paradigm used for the study of change blindness (Rensink, 2000). The visual scenes consisted of rectangular elements of equal size, in either horizontal or vertical orientations (Fig. 1). Two scenes, differing only in the orientation of one of the elements, were presented in alternating order with a blank scene inserted in between. The duration of the two patterned scenes was called the on time; the duration of the blank scene was called the off time.

The apparatus for haptic cueing was a 3-by-3 vibrotactile display developed at the Purdue Haptic Interface Research Laboratory. The tactor array is draped over the back of an office chair (Fig. 2). For the experiments reported here, only the four corner tactors (i.e., tactors No. 1, 3, 7, and 9 in Fig. 2) were used. Each tactor could be independently driven by a 60-ms sinusoidal pulse (see the sketch below). The frequency of the pulse was between 290 and 360 Hz, corresponding to the resonant frequencies of the four tactors. The intensity of the vibration was between 26.1 and 27.9 dB SL (sensation level) under unloaded conditions.

Figure 1. The two visual scenes used in our change-detection experiments (modified from Fig. 2 in Rensink, 2000).

Figure 2. The haptic cueing system: a 3-by-3 tactor array draped over the back of a chair.

2.2 Subjects

Ten college students, 5 females and 5 males, participated in the experiment as paid research participants. The average age of the subjects was 21 years. All subjects had normal or corrected-to-normal vision, and reported no known abnormalities of tactile perception on their back.
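As a minimal illustration of the tactile cue described in Section 2.1, the sketch below generates a 60-ms sinusoidal drive pulse. The function name, sample rate, and amplitude handling are our own assumptions rather than details from the paper; calibrating the output to the reported 26.1-27.9 dB SL would depend on the particular tactors, amplifier, and per-subject detection thresholds.

```python
import numpy as np

def tactor_pulse(freq_hz, duration_s=0.060, sample_rate=8000, amplitude=1.0):
    """Sinusoidal drive signal for a single tactor.

    freq_hz would be set to the tactor's resonant frequency
    (290-360 Hz for the four corner tactors used in this study).
    Amplitude is a plain scale factor; mapping it to dB SL is
    hardware- and subject-specific (an assumption, not from the paper).
    """
    t = np.arange(int(duration_s * sample_rate)) / sample_rate
    return amplitude * np.sin(2 * np.pi * freq_hz * t)

# Example: a 60-ms pulse for a tactor resonating at 290 Hz.
pulse = tactor_pulse(290.0)
print(len(pulse))  # 480 samples at 8 kHz
```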

2.3 Procedures

Before the experiments began, subjects were informed of the nature of the task. Specifically, they were told that they needed to locate a rectangular element on the computer screen that was changing its orientation, and that their job was to locate and identify this element as quickly as possible.

To ensure that the subjects could clearly feel the vibrations presented by the tactor array on their back, and that they could correctly associate each tactor location with the corresponding quadrant of the computer screen, an absolute identification experiment was conducted with the tactor array before each new session. The subject's task was to click on one of the four quadrants of the monitor (represented by four large rectangles) in response to a vibration on the back; for example, the correct response to a vibration near the right shoulder would be to click on the upper-right quadrant of the monitor. Each subject had to complete one perfect run (i.e., 100% correct) of 60 trials before starting the visual change-detection task. This test with the tactor array was repeated each time the subject left and returned to the chair.

During the visual change-detection task, the subjects were instructed to click the left mouse button as soon as the changing element was found (without moving the cursor over the element). The screen then froze and the color of all elements turned from white to pink. The subjects were then required to make a second mouse click with the cursor centered on the element that they perceived to change orientation. The time of the first mouse click was recorded as the reaction time; the x-y position of the second click was used to discard trials in which the wrong element was identified.

The independent variables were the state of the tactors (OFF or ON) and the on time (80, 480, and 800 ms). Three 60-trial runs were conducted for each experimental condition and each subject, and the order of the eighteen runs (2 tactor states × 3 on times × 3 runs) was randomized. For all experimental conditions, the off time was fixed at 120 ms. The total number of rectangular elements was fixed at 120 (i.e., 30 elements per quadrant). The x-y positions of the elements were chosen randomly within each quadrant, with the constraint that the elements never overlapped. For the experiments in which the tactors were ON (i.e., haptic cueing was present), the interstimulus interval (ISI: the interval from the time the tactor was turned off to the time the first scene was shown on the monitor) was fixed at 50 ms. The percentage of trials with valid haptic cues (i.e., trials in which the location of the vibrating tactor coincided with the quadrant where the changing element occurred) was fixed at 50%. Our subjects were aware that the haptic cue might or might not be valid on any particular trial; they were left to decide on their own whether and how to use the information provided by the haptic cues. (A sketch of this randomization appears below.)

Throughout the experiments, subjects were instructed to sit upright with their back pressed against the tactor array, not to move their body relative to the chair, and not to move the chair relative to the monitor. Headphones were used to block any audible noise from the tactor array. Each subject typically finished all the experiments within 2-3 sessions.
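The following sketch illustrates the randomization described in Section 2.3: the shuffled order of the 18 runs, an even split of valid and invalid cues within a cued run, and non-overlapping element placement by rejection sampling. The quadrant and element dimensions, helper names, and bounding-box overlap test are illustrative assumptions; the paper specifies only the counts and percentages.

```python
import random
from itertools import product

ON_TIMES_MS = [80, 480, 800]
TACTOR_STATES = ["OFF", "ON"]
RUNS_PER_CONDITION = 3
TRIALS_PER_RUN = 60

def run_order(rng):
    """Randomized order of the 18 runs (2 tactor states x 3 on times x 3 runs)."""
    runs = [(state, on_time)
            for state, on_time in product(TACTOR_STATES, ON_TIMES_MS)
            for _ in range(RUNS_PER_CONDITION)]
    rng.shuffle(runs)
    return runs

def cue_validity(rng, n_trials=TRIALS_PER_RUN):
    """For a tactor-ON run: exactly half the trials receive a valid cue."""
    valid = [True] * (n_trials // 2) + [False] * (n_trials - n_trials // 2)
    rng.shuffle(valid)
    return valid

def place_elements(rng, n_per_quadrant=30, quad_w=320, quad_h=240,
                   elem_w=20, elem_h=8):
    """Random non-overlapping positions, 30 elements per quadrant
    (rejection sampling over equal bounding boxes; sizes are illustrative)."""
    positions = []
    for qx, qy in [(0, 0), (quad_w, 0), (0, quad_h), (quad_w, quad_h)]:
        placed = []
        while len(placed) < n_per_quadrant:
            x = qx + rng.uniform(0, quad_w - elem_w)
            y = qy + rng.uniform(0, quad_h - elem_h)
            # Two equal-size rectangles overlap only if they are close
            # in both x and y; require separation along at least one axis.
            if all(abs(x - px) >= elem_w or abs(y - py) >= elem_h
                   for px, py in placed):
                placed.append((x, y))
        positions.extend(placed)
    return positions

rng = random.Random(2001)
print(run_order(rng)[:3])        # first three runs of a randomized session
print(sum(cue_validity(rng)))    # 30 valid cues out of 60 trials
print(len(place_elements(rng)))  # 120 elements in total
```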
2.4 Data Analysis

The dependent variables were the mean reaction time and its standard error. For each of the six experimental conditions tested (2 tactor states × 3 on times), data from all subjects were pooled. Data from the tactor-OFF condition served as a baseline measure of reaction time. Data from the tactor-ON condition were separated into two subgroups, those with valid haptic cues and those with invalid cues, and mean reaction times for the two groups of trials were computed separately. All error trials (those in which the subject selected the wrong rectangular element with the second mouse click) were discarded.

3. RESULTS

In general, our results show that reaction time decreased significantly with valid haptic cues and increased with invalid haptic cues. For example, results for one subject (S5) are shown in Fig. 3. The average reaction time for each experimental condition increased monotonically with the value of the on time. Compared to the baseline measures (i.e., reaction times with no haptic cues, shown as filled diamonds), reaction time decreased with valid haptic cues (filled circles) and increased with invalid haptic cues (filled triangles). This was true for each of the ten subjects tested. The extent to which valid or invalid haptic cues decreased or increased reaction time varied from subject to subject. For example, Fig. 4 shows data from another subject (S9) with a lower baseline measure than S5 (i.e., faster responses without haptic cues). Subject S9 benefited less from valid haptic cues than S5 (a smaller decrease in reaction time) and was also less distracted by invalid haptic cues (a smaller increase in reaction time). Subjects S5 and S9 represent the two most extreme observers among the ten subjects tested. We hasten to point out that, despite the obvious differences in the data shown in Figs. 3 and 4, our general conclusion regarding the effect of valid and invalid haptic cues on the reaction time of our visual change-detection task is still valid.

Figure 3. Mean reaction times and standard errors for subject S5.

Figure 4. Mean reaction times and standard errors for subject S9.

Results averaged over all ten subjects are shown in Fig. 5. As with the individual subjects' data, mean reaction time increased as the on time increased for the valid-cue, invalid-cue, and no-cue conditions. Overall, compared with the baseline measures, reaction time decreased by 163 ms (4.6%) with valid haptic cues and increased by 781 ms (18.9%) with invalid haptic cues. All standard errors are relatively small compared to the reaction-time values.

One interesting observation from Figs. 3 and 4 is that the data points for the valid-cue condition for subjects S5 and S9 are quite similar, despite the large differences in reaction time for the invalid-cue and no-cue conditions. To investigate this further, standard deviations of reaction time were computed from data pooled over all ten subjects (Fig. 6). Indeed, the standard deviations for the valid-cue condition are lower than those for the other two conditions across the three on-time values tested. We therefore conclude, based on the limited data we have collected, that valid haptic cueing reduces the inter-subject variability of response time in the visual change-detection task employed in this study.

Finally, the number of error trials varied among the subjects tested, with a range of 0-9 per experimental run of 60 trials. Averaged across the subjects, there were fewer than 4 error trials per 60-trial run.

Figure 5. Mean reaction times and standard errors averaged over all ten subjects.

Figure 6. Standard deviations of reaction time from data pooled over all subjects.
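A minimal sketch of the analysis described in Section 2.4: discard error trials, group the remaining trials by on time and cue validity, and compute the mean reaction time with its standard error. The record layout and function name are our own assumptions for illustration.

```python
import statistics
from collections import defaultdict

def summarize(trials):
    """Mean reaction time and standard error per condition.

    `trials` is an iterable of dicts with keys 'on_time_ms',
    'cue' (one of 'none', 'valid', 'invalid'), 'rt_ms', and
    'correct' (whether the second click landed on the changed
    element). Error trials are discarded, as in Section 2.4.
    """
    groups = defaultdict(list)
    for t in trials:
        if t["correct"]:
            groups[(t["on_time_ms"], t["cue"])].append(t["rt_ms"])
    summary = {}
    for key, rts in groups.items():
        mean = statistics.fmean(rts)
        sem = statistics.stdev(rts) / len(rts) ** 0.5 if len(rts) > 1 else 0.0
        summary[key] = (mean, sem)
    return summary

# Tiny demo with made-up numbers; the incorrect trial is dropped.
demo = [
    {"on_time_ms": 80, "cue": "valid", "rt_ms": 3100.0, "correct": True},
    {"on_time_ms": 80, "cue": "valid", "rt_ms": 3400.0, "correct": True},
    {"on_time_ms": 80, "cue": "invalid", "rt_ms": 4600.0, "correct": False},
]
print(summarize(demo))  # approximately {(80, 'valid'): (3250.0, 150.0)}
```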

4. CONCLUSIONS

In this study, we examined the extent to which haptic spatial cues can speed up or slow down an observer's reaction time in detecting a change in a visual scene. Our data suggest that (1) valid haptic cues decrease the reaction time for detecting a visual change, and (2) invalid haptic cues increase the reaction time for the same task, though to a lesser degree. This general conclusion holds for data from individual subjects as well as for pooled data from all subjects, and it holds despite inter-subject differences in natural reaction time (i.e., some subjects tend to react faster than others when no haptic cues are present). Similar results have been reported for visual spatial cueing of a visual change-detection task (Scholl, 2000). Finally, we have some evidence suggesting that valid haptic cues decrease the inter-subject variability of reaction time.

Our results have implications for designers of multimodal interfaces. In an automobile, for example, a haptic display built into the driver's seat could alert the driver to an impending collision on one side of the car. In a large and complex visual display for air traffic control, a haptic display used in conjunction with a non-invasive eye-tracking system could remind the operator to look at a neglected area of the display, or to pay attention to an area with busy traffic. In general, haptic cueing can provide an effective alternative to visual and auditory cueing in a complex information display.

ACKNOWLEDGEMENT

This work has been partly supported by a gift fund from Nissan Research & Development, Inc., a grant from Honda R&D Americas, Inc., and a National Science Foundation Faculty Early Career Development (CAREER) Award under Grant No. 9984991-IIS. The authors wish to thank Drs. Ron Rensink and Ian Thornton for many insightful discussions on this project.

REFERENCES

Ertan, S., Lee, C., Willets, A., Tan, H. Z., & Pentland, A. (1998). A wearable haptic navigation guidance system. Digest of the Second International Symposium on Wearable Computers (pp. 164-165). Piscataway, NJ: IEEE Computer Society.

Gray, R., & Tan, H. Z. (2000). Dynamic spatial mapping between vision and touch. Abstracts of the Psychonomic Society, 5, 38.

Miller, G. A. (1956). The magical number seven, plus or minus two: Some limits on our capacity for processing information. The Psychological Review, 63(2), 81-97.

Oviatt, S. (1999). Ten myths of multimodal interaction. Communications of the ACM, 42(11), 74-81.

Rensink, R. A. (2000). Visual search for change: A probe into the nature of attentional processing. Visual Cognition, 7(1-3), 345-376.

Rensink, R. A., O'Regan, J. K., & Clark, J. J. (1997). To see or not to see: The need for attention to perceive changes in scenes. Psychological Science, 8(5), 368-373.

Scholl, B. J. (2000). Attenuated change blindness for exogenously attended items in a flicker paradigm. Visual Cognition, 7, 377-396.

Simons, D. J., & Levin, D. T. (1998). Failure to detect changes to people during a real-world interaction. Psychonomic Bulletin and Review, 5, 644-649.

Tan, H., Lim, A., & Traylor, R. (2000). A psychophysical study of sensory saltation with an open response paradigm. In S. S. Nair (Ed.), Proceedings of the Ninth (9th) International Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, American Society of Mechanical Engineers Dynamic Systems and Control Division (Vol. 69-2, pp. 1109-1115). New York: ASME.

Tan, H. Z., Lu, I., & Pentland, A. (1997). The chair as a novel haptic user interface. In M. Turk (Ed.), Proceedings of the Workshop on Perceptual User Interfaces (PUI '97) (pp. 56-57). Banff, Alberta, Canada, Oct. 19-21.
Tan, H. Z., & Pentland, A. (1997). Tactual displays for wearable computing. Personal Technologies, 1, 225-230.

Tan, H. Z., & Pentland, A. (2001). Tactual displays for sensory substitution and wearable computers. In W. Barfield & T. Caudell (Eds.), Fundamentals of Wearable Computers and Augmented Reality (Chap. 18, pp. 579-598). Mahwah, NJ: Lawrence Erlbaum Associates.