Crossmodal Attention & Multisensory Integration: Implications for Multimodal Interface Design
Charles Spence, Department of Experimental Psychology, Oxford University

In the Realm of the Senses
- Wickens (1980, 1984, 1988, 1992...): the structure of human processing resources
- The majority of information is currently presented visually
- Are there any costs to monitoring more than one sensory channel?
- Left/right discrimination task: auditory (loudspeaker), tactile (vibrator) & visual (light) targets

Performance Costs Associated with Attending to Multiple Modalities

  Possible   | Presented  | Cost (ms)
  Audition   | Vision     | 55
  Touch      | Vision     | 104
  Touch      | Audition   | 98
  Vision     | Audition   | 68
  Audition   | Touch      | 67
  Vision     | Touch      | 66

- Touch is "sticky": the largest costs arise when attention must be shifted away from touch
Shadowing Performance
[Figure: % correct shadowing (30-70%) for relevant & irrelevant auditory/visual streams presented from the same vs different positions, with chewing-lips vs speaking-lips video]
1) Lip-reading facilitates shadowing
2) Performance is better when the auditory & visual information come from the same position

Don't Dial & Drive?
- Leeds Advanced Driving Simulator (Spence & Read, 2003)
[Figure: simulator screen, with shadowed speech presented from the front vs the side]

Multisensory Warning Signals
Multisensory Integration
- "...there is no animal in which there is known to be a complete segregation of sensory processing" (Stein et al., 1996)
- Superadditivity
- Multisensory enhancement
- Multisensory suppression
- You simply cannot predict multisensory perception by studying the senses in isolation

Multisensory Perception
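The enhancement and suppression listed above are conventionally quantified with Stein & Meredith's multisensory enhancement index (the talk cites Stein et al., 1996). A minimal sketch in Python; the response values below are made-up illustrative numbers, not data from the talk:

```python
# Multisensory enhancement index: percentage change of the multisensory
# response relative to the most effective unisensory response.
# Positive values = enhancement; negative values = suppression.

def enhancement_index(multisensory: float, best_unisensory: float) -> float:
    """100 * (CM - SMmax) / SMmax, with CM the combined-modality response
    and SMmax the strongest single-modality response."""
    return 100.0 * (multisensory - best_unisensory) / best_unisensory

# Two weak unisensory responses combining superadditively:
print(enhancement_index(multisensory=12.0, best_unisensory=4.0))  # 200.0 (enhancement)
# A combined response below the best unisensory response:
print(enhancement_index(multisensory=3.0, best_unisensory=4.0))   # -25.0 (suppression)
```

Note that a positive index alone shows enhancement; superadditivity specifically means the combined response exceeds the *sum* of the unisensory responses.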
Multisensory Motion Perception
- Apparatus: loudspeaker cones & LEDs (30 cm apart), plus a fixation LED; displays presented every 2 s until response
- Task: report the direction of auditory motion

Crossmodal Dynamic Capture
[Figure: proportion correct (0-1.00) and number of display updates before response (1.50-2.50), for synchronous vs asynchronous audiovisual motion]
- Auditory motion perception is compromised by the synchronous presentation of visual motion in the opposite direction

Rules of Multisensory Integration
- Superadditivity: weak stimuli interact synergistically when presented from the same location at about the same time
- Subadditivity: when these conditions are not met
- Sensory dominance: vision for space, hearing for time, olfaction for the appetitive, touch & olfaction for the affective

Incorporation & Embodiment
- Changing the perception of touch with sound? (dry vs hydrated skin)
- Virtual body effect (shadows)
- Tool-use (computer mice / laser pointers)
Multisensory Synchronization
- When should you present multisensory stimuli?
[Figure: simultaneity-judgment apparatus: headphones, product, microphone, dimension scale & footpedals]

Perception of Simultaneity
[Figure: proportion of "simultaneous" responses (0-1) against stimulus onset asynchrony (-200 ms, sound first, to +200 ms, vision first), for same vs different positions]
- Wide temporal window of multisensory integration
- The perception of simultaneity is enhanced when the stimuli come from the same location

Biophysics vs Physics
- Biophysics: transduction latencies; auditory neural processing is fast, visual processing is slow
- Physics: light travels faster than sound, so distant events are seen first
- Horizon of simultaneity: physics cancels out biophysics at about 10 m

Multisensory Synchronization
- Most interfaces are closer than 10 m
- Simultaneous presentation of multisensory signals doesn't assure the perception of simultaneity
- Desynchronizing the inputs might enhance multisensory integration & perception (e.g., for warning signals)
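The trade-off behind the horizon of simultaneity can be sketched numerically: sound travels slowly but is transduced quickly, while light arrives almost instantly but is transduced slowly. The latency constants below are illustrative assumptions (chosen to reproduce the ~10 m figure on the slide), not values from the talk:

```python
# Horizon of simultaneity: the distance at which sound's slower travel
# time exactly cancels vision's slower neural transduction, so both
# signals reach awareness together.

SPEED_OF_SOUND = 343.0    # m/s in air at ~20 degrees C
VISUAL_LATENCY = 0.050    # s, assumed visual transduction latency
AUDITORY_LATENCY = 0.020  # s, assumed auditory transduction latency

def perceived_lag(distance_m: float) -> float:
    """Auditory-minus-visual arrival time at perception, in seconds.
    Light's travel time is negligible at interface distances."""
    return (distance_m / SPEED_OF_SOUND + AUDITORY_LATENCY) - VISUAL_LATENCY

def horizon_of_simultaneity() -> float:
    """Distance at which perceived_lag() == 0."""
    return (VISUAL_LATENCY - AUDITORY_LATENCY) * SPEED_OF_SOUND

print(f"Horizon of simultaneity: {horizon_of_simultaneity():.1f} m")
```

Closer than the horizon (i.e., at typical interface distances), the auditory signal wins the race to awareness, which is why physically simultaneous presentation need not be perceived as simultaneous, and why deliberately desynchronizing the inputs might help.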
Multisensory Entertainment
- "Most designers have gotten to the point in production where the decision is made to hit the viewer with everything they've got. The big sounds, the dramatic slam of music from the dead silence, the sudden appearance of the beast. And the kids sit there saying 'been there... done that... ho hum...'" (Ralph Thomas, "Nothing to sniff at?", 2002)

Olfactory Interfaces?
- Reducing the symptoms of road rage
- Alerting drowsy drivers
- A burnt-rubber smell for bad drivers
- An olfactory console so that drivers can choose a smell to suit their mood/surroundings
- The technology to introduce smell to the PC is available (DigiScents failed; Arvel, Japan)

Conclusions
- Attention & multisensory integration critically determine perception & behavior
- There are spatial constraints on focused & divided attention between hearing, sight & touch
- Multisensory temporal synchrony matters
- Understanding multisensory interactions will lead to better interface design
- From intuition to understanding via cognitive neuroscience

Aging & Multisensory Perception
- By 2025, more than a billion people will be over 60 (US Senate Special Committee on Aging, 1985-1986)

IMRF 5th Annual Meeting, Barcelona, June 2-5, 2004
www.multisense.info/2004
Contact: imrf2004@psi.ub.es