A Vestibular Sensation: Probabilistic Approaches to Spatial Perception (II) Presented by Shunan Zhang


Vestibular Responses in Dorsal Visual Stream and Their Role in Heading Perception
- Recent experiments identified a potential neural correlate of heading perception in area MSTd of the dorsal visual stream.
- More than half of MSTd neurons are multimodal: they are tuned both to optic flow and to translational motion in darkness.
- Previous studies showed that neuronal responses in darkness are driven by vestibular signals.

Vestibular Responses in Dorsal Visual Stream and Their Role in Heading Perception
- Two groups of multimodal MSTd cells:
  - Congruent neurons: similar visual/vestibular preferred directions; they signal the same motion direction in 3D under both unimodal (visual or vestibular) stimulus conditions.
  - Opposite neurons: prefer nearly opposite directions under visual and vestibular stimulus conditions.

Vestibular Responses in Dorsal Visual Stream and Their Role in Heading Perception
- Neural activity in a 2AFC heading discrimination task showed that congruent and opposite neurons play different, but complementary, roles in heading perception.
- When the cues were combined (bimodal condition), tuning became steeper for congruent cells but shallower for opposite cells.

Vestibular Responses in Dorsal Visual Stream and Their Role in Heading Perception
- The effect on tuning steepness is reflected in the ability of an ideal observer to use a congruent or opposite neuron's firing rate to discriminate heading under visual, vestibular, and combined conditions.
- For congruent neurons, the neurometric function was steepest in the combined condition: they became more sensitive, i.e., could discriminate smaller variations in heading when both cues were provided.
- Opposite neurons are less sensitive during bimodal stimulation.
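The ideal-observer logic can be sketched numerically. The baseline rate, tuning gains, and Poisson-rate model below are hypothetical, not taken from the recorded MSTd data; the sketch only shows that steeper tuning (as for congruent cells under combined cues) produces a steeper neurometric function.

```python
import numpy as np

rng = np.random.default_rng(0)

def ideal_observer_pc(r_left, r_right):
    """ROC area: probability that a random rightward-trial rate exceeds a
    random leftward-trial rate (ties split), i.e. the ideal observer's
    percent correct in the 2AFC heading task."""
    a, b = r_left[:, None], r_right[None, :]
    return float(np.mean(b > a) + 0.5 * np.mean(b == a))

def responses(heading_deg, gain, n=400):
    """Hypothetical cell: 20 spikes/s baseline plus linear tuning to
    heading, with Poisson spike-count noise."""
    mean = max(20.0 + gain * heading_deg, 0.1)
    return rng.poisson(mean, n)

headings = [1, 2, 4, 8]  # degrees left/right of straight ahead
neurometric = {
    label: [ideal_observer_pc(responses(-h, g), responses(+h, g))
            for h in headings]
    for label, g in [("single-cue", 0.5), ("combined", 1.0)]
}
```

Steeper tuning lifts the whole curve: the "combined" observer reaches a given percent correct at a smaller heading, i.e. it has a lower neurometric threshold.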

Vestibular Responses in Dorsal Visual Stream and Their Role in Heading Perception
- Average neuronal thresholds for bimodal stimulation:
  - Congruent MSTd cells: lower than either single-cue threshold.
  - Opposite cells: higher.
- Responses in the vestibular condition were significantly correlated with perceptual decisions:
  - Correlations were strongest for the most sensitive neurons.
  - Only congruent cells were significantly correlated with the monkey's heading judgments in the bimodal condition.
  - Congruent cells might be monitored selectively by the monkey to achieve near-optimal performance under cue combination.
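The "lower than either single-cue threshold" benchmark comes from optimal (maximum-likelihood) cue integration, where thresholds combine like the standard deviations of two independent cues. A minimal sketch with made-up threshold values:

```python
import numpy as np

def optimal_combined_threshold(t_ves, t_vis):
    """Maximum-likelihood prediction for two independent cues:
    sigma_comb^2 = sigma_ves^2 * sigma_vis^2 / (sigma_ves^2 + sigma_vis^2),
    with thresholds proportional to the sigmas."""
    return np.sqrt((t_ves**2 * t_vis**2) / (t_ves**2 + t_vis**2))

# hypothetical single-cue heading thresholds (degrees)
t_ves, t_vis = 3.5, 2.5
t_comb = optimal_combined_threshold(t_ves, t_vis)
```

The prediction is always below the better single cue, matching the behavior of congruent cells; opposite cells, whose bimodal thresholds rise instead, deviate from this optimal rule.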

Vestibular Responses in Dorsal Visual Stream and Their Role in Heading Perception
- Recent studies trained animals to perform the heading discrimination task with the visual and vestibular cues put in conflict:
  - Bimodal neural responses were well fit by a weighted linear sum of the unimodal vestibular and visual responses.
  - The weights depended on visual coherence: MSTd neurons appear to give more weight to the stronger cue and less weight to the weaker cue.
- In summary, the MSTd experiments illustrate two general approaches to the neural basis of vestibular multisensory self-motion perception:
  - Quantify cortical vestibular responses in the context of a perceptual task performed around psychophysical threshold.
  - Study neuronal visual/vestibular cue integration in a behaviorally relevant way.
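The weighted-linear-sum model can be illustrated with an ordinary least-squares fit. The tuning curves and weights below are synthetic, not measured data; the sketch only shows how the two cue weights are recovered from the unimodal and bimodal responses.

```python
import numpy as np

rng = np.random.default_rng(1)

# hypothetical unimodal tuning curves sampled at 8 headings (spikes/s)
r_ves = np.array([12., 18., 25., 30., 28., 22., 15., 10.])
r_vis = np.array([30., 33., 25., 18., 12., 8., 10., 20.])

# synthetic bimodal response: the visual cue gets the larger weight,
# as observed for high-coherence visual stimuli
w_true = np.array([0.35, 0.75])
r_bi = w_true[0] * r_ves + w_true[1] * r_vis + rng.normal(0.0, 0.5, 8)

# least-squares fit of the model r_bi ~ w_ves * r_ves + w_vis * r_vis
X = np.column_stack([r_ves, r_vis])
w_hat, *_ = np.linalg.lstsq(X, r_bi, rcond=None)
```

Refitting at several visual coherences would yield a visual weight that rises with coherence, which is the signature of coherence-dependent reweighting described above.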

Contribution of Vestibular Signals to Body Tilt Perception and Spatial Orientation
- Humans and other primates orient themselves using gravity.
- Spatial orientation: our (change in) orientation relative to gravity (tilt).
- Subjects with defects in their vestibular system have severe spatial orientation deficits.
- Einstein's equivalence principle: inertial accelerations experienced during self-motion are physically indistinguishable from accelerations due to gravity.
- Otolith organs are linear acceleration sensors: they detect net acceleration but cannot distinguish its source.

Translation Interpreted as Tilt
- The vestibular system cannot disambiguate tilts relative to gravity from inertial accelerations at low frequencies.
- The brain's default solution is to interpret low-frequency linear accelerations as tilt.
- A Bayesian framework explains the prevalence of tilt perception via a prior for zero inertial acceleration:
  - At high frequencies, the vestibular sensory likelihood function is relatively narrow, so the sensory evidence dominates.
  - At low frequencies, the likelihood is relatively broad, so the prior dominates and the signal is perceived as tilt.
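A one-dimensional sketch of the zero-acceleration prior (all widths and signal values hypothetical): the otolith measurement is split between a translation estimate, the posterior mean under a Gaussian likelihood and a zero-mean Gaussian prior, and a tilt estimate, the remainder.

```python
def translation_estimate(m, sigma_lik, sigma_prior=0.3):
    """Posterior mean of inertial acceleration given otolith signal m,
    a Gaussian likelihood of width sigma_lik, and a zero-mean Gaussian
    prior on acceleration (shrinkage toward zero)."""
    k = sigma_prior**2 / (sigma_prior**2 + sigma_lik**2)
    return k * m

m = 1.0  # net gravito-inertial acceleration, m/s^2 (hypothetical)
a_high = translation_estimate(m, sigma_lik=0.1)  # high frequency: narrow likelihood
a_low  = translation_estimate(m, sigma_lik=1.0)  # low frequency: broad likelihood
tilt_high, tilt_low = m - a_high, m - a_low      # leftover attributed to tilt
```

With a narrow high-frequency likelihood the evidence wins and the signal is perceived mostly as translation; with a broad low-frequency likelihood the prior wins and most of the same signal is reinterpreted as tilt.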

Tilt Interpreted as Translation
- Off-vertical axis rotation (OVAR)

Role of Vestibular Signals in the Estimation of Visual Vertical
- Orientation constancy: the ability to maintain an accurate percept of allocentric (earth-vertical) visual orientation despite changes in head orientation; the neural representation of the visual scene is modified by static vestibular/proprioceptive signals that indicate the orientation of the head/body.
- Static vestibular/somatosensory cues can generate a robust percept of earth-vertical (i.e., which way is up?).
- A-effect: underestimation of the true vertical orientation for tilts greater than about 70°.
- E-effect: overestimation of the subjective vertical at small tilt angles.

Role of Vestibular Signals in the Estimation of Visual Vertical

Role of Vestibular Signals in the Estimation of Visual Vertical
- Bayesian approach to interpreting noisy sensory information:
  - The estimate of the visual vertical is biased by an a priori assumption about the probability of a particular tilt.
  - The subjective vertical is assumed most likely to be aligned with the long axis of the body.
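The same shrinkage arithmetic sketches the A-effect (all widths hypothetical): a prior centered on the body's long axis pulls the tilt estimate toward zero, and if the sensory likelihood broadens at large tilts, the underestimation grows with tilt angle.

```python
def subjective_tilt(true_tilt_deg, sigma_sensor, sigma_prior=20.0):
    """Posterior-mean tilt estimate: Gaussian likelihood centered on the
    true tilt, Gaussian prior centered on the body axis (0 deg)."""
    k = sigma_prior**2 / (sigma_prior**2 + sigma_sensor**2)
    return k * true_tilt_deg

small = subjective_tilt(10.0, sigma_sensor=5.0)    # near-veridical estimate
large = subjective_tilt(90.0, sigma_sensor=30.0)   # A-effect: strong underestimate
```

This zero-centered prior captures only the A-effect; the E-effect at small angles is not produced by this shrinkage alone and needs additional machinery beyond the sketch.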

Visual Constancy and Spatial Updating
- Visuospatial constancy: perception of a stable visual world despite constantly changing retinal images caused by eye, head, and body movements.
- A typical spatial updating paradigm for passive movements:
  1. The subject fixates a central head-fixed target.
  2. A peripheral space-fixed target is briefly flashed.
  3. The subject is either rotated or translated to a new position (while maintaining fixation on the head-fixed target, which moves along with them).
  4. The subject makes a saccade to the remembered location of the space-fixed target.
- Poor performance suggests an inability to integrate stored vestibular signals with retinal information.

Visual Constancy and Spatial Updating
- Subjects localize remembered, space-fixed targets better after roll rotations.
- Spatial updating about the roll axis from an upright orientation was ten times more accurate than updating about the roll axis in a supine orientation.
- Roll rotations from upright likely benefit from dynamic gravitational cues, resulting in relatively accurate memory saccades.
- In contrast, subjects only partially update the remembered locations of visual targets after yaw rotations.

Visual Constancy and Spatial Updating
- Both humans and trained macaques can compensate for traveled distance in depth, making vergence eye movements appropriate for the final position of the subject relative to the target.
- Trained animals lose their ability to properly adjust memory vergence angle after destruction of the vestibular labyrinths.
- Previous studies also suggest a dominant role of otolith signals in processing both self-motion information and spatial updating in depth.
- The neural basis of how vestibular information changes the goal of memory-guided eye movements remains to be explored.

Concluding Remarks
- Vestibular-related activity is found in multiple regions of the cerebral cortex, and most of these neurons are multisensory.
- Characterize vestibular responses by both their mean values and their trial-to-trial variability.
- To understand the functional significance of the diverse cortical representations of vestibular information, test vestibular signals in behaviorally relevant tasks.
- From a computational standpoint, it is also important to determine how well the Bayesian framework explains the diversity of behavioral data:
  - Do our brains really make use of the variability in neuronal firing?
  - How realistic is it that our brains actually implement such a complex framework?