Gravitoinertial Force Magnitude and Direction Influence Head-Centric Auditory Localization


PAUL DIZIO,1 RICHARD HELD,2 JAMES R. LACKNER,1 BARBARA SHINN-CUNNINGHAM,4 AND NATHANIEL DURLACH3

1Ashton Graybiel Spatial Orientation Laboratory and Volen Center for Complex Systems, Brandeis University, Waltham, Massachusetts; 2Department of Brain and Cognitive Science and 3Research Laboratory of Electronics, Department of Electrical Engineering, Massachusetts Institute of Technology, Cambridge 02139; and 4Center for Adaptive Systems, Boston University, Boston, Massachusetts

Received 18 September 2000; accepted in final form 16 February 2001

DiZio, Paul, Richard Held, James R. Lackner, Barbara Shinn-Cunningham, and Nathaniel Durlach. Gravitoinertial force magnitude and direction influence head-centric auditory localization. J Neurophysiol 85: 2455–2460, 2001. We measured the influence of gravitoinertial force (GIF) magnitude and direction on head-centric auditory localization to determine whether a true audiogravic illusion exists. In experiment 1, supine subjects adjusted computer-generated dichotic stimuli until they heard a fused sound straight ahead in the midsagittal plane of the head under a variety of GIF conditions generated in a slow-rotation room. The dichotic stimuli were constructed by convolving broadband noise with head-related transfer function pairs that model the acoustic filtering at the listener's ears. These stimuli give rise to the perception of externally localized sounds. When the GIF was increased from 1 to 2 g and rotated 60° rightward relative to the head and body, subjects on average set an acoustic stimulus 7.3° right of their head's median plane to hear it as straight ahead. When the GIF was doubled and rotated 60° leftward, subjects set the sound 6.8° leftward of baseline values to hear it as centered. In experiment 2, increasing the GIF in the median plane of the supine body to 2 g did not influence auditory localization.
In experiment 3, tilts up to 75° of the supine body relative to the normal 1 g GIF led to small shifts, 1–2°, of auditory setting toward the up ear to maintain a head-centered sound localization. These results show that head-centric auditory localization is affected by azimuthal rotation and increase in magnitude of the GIF and demonstrate that an audiogravic illusion exists. Sound localization is shifted in the direction opposite GIF rotation by an amount related to the magnitude of the GIF and its angular deviation relative to the median plane.

(Address for reprint requests: P. DiZio, Ashton Graybiel Spatial Orientation Laboratory, Brandeis University MS033, Waltham, MA. E-mail: dizio@brandeis.edu)

INTRODUCTION

Interaural timing, phase, and amplitude spectra are important cues for judging the azimuth of a broadband sound relative to the median plane of the head (cf. Blauert 1983; Colburn and Durlach 1978; Yost and Gourevitch 1987). The physical transformations that a sound waveform undergoes by interacting with the head and pinnae can be described by linear filters called head-related transfer functions (HRTFs) (Wightman and Kistler 1980). HRTFs are unique for each location around the head in humans because their ears are immobile. Binaural acoustical patterns are not the only factors influencing a sound source's perceived location. Proprioceptive and somatosensory information about target location derived from hand contact can also influence where an auditory stimulus is heard (Lackner and Shenker 1985). Head movements can help resolve auditory front-back ambiguities and the elevation of an external sound source (Wallach 1940). Their influence depends on relating movement-contingent auditory, proprioceptive, and vestibular signals. Head movements can also be used to recalibrate sound localization when pseudophones are worn that alter the auditory cues at the ears from an external sound source (Held 1955).
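The interaural timing cue discussed above is commonly approximated with Woodworth's rigid spherical-head model. The paper itself does not use this formula, so the following sketch is purely illustrative; the function name and the nominal head radius are my own choices:

```python
import math

def woodworth_itd(azimuth_deg, head_radius_m=0.0875, c_m_s=343.0):
    """Frequency-independent interaural time difference (ITD) for a rigid
    spherical head (Woodworth model), far-field source.

    azimuth_deg: source azimuth from the median plane (0 = straight ahead,
    90 = directly opposite one ear). Head radius and speed of sound are
    nominal textbook values, not parameters from this study.
    """
    theta = math.radians(azimuth_deg)
    # extra path length to the far ear: a * (sin(theta) + theta)
    return (head_radius_m / c_m_s) * (math.sin(theta) + theta)
```

For a source at 90° azimuth this gives roughly 0.65 ms, the familiar maximum ITD for an adult-sized head; the cue shrinks monotonically toward 0 at the median plane, which is why midline settings are a sensitive probe of lateralization.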
Rotary acceleration of the whole body also influences the perceived auditory azimuth of a sound stimulus. A blindfolded listener in a rotating chair will hear a head-fixed, midline sound source as moving and displacing relative to his or her head, a phenomenon known as the audiogyral illusion (Clark and Graybiel 1949). The auditory target will be heard to displace in the direction opposite self-rotation when the chair accelerates, to come back to the midline when constant velocity is maintained, and to displace again during deceleration (Arnoult 1952; Clark and Graybiel 1949; Lester and Morant 1970; Munsterberg and Pierce 1894). Thus during clockwise acceleration to constant velocity, a midline sound source will be heard to the left of the head's midline and then during deceleration will be heard to the right of midline.

Graybiel and Niven (1951) found that linear acceleration influenced auditory localization as well and referred to this as the audiogravic illusion. If an observer is seated off-center in a rotating room, radial centripetal forces combine with gravity to generate a resultant linear gravitoinertial force (GIF) vector greater than either component and oriented between the two. Graybiel and Niven had seated observers face the center of a slow rotation room and lean over 90° to one side. A ring of speakers was positioned in the head's azimuthal plane at 5° separations. As the room began to spin, the GIF was displaced in relation to the head in azimuth (see Fig. 1). The observers were asked to indicate which speaker emitted a sound in the apparent horizontal plane. When the room was stationary, the median plane of the laterally flexed head was horizontal, and observers correctly indicated a sound from the speaker located in that plane. When the room was spinning, the GIF was rotated inboard, and observers felt their whole body was tilted backward or outboard. They now identified as being in the horizontal plane sounds emitted by a speaker physically below the median plane of their head.

FIG. 1. Schematic illustration of the audiogravic illusion in the experiment of Graybiel and Niven (1951). The subject sat at the periphery of a rotating room facing the center, leaning over to their side. At constant velocity the resultant (GIF) of gravity (g) and centrifugal force (F_cent) rotated (filled arrow) 29.2° in azimuth and increased in magnitude to g/cos 29.2° (about 1.15 g). The physical sound source (open dot) that subjects selected as being in the apparent horizontal plane was actually below the horizontal plane (filled dot).

Howard and Templeton (1966) and Howard (1982) have argued that this effect is not an audiogravic illusion but represents accurate auditory localization with respect to a changed reference frame. In other words, the subject feels tilted in relation to the horizontal and thus chooses a speaker that is displaced in relation to his or her body by the extent of the apparent self-tilt.

Our goal in the present experiments was to determine whether a genuine audiogravic illusion exists such that sound localization vis-a-vis the head itself is altered. To do so, we measured head-centric auditory localization in azimuth during exposure to GIF transitions in a rotating room. Using head-relative instead of horizon-relative localization avoids the issue of a changed external reference frame for localization. We also used greater changes in GIF magnitude and direction and a more comfortable posture for the subjects than Graybiel and Niven. Our aims included determining whether changes in the angle of GIF, the magnitude of GIF, or a combination of the two lead to changes in head-relative auditory localization. We also wanted to observe the time course of any changes.

Whether a head-relative audiogravic illusion exists has important theoretical implications. Binaural acoustic information such as interaural time and spectral differences are intrinsically in a head-centric frame of reference in humans. By contrast, neural maps in the colliculus of cats, animals with motile ears, have receptive fields defined in head-centric coordinates that include compensation for ear movements (Middlebrooks and Knudsen 1987). Jay and Sparks (1984) have shown that the auditory receptive fields of visual-auditory units in the primate superior colliculus change as a function of eye position so that the auditory and visual maps stay in register. Spatially tuned auditory-visual neurons exist in the primate parietal cortex as well (Stricane et al. 1996). Psychophysical experiments have shown that eye position and head position can affect auditory localization in humans (Lewald and Ehrenstein 1996, 1998). In the cat auditory cortex, the representation of sound location is distributed over large populations of very broadly tuned neurons that respond to multiple acoustic parameters (Middlebrooks et al. 1998). Our investigation of vestibular and somatic influences on sound localization will further identify spatial reference frames for multi-modal neural coding. In addition, understanding audiogravic effects has potential practical applications in instruments providing orientation cues that pilots might use to prevent disorientation in unusual flight environments (Teas 1993).

METHODS

Experiment 1: twofold increase and 60° rotation of GIF

SUBJECTS. Fourteen subjects, nine males and five females, including two of the authors, participated. Their ages ranged from 20 to 55 yr. The selection criteria included self-reports of normal hearing, balance, and posture and no general health restrictions that would make exposure to 2.0 g hazardous.
The procedures used were approved by the Brandeis Human Subject Committee and were explained to subjects before they gave informed consent.

APPARATUS. The experiment was performed in the Graybiel Laboratory slow rotation room (SRR), a circular enclosure 6.7 m in diameter powered by electric motors. A dedicated controller with a computer interface permits the programming of desired angular velocity profiles. The on-board equipment included a bed that could be tilted around its long axis, provisions for subject restraint, a system for generating spatially localizable sounds, and a joystick for the subject to indicate responses. Figure 2A illustrates the experimental situation. The bed held the subject supine, with his or her interaural axis co-linear with a radius of the room. The head's midline was 2.5 m from the center of the room with either the right or left ear toward the center. The desired ear was oriented toward the center by repositioning the entire bed with respect to the SRR. A stiff, tight-fitting foam mold surrounded the back and sides of the head, and elastic straps across the forehead and chin further immobilized the head. Earphones (Sennheiser, Model HD540 II) were embedded in the head mold. The body rested on a stiff, form-fitting foam pad, and foam side pads restricted lateral movement. Adjustable braces at the shoulders, hips, knees, and ankles and a chest plate further restricted possible movement during rotation. (Video measurements of how much the head could move in the head-restraint system during increases in magnitude and rotations of the GIF for the conditions in the three experiments of the present report indicated a typical range under 0.5°.) A Crystal River Convolvotron II board mounted in a PC was used to present acoustic stimuli over the earphones that gave rise to the perception of external sounds. The input to the system was square-wave modulated (4 Hz), Gaussian white noise (500 Hz to 20 kHz) from a Sony digital audio tape player.
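The stimulus described above (4-Hz square-wave-gated Gaussian noise) and the idea of rendering it dichotically can be sketched as follows. The names `modulated_noise` and `toy_dichotic` are my own, and the delay-plus-attenuation pair is a deliberately crude stand-in for measured HRTFs, not the Convolvotron's actual filters:

```python
import numpy as np

def modulated_noise(duration_s=1.0, fs=44100, mod_hz=4.0, seed=0):
    """Square-wave (on/off) modulated Gaussian noise, as in the source stimulus.
    Bandlimiting to 500 Hz - 20 kHz is omitted for brevity."""
    rng = np.random.default_rng(seed)
    n = int(duration_s * fs)
    noise = rng.standard_normal(n)
    t = np.arange(n) / fs
    gate = (np.floor(2 * mod_hz * t) % 2 == 0).astype(float)  # 4 Hz on/off gating
    return noise * gate

def toy_dichotic(mono, itd_samples=20, ild_db=6.0):
    """Crude stand-in for one HRTF pair: the far ear receives the signal
    delayed and attenuated. Real HRTFs are direction-specific filters that
    also impose spectral cues; convolving with a measured pair replaces
    the delay-and-gain used here."""
    left = mono
    right = np.concatenate([np.zeros(itd_samples), mono])[: len(mono)]
    right = right * 10 ** (-ild_db / 20)
    return left, right
```

Adjusting the simulated azimuth in the real system amounts to swapping in the HRTF pair measured at the new angle, which is what the joystick controlled.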
The Convolvotron has a set of HRTFs containing all the spatial cues normally present in the signals reaching each ear canal from sources at different positions around a listener's head (Wenzel et al. 1993). The system generates signals for the left and right ears by convolving a monaural time series input with the pair of filters mapped to the desired location of the simulated sound. (Filters for nonmeasured positions are interpolated from adjacent measurements.) The subject could adjust the sound's location by means of a joystick connected to the PC through an A/D converter. Applying isometric torque to a sleeve (15 cm long, 2.5 cm diameter) around the joystick handle increased or decreased the azimuthal angle of the HRTF pair used by the Convolvotron. The rate of change in stimulus azimuth varied randomly between trials (12–18°/s). The subject depressed a button switch at the free end of the joystick handle to indicate when he or she was satisfied the sound was in the median plane of the head.

FIG. 2. A: illustration of the experimental setup in the rotating room. The bed is shown in the earth-horizontal position, but it could rotate subjects around their long axis. The contoured foam restraints for the head and body are not shown, for clarity. Inset: how the subject used the joystick for adjusting the sound location. B: time series plots of normalized room angular velocity, GIF magnitude above 1 g, and GIF angle relative to the subject's median plane in experiment 1. Pre- and postrotation, GIF magnitude is 1.0 g and GIF angle is aligned with the median plane (0°). When angular velocity is ramping up to 152°/s, the GIF magnitude rises exponentially to 2.0 g and GIF angle sigmoidally approaches 60° right of the median plane. The deceleration patterns are the reverse of acceleration.

PROCEDURE. The subjects were blindfolded throughout the experiment. They were always kept earth-horizontal while the magnitude and orientation of GIF were manipulated by rotating the SRR. In the prerotation baseline phase, the SRR was stationary while auditory settings to the midsagittal plane of the head were made over a 100-s period. In phase two, the SRR was accelerated at 1°/s² for 152 s, held at 152°/s for 100 s, and decelerated to a stop at 1°/s². At constant velocity the resultant GIF was tilted 60° with respect to the subject's median plane and had a magnitude of 2.0 g. The third phase, with the SRR stationary, followed immediately and lasted for 100 s. A fourth, identical phase followed after a 300-s delay. An entire run took 1,004 s. Figure 2B illustrates the SRR speed as well as GIF magnitude and direction during the rotation phase of a run. Figure 3A shows the experimental conditions during the baseline and constant velocity phases. Every 20 s during a run, the computer simulated a sound in the azimuthal plane at a random angle between 75° right and left of the head's midsagittal plane. The blindfolded subject's task was to use the joystick to bring the sound into the midsagittal plane of his or her head and then to press the thumb switch to indicate completion. The sound went off between trials. The subjects were given at least 20 prerotation practice trials before a run. They were warned that they would feel supine when the SRR was stationary and tilted when it was turning. They were instructed to make all auditory settings relative to their head rather than to external space. Five trials were run in the prerotation period (GIF equal to 1.0 g at 0° to the median plane), seven during acceleration, six during constant velocity (GIF equal to 2.0 g at 60° right of the median plane), seven during deceleration, and five each during the immediate and delayed postrotation periods. When a trial was completed, the computer saved the azimuthal angle of the stimulus chosen as being in the midline and the trial duration. In our nomenclature, 0° indicates stimulus settings in the median plane of the head, positive angles designate rightward settings relative to the head median plane, and negative leftward. Eleven subjects were positioned such that during rotation the GIF rotated rightward relative to their midsagittal plane (see Fig. 3) and eight were positioned so that it rotated leftward (including 5 of the 11, who were retested). Reversing the direction of GIF rotation relative to the subject was achieved by changing the orientation of the bed to have the subject's other shoulder toward the wall of the room. The interaural axis was always aligned with a radius of the room, as in Fig. 2A. We took the average of a subject's five prerotation midline settings and used this as the zero baseline for comparing the settings in the other periods of the experiment. The actual midline settings varied up to a degree from the ideal observer setting.

Experiment 2: twofold increase of GIF in the midsagittal plane

We again doubled GIF magnitude but arranged for it to rotate into rather than out of alignment with the subject's median plane. This allowed us to determine whether an increase in magnitude of GIF per se would elicit an audiogravic illusion or whether a displacement relative to the sagittal plane is also necessary.

FIG. 3. Schematic of experimental conditions. The same rotating room angular velocity profile was used in experiments 1 and 2. A: in experiment 1, the subject was always supine, so the GIF vector equaled 1.0 g in the median plane while the room was stationary. At constant velocity rotation, a centrifugal force directed into the right ear produced a resultant GIF of 2.0 g oriented 60° right of the median plane. If the subject lay supine with his or her left ear toward the center (not shown), then the GIF rotated 60° leftward relative to the subject's median plane. B: in experiment 2, subjects were tilted 60° right ear down about their long axis. With the room stationary, the GIF vector equaled 1.0 g, 60° left of the median plane; during constant velocity rotation, the resultant GIF was 2.0 g in the median plane. C: in experiment 3, the supine subjects were tilted about their z axis in 15° increments up to 75° left or right ear down. The room was always stationary. Only the extreme left and right ear down tilts are illustrated.
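The nominal 2.0 g and 60° values in experiments 1 and 2 follow directly from the rotation parameters (152°/s at a 2.5-m head radius). A quick check, with a function name of my own choosing:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def gif_resultant(omega_deg_s, radius_m):
    """Magnitude (in g units) and azimuth (deg, relative to the supine
    subject's median plane) of the gravitoinertial resultant for a head
    centered radius_m from the rotation axis. Parameters from the paper:
    152 deg/s constant velocity, 2.5 m radius."""
    omega = math.radians(omega_deg_s)
    a_cent = omega ** 2 * radius_m               # centripetal acceleration, m/s^2
    mag = math.hypot(G, a_cent) / G              # resultant of gravity + centrifugal force
    angle = math.degrees(math.atan2(a_cent, G))  # rotation away from gravity
    return mag, angle
```

`gif_resultant(152.0, 2.5)` comes out near (2.05, 60.9), consistent with the rounded 2.0 g at 60° quoted in METHODS. The same geometry applied to Graybiel and Niven's 29.2° rotation gives a magnitude of g/cos 29.2°, about 1.15 g.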

Ten subjects participated. Nine had been in experiment 1, including two of the authors. All gave informed consent to the Human Subject Committee-approved protocol. The apparatus and procedure were the same as in experiment 1 with one exception: the restrained subject was tilted 60° right ear down from the supine position toward the center of the rotating room (see Fig. 3B). In the no-rotation periods, the GIF equaled 1.0 g and was oriented 60° left of the subject's median plane; when the room was rotating at 152°/s constant velocity, the GIF equaled 2.0 g and was aligned with the subject's median plane. The average of the five prerotation midline settings for each subject was taken as his or her zero baseline.

Experiment 3: tilt of the median plane in a normal 1.0 g environment

Six subjects who had participated in the prior experiments took part. The rotating room was always stationary, so the GIF was always 1.0 g. The subject's orientation to gravity was set to 1 of 11 bed angles around the z-axis between 75° right and left at 15° increments (see Fig. 3C). Subjects made auditory settings to center a sound in the head's median plane as in the earlier experiments. The bed angles were presented in random order and each angle was repeated six times within a session. A position was held long enough for the subject to make one setting and then the bed was manually moved to a new position.

RESULTS

Experiment 1

Figure 4A shows the averaged sequential auditory midline settings for subjects exposed to rightward rotation of the GIF relative to the midsagittal plane. During acceleration, auditory settings shifted to the right relative to the 1.0 g baseline and then plateaued at constant velocity (GIF equal to 2.0 g, tilted 60° right re the head). In the six trials done at constant velocity, subjects indicated as being straight ahead auditory stimuli averaging 7.3° right of the baseline settings.
During deceleration, settings shifted back toward prerotation baseline, reaching their resting level before the room came to a stop. In the immediate postrotation period, the average auditory setting was left of the prerotation baseline, and this value was virtually unchanged 5 min later. ANOVA (SPSS MANOVA procedure) revealed significant differences [F(3,30) = 4.73, P = 0.008] among the four steady GIF periods: prerotation, constant velocity, immediate postrotation, and delayed postrotation. Pairwise contrasts indicated that auditory settings in the constant velocity phase differed significantly from each of the no-rotation conditions, but the no-rotation conditions did not differ from one another. An ANOVA was also performed to test for differences in trial-to-trial variability across the steady GIF periods. The standard deviation of each subject's settings in each GIF period was used as a measure of variability. The overall test was significant [F(3,30) = 4.02, P = 0.034]. The constant velocity condition was significantly more variable (7.19° SD) than the three no-rotation conditions collectively (5.28° SD). Settings took 6.9 s on average to complete; there was no effect of rotation on the time to make a setting.

Figure 4B shows the auditory midline settings of the subjects who were exposed to a leftward shift of the GIF during rotation. Their results were directionally opposite to those tested with rightward GIF rotation. During acceleration, auditory settings shifted leftward relative to baseline and at constant velocity rotation peaked at 6.8° leftward. During deceleration, the settings shifted back toward prerotation baseline and were at baseline by the time the room fully stopped.
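The per-subject statistics used in these analyses (the trial-to-trial SD as the variability measure here, and per-subject regression slopes of setting versus GIF tilt in experiment 3) can be sketched as follows; the function names and array shapes are my own, not the authors':

```python
import numpy as np

def per_subject_sd(settings):
    """Trial-to-trial variability measure: SD of each subject's midline
    settings (deg) within one GIF period.
    settings: array of shape (n_subjects, n_trials)."""
    return np.std(settings, axis=1, ddof=1)

def mean_slope_t(tilts_deg, settings_deg):
    """Experiment-3-style analysis: fit one regression line of auditory
    setting vs. GIF tilt angle per subject, then test the mean slope
    against 0 with a one-sample t statistic.
    settings_deg: array of shape (n_subjects, n_angles)."""
    slopes = np.array([np.polyfit(tilts_deg, s, 1)[0] for s in settings_deg])
    n = len(slopes)
    t = slopes.mean() / (slopes.std(ddof=1) / np.sqrt(n))
    return slopes.mean(), t
```

A small mean slope can still yield a large t, as in experiment 3, when the slopes are consistent across subjects (small between-subject SD).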
As with the subjects exposed to rightward displacement of the GIF, a MANOVA indicated that auditory settings varied significantly across the prerotation, constant velocity, immediate postrotation, and final postrotation periods [F(3,13) = 34.6, P = 0.004]. Only the constant velocity period settings differed significantly from the other conditions in pairwise comparisons. The trial-to-trial variability (standard deviation) was also greater in the constant velocity period compared with the other three periods.

FIG. 4. Plot of average sound settings (relative to prerotation baseline) and room angular velocity vs. time for experiment 1. At constant velocity (100-s duration), GIF magnitude was 2.0 g, with GIF direction 60° right of the median plane (A, n = 11) or 60° left of the median plane (B, n = 8).

Experiment 2

The results are plotted in Fig. 5. There was very little shift relative to prerotation baseline during acceleration; settings averaged 2.3° right of baseline during constant velocity, 0.37° right immediately postrotation, and 1.66° left in the delayed postrotation period. Analysis of variance revealed no significant difference among the four periods [F(3,27) = 1.74, P = 0.182].

FIG. 5. Plot of average sound settings, relative to baseline (n = 10), and room angular velocity vs. time in experiment 2. GIF magnitude was 2.0 g, with the GIF aligned with the median plane for 100 s at constant velocity.

Experiment 3

The results are presented in terms of angles of the GIF relative to the subject's median plane, with positive angles representing rightward displacement of the GIF in relation to the median plane. The midline settings at zero tilt angle were used as the baseline reference value. Each subject's six repeated settings at the same tilt angle were averaged, and linear regression lines were fit to the auditory settings versus GIF tilt. Statistical comparisons were made of the average slopes. Figure 6 summarizes the results. In tilted conditions, the settings shifted slightly but systematically in the direction that the GIF rotated. The average slope was only 0.05, but this was significantly different from 0 (t = 6.497, P < 0.001) because the results were very consistent from subject to subject.

FIG. 6. Plot of average sound settings, relative to baseline (n = 6), for subjects in experiment 3 tilted up to 75° left or right ear down from supine about the horizontal z axis in stationary, 1.0 g conditions. The linear regression line fitting the data is significantly different from 0 because the 95% confidence interval (CI) around the line is quite small.

DISCUSSION

Experiment 1

The observations in experiment 1 unequivocally confirm the existence of an audiogravic illusion when the GIF vector is increased in magnitude and rotated away from the median plane of the head. Sounds must be shifted in the same direction as the rotation of the GIF relative to head azimuth to be perceived in the head's median plane.

Experiment 2

The results in experiment 2 indicate that an audiogravic illusion does not occur when the GIF doubles and rotates into the median plane of the head and body. The absence of an auditory shift in this condition and the significant shifts seen in experiment 1 indicate that an increase in GIF magnitude in the sagittal plane is not sufficient to cause an audiogravic illusion but that a rotation of the GIF vector is necessary for the shift in localization to be induced.

The final experiment determined how auditory localization would be affected by the direction of the GIF vector when its magnitude was always 1 g. The findings in experiment 3 point to small but systematic changes in perceived azimuth of an acoustic stimulus when a subject is reoriented in a normal terrestrial force background. A rightward rotation of GIF relative to the median plane requires a rightward shift of auditory settings for a sound to be heard in the subject's median plane. The direction of this effect is consistent with what was observed in amplified fashion at 2.0 g in experiment 1.

General discussion

We conclude that a true audiogravic illusion exists in the form of a head-relative shift in auditory localization during exposure to a changing linear GIF resultant. The apparent direction of an auditory target shifts in the same plane but in the opposite direction to the displacement of the GIF resultant. In other words, increasing the magnitude of the GIF resultant and changing its direction relative to the head and torso induces an apparent displacement of a sound source relative to the head in the opposite direction. In their original study, Graybiel and Niven (1951) used ambient sound sources and had subjects make localization judgments relative to the apparent horizontal. They observed changes in auditory settings that corresponded to about 80% of the angular displacement of the GIF resultant. Howard (Howard 1982; Howard and Templeton 1966) argued that this shift does not represent an illusion but is a change attributable to using a new reference frame, and that what needs to be explained is why the shift is not 100%. The head-relative shifts we have observed in the present study correspond to about 20% of the shift of the GIF resultant. The Graybiel and Niven (1951) results thus reflect both a reference frame shift and a true audiogravic illusion. We have found that a shift in sound lateralization is produced if the GIF resultant rotates away from the median plane and simultaneously increases in magnitude from 1.0 to 2.0 g. There is little or no bias if the GIF rotates into alignment with the median plane during the transition from 1.0 to 2.0 g. The auditory shifts are tightly coupled to temporal changes in GIF, and they return to baseline without significant aftereffects on return to a 1 g GIF. The small change in auditory localization associated with rotating the GIF relative to the sagittal plane of the head complements and is consistent with the findings of Lewald and Ehrenstein (1998), who found that turning the head relative to the torso without changing gravitoinertial orientation induces the same direction of auditory shift.

The existence of an audiogravic illusion indicates an additional level of representation or analysis in computational neural maps subserving auditory localization. HRTF pairs encode target location in an intrinsic head-centric coordinate

system. Psychophysical mappings of HRTF information to perceived azimuth have been established empirically (Wightman and Kistler 1989). However, the relationship between physical acoustic information (HRTFs) and perceived spatial location is remapped by alterations in GIF. This means the neural computations underlying sound localization interrelate binaural auditory HRTF information about the acoustic target with vestibular, proprioceptive, and somatosensory representations of GIF direction and magnitude. The GIF influence on auditory localization may act at a level that affects sensory localization in multiple modalities. For example, a comparison of the audiogravic and oculogravic illusions suggests that alterations in GIF may have parallel effects on auditory and visual localization. The oculogravic illusion is a change in the perceived position or orientation of an object that is physically stationary in relation to an observer when the observer is exposed to a change in direction and magnitude of the GIF vector (Corriea et al. 1968; Graybiel 1952; Miller and Graybiel 1968). The audiogravic and oculogravic illusions are similar: stationary auditory and visual targets appear to move and displace in the direction opposite the rotation of a supra-1 g GIF resultant. The audiogravic illusion we have demonstrated is with respect to the head's azimuthal plane, while the oculogravic illusion has been tested primarily in the sagittal and frontal planes. Nevertheless, the similarity between the oculogravic and audiogravic illusions raises the possibility of parallel changes in visual and auditory spatial representations or of a common change altering multisensory localization.
We have, in fact, completed studies of the oculogravic illusion during changes of GIF in azimuth and find that it matches the audiogravic illusion in amplitude and timing (DiZio, Lackner, and Held, unpublished data). This implies a common mechanism subserving both illusions, one in which the assignment of spatial direction relative to the head involves signals specifying body orientation in relation to the resultant GIF vector. Parietal cortex contains representations of the necessary reference frames for implementing such a transformation (cf. Andersen et al. 1997; Colby and Duhamel 1996; Kalaska et al. 1997; Stein and Meredith 1993).

This work was supported by Air Force Office of Scientific Research Contract F.

REFERENCES

ANDERSEN RA, SNYDER LH, BRADLEY D, AND XING J. Multimodal representation of space in the posterior parietal cortex and its use in planning movements. Annu Rev Neurosci 20, 1997.
ARNOULT MD. The localization of sound during rotation of the visual environment. Am J Psychol 65: 48–58.
BLAUERT J. Spatial Hearing. Cambridge, MA: MIT Press.
CLARK B AND GRAYBIEL A. The effect of angular acceleration on sound localization: the audiogyral illusion. J Psychol 28.
COLBURN HS AND DURLACH NI. Models of binaural interaction. In: Handbook of Perception, edited by Carterette EC and Friedman MP. New York: Academic.
COLBY C AND DUHAMEL J-R. Spatial representation for action in parietal cortex. Cereb Cortex 5, 1996.
CORREIA MJ, HIXSON WC, AND NIVEN JI. On predictive equations for subjective judgments of vertical in a force field. Acta Otolaryngol 230: 3–20, 1968.
GRAYBIEL A. The oculogravic illusion. AMA Arch Ophthal 48, 1952.
GRAYBIEL A AND NIVEN JI. The effect of a change in direction of resultant force on sound localization: the audiogravic illusion. J Exp Psychol 42.
HELD R. Shift in binaural localization after prolonged exposure to atypical combinations of stimuli. Am J Psychol 68.
HOWARD IP. Visual Orientation. New York: Wiley.
HOWARD IP AND TEMPLETON WB. Human Spatial Orientation. London: Wiley.
JAY MF AND SPARKS DL. Auditory receptive fields in primate superior colliculus shift with changes in eye position. Nature 309.
KALASKA JF, SCOTT SH, CISEK P, AND SERGIO L. Cortical control of reaching movements. Curr Opin Neurobiol 7, 1997.
LACKNER JR AND SHENKER B. Proprioceptive influences on auditory and visual spatial localization. J Neurosci 5.
LESTER G AND MORANT RB. Apparent sound displacement during vestibular stimulation. Am J Psychol 83.
LEWALD J AND EHRENSTEIN WH. The effect of eye position on auditory lateralization. Exp Brain Res 108.
LEWALD J AND EHRENSTEIN WH. Influence of head-to-trunk position on sound lateralization. Exp Brain Res 121.
MIDDLEBROOKS JC, XU L, EDDINS AC, AND GREEN DM. Codes for sound source location in nontonotopic auditory cortex. J Neurophysiol 80.
MIDDLEBROOKS JC AND KNUDSEN EI. Changes in external ear position modify the spatial tuning of auditory units in the cat's superior colliculus. J Neurophysiol 57.
MILLER EF AND GRAYBIEL A. Visual horizontal perception in relation to otolith function. Am J Psychol 81, 1968.
MUNSTERBERG H AND PIERCE AH. The localization of sound. Psychol Rev 1.
STEIN BE AND MEREDITH MA. The Merging of the Senses. Cambridge, MA: MIT Press, 1993.
STRICANNE B, ANDERSEN RA, AND MAZZONI P. Eye-centered, head-centered and intermediate coding of remembered sound locations in area LIP. J Neurophysiol 76.
TEAS DC. A Virtual Acoustic Orientation Instrument. Brooks AFB, TX: Armstrong Laboratory Report AL-TR.
WALLACH H. The role of head movements and vestibular and visual cues in sound localization. J Exp Psychol 27.
WENZEL EM, ARRUDA M, KISTLER DJ, AND WIGHTMAN FL. Localization using nonindividualized head-related transfer functions. J Acoust Soc Am 94.
WIGHTMAN FL AND KISTLER DJ. A new look at auditory space perception. In: Psychophysical, Physiological and Behavioral Studies in Hearing, edited by van den Brink G and Bilsen FA. Delft: University Press, 1980.
WIGHTMAN FL AND KISTLER DJ. Headphone simulation of free-field listening. II. Psychophysical validation. J Acoust Soc Am 85, 1989.
YOST WA AND GOUREVITCH G. Directional Hearing. New York: Springer-Verlag, 1987.


More information

Perception of Self-motion and Presence in Auditory Virtual Environments

Perception of Self-motion and Presence in Auditory Virtual Environments Perception of Self-motion and Presence in Auditory Virtual Environments Pontus Larsson 1, Daniel Västfjäll 1,2, Mendel Kleiner 1,3 1 Department of Applied Acoustics, Chalmers University of Technology,

More information

Convention Paper 9870 Presented at the 143 rd Convention 2017 October 18 21, New York, NY, USA

Convention Paper 9870 Presented at the 143 rd Convention 2017 October 18 21, New York, NY, USA Audio Engineering Society Convention Paper 987 Presented at the 143 rd Convention 217 October 18 21, New York, NY, USA This convention paper was selected based on a submitted abstract and 7-word precis

More information

Perception of the Spatial Vertical During Centrifugation and Static Tilt

Perception of the Spatial Vertical During Centrifugation and Static Tilt Perception of the Spatial Vertical During Centrifugation and Static Tilt Authors Gilles Clément, Alain Berthoz, Bernard Cohen, Steven Moore, Ian Curthoys, Mingjia Dai, Izumi Koizuka, Takeshi Kubo, Theodore

More information

WAVELET-BASED SPECTRAL SMOOTHING FOR HEAD-RELATED TRANSFER FUNCTION FILTER DESIGN

WAVELET-BASED SPECTRAL SMOOTHING FOR HEAD-RELATED TRANSFER FUNCTION FILTER DESIGN WAVELET-BASE SPECTRAL SMOOTHING FOR HEA-RELATE TRANSFER FUNCTION FILTER ESIGN HUSEYIN HACIHABIBOGLU, BANU GUNEL, AN FIONN MURTAGH Sonic Arts Research Centre (SARC), Queen s University Belfast, Belfast,

More information

Speech, Hearing and Language: work in progress. Volume 12

Speech, Hearing and Language: work in progress. Volume 12 Speech, Hearing and Language: work in progress Volume 12 2 Construction of a rotary vibrator and its application in human tactile communication Abbas HAYDARI and Stuart ROSEN Department of Phonetics and

More information

Perception of pitch. Definitions. Why is pitch important? BSc Audiology/MSc SHS Psychoacoustics wk 4: 7 Feb A. Faulkner.

Perception of pitch. Definitions. Why is pitch important? BSc Audiology/MSc SHS Psychoacoustics wk 4: 7 Feb A. Faulkner. Perception of pitch BSc Audiology/MSc SHS Psychoacoustics wk 4: 7 Feb 2008. A. Faulkner. See Moore, BCJ Introduction to the Psychology of Hearing, Chapter 5. Or Plack CJ The Sense of Hearing Lawrence Erlbaum,

More information

Appendix E. Gulf Air Flight GF-072 Perceptual Study 23 AUGUST 2000 Gulf Air Airbus A (A40-EK) NIGHT LANDING

Appendix E. Gulf Air Flight GF-072 Perceptual Study 23 AUGUST 2000 Gulf Air Airbus A (A40-EK) NIGHT LANDING Appendix E E1 A320 (A40-EK) Accident Investigation Appendix E Gulf Air Flight GF-072 Perceptual Study 23 AUGUST 2000 Gulf Air Airbus A320-212 (A40-EK) NIGHT LANDING Naval Aerospace Medical Research Laboratory

More information

Robotic Sound Localization. the time we don t even notice when we orient ourselves towards a speaker. Sound

Robotic Sound Localization. the time we don t even notice when we orient ourselves towards a speaker. Sound Robotic Sound Localization Background Using only auditory cues, humans can easily locate the source of a sound. Most of the time we don t even notice when we orient ourselves towards a speaker. Sound localization

More information

19 th INTERNATIONAL CONGRESS ON ACOUSTICS MADRID, 2-7 SEPTEMBER 2007

19 th INTERNATIONAL CONGRESS ON ACOUSTICS MADRID, 2-7 SEPTEMBER 2007 19 th INTERNATIONAL CONGRESS ON ACOUSTICS MADRID, 2-7 SEPTEMBER 2007 TEMPORAL ORDER DISCRIMINATION BY A BOTTLENOSE DOLPHIN IS NOT AFFECTED BY STIMULUS FREQUENCY SPECTRUM VARIATION. PACS: 43.80. Lb Zaslavski

More information

THE TEMPORAL and spectral structure of a sound signal

THE TEMPORAL and spectral structure of a sound signal IEEE TRANSACTIONS ON SPEECH AND AUDIO PROCESSING, VOL. 13, NO. 1, JANUARY 2005 105 Localization of Virtual Sources in Multichannel Audio Reproduction Ville Pulkki and Toni Hirvonen Abstract The localization

More information

19 th INTERNATIONAL CONGRESS ON ACOUSTICS MADRID, 2-7 SEPTEMBER 2007 AUDITORY EVOKED MAGNETIC FIELDS AND LOUDNESS IN RELATION TO BANDPASS NOISES

19 th INTERNATIONAL CONGRESS ON ACOUSTICS MADRID, 2-7 SEPTEMBER 2007 AUDITORY EVOKED MAGNETIC FIELDS AND LOUDNESS IN RELATION TO BANDPASS NOISES 19 th INTERNATIONAL CONGRESS ON ACOUSTICS MADRID, 2-7 SEPTEMBER 2007 AUDITORY EVOKED MAGNETIC FIELDS AND LOUDNESS IN RELATION TO BANDPASS NOISES PACS: 43.64.Ri Yoshiharu Soeta; Seiji Nakagawa 1 National

More information

Introduction. 1.1 Surround sound

Introduction. 1.1 Surround sound Introduction 1 This chapter introduces the project. First a brief description of surround sound is presented. A problem statement is defined which leads to the goal of the project. Finally the scope of

More information

Self-motion perception from expanding and contracting optical flows overlapped with binocular disparity

Self-motion perception from expanding and contracting optical flows overlapped with binocular disparity Vision Research 45 (25) 397 42 Rapid Communication Self-motion perception from expanding and contracting optical flows overlapped with binocular disparity Hiroyuki Ito *, Ikuko Shibata Department of Visual

More information

Vection in depth during consistent and inconsistent multisensory stimulation

Vection in depth during consistent and inconsistent multisensory stimulation University of Wollongong Research Online Faculty of Health and Behavioural Sciences - Papers (Archive) Faculty of Science, Medicine and Health 2011 Vection in depth during consistent and inconsistent multisensory

More information

Spatial Audio & The Vestibular System!

Spatial Audio & The Vestibular System! ! Spatial Audio & The Vestibular System! Gordon Wetzstein! Stanford University! EE 267 Virtual Reality! Lecture 13! stanford.edu/class/ee267/!! Updates! lab this Friday will be released as a video! TAs

More information