Lateralization of nonspeech audio-visual stimulus combinations.

Research output: Contribution to journal › Article › peer-review

Abstract

The ability to lateralize stimuli was measured in eight normally hearing subjects. In experiment 1, auditory or visual stimuli were presented. Subjects responded with an auditory or visual pointer in conditions where the stimulus and response modalities were the same (uni‐modal) or different (cross‐modal). A linear relationship was found between the position of the target stimuli and the perceived lateral position, establishing the correspondence between auditorily and visually presented positions, consistent with Yost [J. Acoust. Soc. Am. 70, 397–409 (1981)]. Mean judgments of lateral position were independent of stimulus or response modality. In experiment 2, subjects were presented with bi‐modal audio‐visual stimuli whose modal components were spatially and temporally coincident, and subjects responded with an auditory pointer. Mean judgments of position were similar to those in experiment 1, but standard deviations were significantly smaller for the bi‐modal stimuli than for the uni‐modal stimuli. Experiments 3 and 4 involved manipulations of the spatial or temporal relationship between the modal components of bi‐modal stimuli. Whereas the relative importance of the visual modality was confirmed [Colavita, Percept. Psychophys. 15(2), 409–412 (1974)], the results of both experiments indicated that perception of the location of an audio‐visual stimulus is influenced by information conveyed in both modalities. [Work supported by UK BBSRC.]
Original language: English
Article number: 2597
Journal: Journal of the Acoustical Society of America
Volume: 99
Issue number: 4
Publication status: Published - 01 Jan 1996
