Audio-Visual Multisensory Integration in Superior Parietal Lobule Revealed by Human Intracranial Recordings


Articles in Press. J Neurophysiol (May 10, 2006). doi:10.1152/jn.00285.2006

Sophie Molholm 1,5,±, Pejman Sehatpour 1, Ashesh D. Mehta 2, Marina Shpaner 1,5, Manuel Gomez-Ramirez 1,5, Stephanie Ortigue 4, Jonathan P. Dyke 3, Theodore H. Schwartz 2, & John J. Foxe 1,5,±

1 The Cognitive Neurophysiology Laboratory, Program in Cognitive Neuroscience and Schizophrenia, Nathan S. Kline Institute for Psychiatric Research, 140 Old Orangeburg Road, Orangeburg, NY 10962

2 Department of Neurological Surgery and 3 Department of Radiology, Weill Cornell Medical College, New York Presbyterian Hospital, 525 East 68th St., New York, NY 10021

4 Dartmouth Functional Brain Imaging Center, Center for Cognitive Neuroscience, Dartmouth College, 6162 Moore Hall, Hanover, NH 03755

5 Program in Cognitive Neuroscience, Department of Psychology, The City College of the City University of New York, North Academic Complex, 138th St. & Convent Avenue, New York, NY 10031

± Correspondence to SM ([email protected]) or JJF ([email protected]).

Running head: Multisensory Integration in Human SPL

Copyright © 2006 by the American Physiological Society.


ABSTRACT

Intracranial recordings from three human subjects provide the first direct electrophysiological evidence for audio-visual multisensory processing in the human Superior Parietal Lobule (SPL). Auditory and visual sensory inputs project to the same highly localized region of the parietal cortex, with auditory inputs arriving considerably earlier (30 ms) than visual inputs (75 ms). Multisensory integration processes in this region were assessed by comparing the response to simultaneous audio-visual stimulation with the algebraic sum of the responses to the constituent auditory and visual unisensory stimulus conditions. Significant integration effects were seen with almost identical morphology across the three subjects, beginning between 120 and 160 ms. These results are discussed in the context of SPL's role in supramodal spatial attention and sensory-motor transformations.


INTRODUCTION

The bulk of our knowledge regarding multisensory processing in the parietal cortex comes from intracranial recordings in animals (Andersen et al. 1997; Barth et al. 1995; Brett-Green et al. 2004; Di et al. 1994; Cohen et al. 2004; Mazzoni et al. 1996; Schlack et al. 2005; Wallace et al. 1993; 2004). Single-unit recordings in non-human primates, the species with the greatest anatomical correspondence to humans, have revealed multisensory neurons in the intraparietal sulcus (IPS) that are responsive to combinations of visual, auditory, and tactile stimuli (Andersen et al. 1997; Cohen et al. 2004; Mazzoni et al. 1996; Schlack et al. 2005). However, the homologies between primate and human multisensory parietal regions remain to be fully established (Astafiev et al. 2003; Sereno and Tootell 2005).

Human functional imaging studies have shown that multiple sensory inputs are indeed co-localized to regions of the parietal lobe, including the IPS and the superior parietal lobule (SPL) (Bremmer et al. 2001; Bushara et al. 1999; Calvert et al. 2001; Lewis et al. 2000; 2005; Macaluso and Driver 2001). A subset of these studies also shows nonlinear interactions of multisensory inputs, suggesting that this information is integrated (Calvert et al. 2001; Lewis et al. 2000; Miller and D'Esposito 2005). That is, these studies have shown that the response to a bisensory stimulus differs from the sum of the responses to its unisensory constituents (so-called super- or sub-additivity; see e.g. Stanford et al. 2005). While hemodynamic imaging has provided excellent spatial localization of multisensory processing in humans, the temporal resolution of this method precludes the study of dynamic information processing, where meaningful distinctions are seen on the order of tens and hundreds of milliseconds. Hence, it is not possible to resolve whether this multisensory processing represents direct sensory-perceptual level interactions, or whether it reflects later cognitive processes (Foxe and Schroeder 2005; Schroeder and Foxe 2005). This lack of timing information may be the reason that imaging data can lend themselves to alternate and equally plausible interpretations. For instance, Ojanen and colleagues (2005) attribute SPL activation for conflicting, compared to matching, auditory-visual speech to increased attentional processing, a function more often associated with superior parietal regions than multisensory processing.

Here, we took advantage of the excellent spatial and temporal resolution provided by intracranial electrical recordings in humans to directly investigate multisensory processing in parietal cortex. Using a simple reaction-time task (Molholm et al. 2002) in which subjects responded to visual and auditory stimuli presented simultaneously or alone, we identified a highly localized region of parietal cortex, in the region of the lateral superior parietal lobule, that responded to both auditory and visual stimulation. Auditory and visual inputs to this region occurred early in time (30 ms and 75 ms, respectively).
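The additive-model comparison at the heart of this approach (testing whether the audio-visual response departs from the algebraic sum of the unisensory responses) can be sketched in a few lines. The following is an illustrative reconstruction with synthetic data, not the authors' analysis code: the waveform shapes, trial counts, noise level, and the detection threshold are all assumptions, and the real analysis used running statistical tests rather than a fixed noise criterion.

```python
# Illustrative sketch of the additive-model test for multisensory
# integration: compare the combined audio-visual (AV) response against
# the algebraic sum of auditory-alone (A) and visual-alone (V) responses.
# All latencies, amplitudes, and trial counts below are synthetic.
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(400)  # time in ms post-stimulus, 1 kHz sampling assumed

def trials(components, n=100):
    """Trial matrix: sum of Gaussian deflections (onset_ms, amp) plus noise."""
    wave = sum(amp * np.exp(-((t - on) ** 2) / (2 * 20.0 ** 2))
               for on, amp in components)
    return wave + rng.normal(0, 0.5, size=(n, t.size))

A = trials([(30, 3.0)])   # auditory input arriving ~30 ms
V = trials([(75, 2.0)])   # visual input arriving ~75 ms
# AV trials carry both unisensory components plus a modeled sub-additive
# interaction centered at 140 ms (within the reported 120-160 ms window)
AV = trials([(30, 3.0), (75, 2.0), (140, -1.0)])

# AV minus (A + V): zero in expectation wherever processing is additive
diff = AV.mean(axis=0) - (A.mean(axis=0) + V.mean(axis=0))

# A simple noise criterion stands in for the running t-tests used in
# practice: departures beyond it count as integration effects.
threshold = 3 * diff[:100].std()
effect_latency = int(np.abs(diff).argmax())
print(effect_latency)  # peaks near the modeled 140 ms interaction
```

The key design point is that any purely additive convergence of the two inputs cancels in the difference wave, so only genuine nonlinear (super- or sub-additive) interactions survive the subtraction.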