Researchers in the United States and France report that the human visual system can make use of surrounding sounds to see more clearly, and that the auditory system can process external light to hear more distinctly.
The researchers trained monkeys to locate a flash of light on a screen. When the light was bright, the monkeys found the flashing region relatively easily; when the light was dim, they took much longer to complete the task. But when the dim flash was accompanied by a brief sound, the monkeys could quickly pinpoint the flashing region.
The researchers studied 49 neurons responsible for early visual processing and found that these cells responded faster when they "heard" a sound, much as they did when they "saw" a bright light. This, the researchers say, indicates a direct link between the brain's visual and auditory systems.
French researcher Pascal Barone said that the sensory cells of humans and some animals can selectively process different kinds of perceptual input. Blind people cannot use their visual system to see, but they can use it to "hear." This is why blind people often have sharper hearing, and deaf people often have better eyesight.
The study overturns the traditional view. Researchers had previously assumed that the visual system processes images and the auditory system processes sounds, with the two systems kept strictly separate, and that the brain then combines image and sound to give us a unified audiovisual experience. (Bioon.com)
Original source recommended by Bioon.com:
BMC Neuroscience 2008, 9:79 doi:10.1186/1471-2202-9-79
Visuo-auditory interactions in the primary visual cortex of the behaving monkey. Electrophysiological evidence.
Ye Wang, Simona Celebrini, Yves Trotter and Pascal Barone
Background
Visual, tactile and auditory information is processed from the periphery to the cortical level through separate channels that target primary sensory cortices, from which it is further distributed to functionally specialized areas. Multisensory integration is classically assigned to higher hierarchical cortical areas, but there is growing electrophysiological evidence in man and monkey of multimodal interactions in areas thought to be unimodal, interactions that can occur at very short latencies. Such fast timing of multisensory interactions rules out the possibility of an origin in the polymodal areas mediated through back projections, but is rather in favor of heteromodal connections such as the direct projections observed in the monkey, from auditory areas directly to the primary visual cortex V1. Based on the existence of such AI to V1 projections, we looked for modulation of neuronal visual responses in V1 by an auditory stimulus in the awake behaving monkey.
Results
Behavioral and electrophysiological data were obtained from two behaving monkeys. One monkey was trained to maintain passive central fixation while a peripheral visual (V) or visuo-auditory (AV) stimulus was presented. From a population of 45 V1 neurons, there was no difference in the mean latencies or strength of visual responses when comparing V and AV conditions. In a second, active task, the monkey was required to orient his gaze toward the visual or visuo-auditory stimulus. From a population of 49 cells recorded during this saccadic task, we observed a significant reduction in response latencies in the visuo-auditory condition compared to the visual condition (mean 61.0 vs. 64.5 ms), but only when the visual stimulus was at midlevel contrast. No effect was observed at high contrast.
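The latency comparison described above can be illustrated with a small simulation. This is a hypothetical sketch, not the authors' analysis code: the per-neuron latencies are randomly generated to roughly match the reported condition means (V: 64.5 ms, AV: 61.0 ms) across 49 cells, and a paired t statistic tests whether the sound shortens the visual response latency.

```python
import numpy as np

rng = np.random.default_rng(0)
n_cells = 49  # number of V1 cells recorded in the saccadic task

# Simulated (hypothetical) per-neuron response latencies in ms.
# V condition centered on the reported mean of 64.5 ms.
latency_v = rng.normal(loc=64.5, scale=5.0, size=n_cells)
# AV condition: each cell's latency shortened by a few ms on average,
# so the AV mean falls near the reported 61.0 ms.
latency_av = latency_v - rng.normal(loc=3.5, scale=2.0, size=n_cells)

# Paired t statistic on the per-cell latency differences.
diff = latency_v - latency_av
t_stat = diff.mean() / (diff.std(ddof=1) / np.sqrt(n_cells))

print(f"mean V latency:  {latency_v.mean():.1f} ms")
print(f"mean AV latency: {latency_av.mean():.1f} ms")
print(f"paired t (df={n_cells - 1}): {t_stat:.2f}")
```

A paired design is the natural choice here because each neuron is its own control: the same cell is measured in both the V and AV conditions, so between-cell variability in baseline latency cancels out of the comparison.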
Conclusion
Our data show that single neurons from a primary sensory cortex such as V1 can integrate sensory information of a different modality, a result that argues against a strict hierarchical model of multisensory integration. Multisensory integration in V1 is, in our experiment, expressed by a significant reduction in visual response latencies specifically in suboptimal conditions and depending on the task demand. This suggests that neuronal mechanisms of multisensory integration are specific and adapted to the perceptual features of behavior.