Audio-visual integration for objects, location and low-level dynamic stimuli:
novel insights from studying sensory substitution and topographical mapping

Results based on use of The vOICe were presented on Wednesday, July 16, 2008, at IMRF 2008, the 9th Annual Meeting of the International Multisensory Research Forum, held July 16-19, 2008, in Hamburg, Germany.

Authors

Amir Amedi 1, William Stern 2, Lotfi Merabet 2, Ella Striem 1, Uri Hertz 1, Peter Meijer 3, and Alvaro Pascual-Leone 2.

1 Department of Physiology and Program of Cognitive Science, Hebrew University, Jerusalem, Israel.
2 Berenson-Allen Center for Noninvasive Brain Stimulation, Department of Neurology, Beth Israel Deaconess Medical Center, Harvard Medical School, Boston, USA.
3 Developer of The vOICe, Eindhoven, The Netherlands.

Audio-visual integration for objects, location and low-level dynamic stimuli:
novel insights from studying sensory substitution and topographical mapping

Abstract

The talk will present fMRI and behavioral experiments on auditory-visual integration in humans. It will focus on integration in sighted individuals, but also in a sight-restoration setting, looking into the effects of learning, brain development and brain plasticity. New findings will be presented regarding the nature of sensory representations for dynamic stimuli ranging from pure tones to complex, natural object sounds. I will highlight the use of sensory substitution devices (SSDs) in the context of blindness. In SSDs, visual information captured by an artificial receptor is delivered to the brain as non-visual sensory information. Using a visual-to-auditory SSD called "The vOICe", we find that blind users achieve successful performance on object recognition tasks, with specific recruitment of ventral and dorsal 'visual' structures. Comparable recruitment was also observed in sighted participants learning to use this device, but not in sighted participants learning arbitrary associations between sounds and object identity. Using phase-locked Fourier techniques, we also find an array of topographic maps that can serve as a basis for such audio-visual integration. Finally, these results suggest that "The vOICe" can be useful in blind individuals' daily activities, and that it also has potential to 'guide' the visual cortex in interpreting visual information arriving from a prosthesis.
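To make the SSD principle concrete, the following is a minimal sketch of the kind of visual-to-auditory mapping The vOICe is commonly described as using: the image is scanned column by column from left to right, vertical position maps to pitch, and pixel brightness maps to loudness. The scan time, frequency range, and resolution chosen here are illustrative assumptions, not the device's actual settings.

    import numpy as np

    def image_to_sound(image, scan_time=1.0, sample_rate=22050,
                       f_min=500.0, f_max=5000.0):
        """Map a 2-D grayscale image to a mono waveform: columns are scanned
        left to right over scan_time seconds, each row gets its own sinusoid
        (top row = highest pitch), and brightness sets that sinusoid's loudness."""
        n_rows, n_cols = image.shape
        samples_per_col = int(scan_time * sample_rate / n_cols)
        t = np.arange(samples_per_col) / sample_rate
        freqs = np.linspace(f_max, f_min, n_rows)          # top of image -> high pitch
        chunks = []
        for col in range(n_cols):
            brightness = image[:, col].astype(float)        # loudness per row
            tones = np.sin(2 * np.pi * freqs[:, None] * t)  # one sinusoid per row
            chunks.append((brightness[:, None] * tones).sum(axis=0))
        audio = np.concatenate(chunks)
        return audio / (np.abs(audio).max() + 1e-9)         # normalize to [-1, 1]

    # Example: a bright diagonal from bottom-left to top-right yields a rising sweep.
    img = np.zeros((64, 64))
    np.fill_diagonal(np.flipud(img), 1.0)
    waveform = image_to_sound(img)

The "phase-locked Fourier techniques" mentioned in the abstract refer to phase-encoded (traveling-wave) mapping analysis. Below is a generic sketch for a single voxel, assuming a periodic stimulus (for example, a tone sweeping through frequencies) that repeats an integer number of cycles during the run; the function name and the coherence measure are illustrative, not the authors' actual analysis pipeline. The response phase indicates which point in the stimulus cycle the voxel prefers, and plotting phase across the cortical surface reveals the topographic map.

    import numpy as np

    def phase_map_voxel(timeseries, n_cycles):
        """Return the response phase and coherence of one voxel's time course
        at the stimulation frequency (n_cycles repetitions per run)."""
        ts = np.asarray(timeseries, dtype=float)
        ts = ts - ts.mean()
        spectrum = np.fft.rfft(ts)
        stim = spectrum[n_cycles]                   # bin at the stimulation frequency
        phase = np.angle(stim)                      # preferred point in the stimulus cycle
        coherence = np.abs(stim) / np.sqrt((np.abs(spectrum[1:]) ** 2).sum())
        return phase, coherence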


Related is the June 2007 Nature Neuroscience publication titled "Shape conveyed by visual-to-auditory sensory substitution activates the lateral occipital complex". Also related are the conference presentations at IMRF 2007, IMRF 2005 and HBM 2006.

Note: The vOICe technology is being explored and developed under the Open Innovation paradigm together with R&D partners around the world.

Copyright © 1996 - 2024 Peter B.L. Meijer