Navigating realistic acoustic scenes

Seminar: Applied Mathematics
Event time: Tuesday, March 29, 2016, 12:15pm to 1:15pm
Location: AKW 000
Speaker: Shihab Shamma
Speaker affiliation: University of Maryland
Event description:

Humans can attend to one of multiple sounds in an acoustic scene (or objects in a visual scene), segregate it, and follow it selectively over time. The neural underpinnings of this perceptual feat are the subject of extensive investigation. I will review the fundamentals of sound representation in the auditory cortex, and then explain how source segregation depends primarily on rapid plasticity in auditory cortical responses, which can track the temporally coherent responses induced by simultaneous sources. Finally, I will review recent neurophysiological results in support of these ideas and discuss algorithms, inspired by common sensory cortical mechanisms, that can segregate sources with no prior information or training.
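
To illustrate the temporal-coherence idea referred to above, the sketch below groups time-frequency channels whose envelopes fluctuate together. It is a minimal, hypothetical example (an STFT front end, a correlation matrix, and an eigenvector-based grouping rule), not the speaker's algorithm; the window length, grouping threshold, and synthetic test signals are all assumptions made for illustration.

```python
# Minimal sketch of temporal-coherence-based channel grouping.
# Illustrative only: parameters and the grouping rule are assumptions,
# not the algorithm discussed in the talk.
import numpy as np
from scipy.signal import stft

def coherence_mask(x, fs, nperseg=512):
    """Group spectrogram channels by the temporal coherence of their envelopes."""
    # Time-frequency decomposition (a stand-in for a cochlear/cortical model).
    _, _, Z = stft(x, fs=fs, nperseg=nperseg)
    env = np.abs(Z)  # channel envelopes, shape (freq, time)

    # Pairwise temporal coherence: correlate envelope fluctuations across channels.
    env_z = (env - env.mean(axis=1, keepdims=True)) / (env.std(axis=1, keepdims=True) + 1e-9)
    C = env_z @ env_z.T / env.shape[1]  # correlation (coherence) matrix

    # Channels driven by the same source fluctuate coherently; the leading
    # eigenvector of C picks out one such coherent group of channels.
    _, V = np.linalg.eigh(C)
    group = np.abs(V[:, -1])
    mask = (group > group.mean()).astype(float)  # binary mask over frequency channels
    return mask, C

if __name__ == "__main__":
    fs = 16000
    t = np.arange(fs * 2) / fs
    # Two synthetic "sources": a slowly modulated tone and broadband noise.
    src_a = np.sin(2 * np.pi * 440 * t) * (1 + np.sin(2 * np.pi * 4 * t))
    src_b = 0.5 * np.random.randn(t.size)
    mask, C = coherence_mask(src_a + src_b, fs)
    print("channels assigned to the coherent group:", int(mask.sum()))
```

The point of the sketch is only that coherence of envelope fluctuations, computed with no prior training, is enough to bind channels belonging to one source; a full system would replace the STFT with a cortical-style multiresolution representation and update the grouping adaptively over time.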