We are interested in how people understand speech in challenging situations where the speech signal is degraded. The degradation may come from the environment, such as a noisy cocktail party or a large, reverberant cathedral, or it may be intrinsic to the listener, as in people who need a hearing aid or cochlear implant. This problem is closely related to how people localize sound sources. We perform psychoacoustical tests on people with normal hearing, hearing impairments, or cochlear implants, and then try to understand the results with neural models. Much of my recent work concerns people with cochlear implants, specifically those with two (bilateral) implants. Bilateral cochlear-implant users typically struggle to localize sounds and to understand speech in noise. The signal delivered by a cochlear implant is highly degraded in spectral content, and there are myriad challenges in determining what information is actually conveyed to an implant user. Moreover, the signal in one ear is likely to differ substantially from that in the other ear. Yet bilateral implant users somehow learn to integrate this conflicting information.

Matthew Goupell
Hearing and Speech Sciences, 0241 LeFrak Hall
BSOS
Email: goupell [at] umd.edu