In 1982, after Melbourne man Graham Carrick had experienced 17 years of silence, the device implanted in his inner ear was switched on and sound miraculously flooded in. Carrick was the first, and to date roughly 200,000 people worldwide have had the same life-changing experience; cochlear implant surgery has become a relatively routine procedure to restore the sense of sound to the profoundly deaf.
The part of the story not often told is that after the initial astonishment wears off, many cochlear implant users find they still have trouble inferring a speaker’s emotion based on tone of voice, something that comes naturally to people with normal hearing.
While it may seem inconsequential, a study recently published in the Journal of the Acoustical Society of America by Arizona State University's Xin Luo found that the ability to detect emotional nuance in speech is strongly correlated with adult cochlear implant users' overall quality of life.
“Vocal emotional recognition is so important for social communication,” said Luo, an assistant professor in the College of Health Solutions.
Not being able to tell if someone is joking, angry or sad is not only frustrating; over time, it can lead to feelings of isolation and even depression.
Although deafness can have a variety of causes, the underlying issue is damage to the inner ear. Cochlear implants use a surgically implanted array of electrodes to bypass the damage and directly stimulate surviving auditory neurons, allowing patients to hear.
Currently, the gold standard for testing the efficacy of the implant is sentence recognition, focusing only on whether the user can hear what words are being spoken — not necessarily how they are spoken. Luo and his team decided to focus on the how.
“The reason we are interested in emotion recognition is that we know the cochlear implant can restore hearing sensation to profoundly deaf people but the device is limited in its spectral resolution and temporal resolution,” said Luo, whose background is in electrical engineering, studying how speech can be produced and recognized by computers.
The limitations of the cochlear implant mean that implant users don’t have the same ability as people with normal hearing to differentiate tone of voice.
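A common way researchers convey this limitation, though not described in the article itself, is noise vocoding: reducing speech to the amplitude envelopes of a handful of frequency bands, roughly what an implant's electrode array delivers. The Python sketch below is a minimal illustration under assumed parameters; the input file speech.wav, the eight channels and the filter settings are hypothetical choices, not values from Luo's study.

```python
import numpy as np
from scipy.io import wavfile
from scipy.signal import butter, sosfilt

def noise_vocode(signal, fs, n_channels=8, lo=100.0, hi=7000.0, env_cutoff=160.0):
    """Crude cochlear-implant simulation: split speech into a few frequency
    bands, keep only each band's slow amplitude envelope, and use it to
    modulate band-limited noise. Fine pitch structure is discarded."""
    # Log-spaced band edges, roughly mimicking cochlear frequency spacing.
    edges = np.logspace(np.log10(lo), np.log10(hi), n_channels + 1)
    env_sos = butter(2, env_cutoff, btype="lowpass", fs=fs, output="sos")
    out = np.zeros(len(signal))
    for k in range(n_channels):
        band_sos = butter(4, [edges[k], edges[k + 1]],
                          btype="bandpass", fs=fs, output="sos")
        band = sosfilt(band_sos, signal)
        # Envelope: rectify, then low-pass filter.
        env = np.maximum(sosfilt(env_sos, np.abs(band)), 0.0)
        noise = sosfilt(band_sos, np.random.randn(len(signal)))
        out += env * noise
    return out / (np.max(np.abs(out)) + 1e-9)

# Assumes a mono 16-bit recording sampled above 14 kHz.
fs, speech = wavfile.read("speech.wav")
vocoded = noise_vocode(speech.astype(float), fs)
wavfile.write("vocoded.wav", fs, (vocoded * 32767).astype(np.int16))
```

Listening to the output gives normal-hearing listeners a rough sense of why tone-of-voice cues, which depend heavily on pitch, are so much harder to recover through an implant.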
To find out how that affects quality of life, Luo and his team tested 12 adult cochlear implant users' ability to detect emotional nuance in speech at his Tempe campus lab during spring 2018. In soundproof booths, implant users listened to recordings of a male and a female actor speaking semantically neutral sentences, such as "It's snowing outside," in emotional tones ranging from happy to sad to anxious, for a total of 100 utterances. After each sentence, the listener was asked what emotion the speaker was conveying.
Implant users often confused happy and angry, something Luo said could be because both emotions tend to raise a speaker's volume. They were usually good at detecting sadness, however, which could be due to its slower speaking pace and lower, more even pitch.
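Those cues, loudness, tempo and pitch, are measurable in a recording. As a rough illustration (not the analysis from the study), the sketch below estimates per-frame intensity and an autocorrelation-based pitch track; file names like happy_snowing.wav and the frame settings are hypothetical.

```python
import numpy as np
from scipy.io import wavfile

def frame_features(path, frame_ms=40, f0_min=75, f0_max=400):
    """Per-frame RMS intensity and a crude autocorrelation pitch estimate.
    Assumes a mono 16-bit WAV file."""
    fs, x = wavfile.read(path)
    x = x.astype(float) / 32768.0
    n = int(fs * frame_ms / 1000)
    rms, f0 = [], []
    for start in range(0, len(x) - n, n):
        frame = x[start:start + n]
        rms.append(np.sqrt(np.mean(frame ** 2)))
        # Autocorrelation: pick the strongest lag in the plausible F0 range.
        ac = np.correlate(frame, frame, mode="full")[n - 1:]
        lo, hi = int(fs / f0_max), int(fs / f0_min)
        lag = lo + np.argmax(ac[lo:hi])
        # Simple voicing check: treat weak peaks as unvoiced frames.
        f0.append(fs / lag if ac[lag] > 0.3 * ac[0] else np.nan)
    return np.array(rms), np.array(f0)

# Hypothetical comparison: a "happy" and a "sad" reading of the same sentence.
for label in ("happy", "sad"):
    rms, f0 = frame_features(f"{label}_snowing.wav")
    print(f"{label}: mean RMS = {rms.mean():.4f}, mean F0 = {np.nanmean(f0):.1f} Hz")
```

On such a comparison, the sad reading would typically show lower intensity and a lower, flatter pitch track, exactly the cues that survive an implant's limited resolution better than fine pitch differences do.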
Implant users also filled out a survey about their quality of life, with questions about social interaction and self-esteem, among other topics. Those who correctly identified more emotional utterances also reported better quality of life.
“The correlation was quite strong despite the small number of participants,” Luo said. “So it’s definitely an area that needs more research so that we can improve emotional recognition for implant users.”
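The article doesn't give the statistic itself, but the relationship Luo describes, between each participant's emotion-recognition score and a quality-of-life score, is the kind a simple Pearson correlation captures. Below is a minimal sketch using synthetic stand-in data; the real values are in the Journal of the Acoustical Society of America paper.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
# Synthetic stand-in data for 12 participants: emotion-recognition accuracy
# and a quality-of-life score, generated with a built-in positive relationship.
accuracy = rng.uniform(0.3, 0.8, size=12)
qol = 0.8 * accuracy + rng.normal(0, 0.05, size=12)

r, p = pearsonr(accuracy, qol)
print(f"Pearson r = {r:.2f}, p = {p:.4f}")
```

With only 12 participants, a correlation must be quite strong to reach statistical significance at all, which is why Luo flags the result as notable despite the small sample.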
The first step is a simple one: amend efficacy tests of cochlear implants to include emotional recognition alongside sentence recognition. While being able to hear and understand speech is the central goal, Luo and his team found that sentence recognition alone does not correlate with implant users' perceived quality of life.
“Vocal emotional recognition has a much stronger predictive power for the subjective quality of life for implant users,” he said.
Top photo: ASU College of Health Solutions Assistant Professor Xin Luo in his cochlear implant research lab on the Tempe campus. Photo by Charlie Leight/ASU Now