Tuning in to emotion

ASU professor finds correlation between cochlear implant users' vocal emotional recognition, quality of life

January 15, 2019

In 1982, after Melbourne man Graham Carrick had experienced 17 years of silence, the device implanted in his inner ear was switched on and sound miraculously flooded in. Carrick was the first, and to date roughly 200,000 people worldwide have had the same life-changing experience; cochlear implant surgery has become a relatively routine procedure to restore the sense of sound to the profoundly deaf.

The part of the story not often told is that after the initial astonishment wears off, many cochlear implant users find they still have trouble inferring a speaker’s emotion based on tone of voice, something that comes naturally to people with normal hearing.

While it may seem inconsequential, in a study recently published in the Journal of the Acoustical Society of America, Arizona State University's Xin Luo found that the ability to detect emotional nuance in speech is actually strongly correlated to adult cochlear implant users’ overall quality of life.

“Vocal emotional recognition is so important for social communication,” said Luo, an assistant professor in the College of Health Solutions.

Not being able to tell if someone is joking or angry or sad is not only frustrating but over time, it can lead to feelings of isolation and even depression.

Although deafness can be caused by a variety of things, the underlying issue is damage to the inner ear. Cochlear implants use a surgically implanted array of electrodes to bypass the damage and directly stimulate surviving auditory neurons, allowing patients to hear.

Currently, the gold standard for testing the efficacy of the implant is sentence recognition, focusing only on whether the user can hear what words are being spoken — not necessarily how they are spoken. Luo and his team decided to focus on the how.

“The reason we are interested in emotion recognition is that we know the cochlear implant can restore hearing sensation to profoundly deaf people but the device is limited in its spectral resolution and temporal resolution,” said Luo, whose background is in electrical engineering, studying how speech can be produced and recognized by computers.

A cochlear implant device for deaf or hearing-impaired people. Photo by Elizabeth Hoffmann/Getty Images/iStockphoto

The limitations of the cochlear implant mean that implant users don’t have the same ability as people with normal hearing to differentiate tone of voice.

To find out how that affects quality of life, Luo and his team tested 12 adult cochlear implant users’ ability to detect emotional nuance in speech at his Tempe campus lab during spring 2018. In soundproof booths, implant users listened to recordings of a male and female actor speaking neutral sentences such as, “It’s snowing outside,” in 100 different emotional utterances ranging from happy to sad to anxious. After each sentence, the listener was asked what emotion the speaker was conveying.

Often, implant users confused happy and angry, something Luo said could be due to the fact that both emotions can cause a speaker to increase their volume. Implant users were usually good at detecting sadness, however, which could be due to an associated slower speaking pace and lower, even pitch.

Implant users also filled out a survey about their quality of life that asked questions on social interaction and self-esteem, among other topics. Those who correctly identified more emotional utterances also reported better quality of life.

“The correlation was quite strong despite the small number of participants,” Luo said. “So it’s definitely an area that needs more research so that we can improve emotional recognition for implant users.”
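For readers curious how a finding like this is quantified, the sketch below computes a Pearson correlation coefficient between per-participant emotion-recognition accuracy and quality-of-life scores. The numbers are invented for illustration only; they are not the study's data, and the study's actual analysis may differ.

```python
import numpy as np

# Hypothetical scores for 12 participants (matching the study's sample
# size, but NOT its data): fraction of emotional utterances correctly
# identified, and a made-up quality-of-life survey score (0-100).
emotion_scores = np.array([0.45, 0.60, 0.30, 0.75, 0.55, 0.65,
                           0.40, 0.70, 0.50, 0.35, 0.80, 0.58])
qol_scores = np.array([52, 68, 40, 85, 60, 72,
                       48, 80, 58, 45, 88, 66])

# Pearson r: +1 is a perfect positive linear relationship, 0 is none.
r = np.corrcoef(emotion_scores, qol_scores)[0, 1]
print(f"Pearson r = {r:.2f}")
```

With the illustrative numbers above, r comes out strongly positive, i.e., participants with higher recognition accuracy also report higher quality of life.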

The first step is a simple one: Amend efficacy tests of cochlear implants to include emotional recognition in addition to sentence recognition. While being able to hear and understand speech is, of course, the primary goal, Luo and his team found that this ability alone does not correlate with implant users’ perceived quality of life.

“Vocal emotional recognition has a much stronger predictive power for the subjective quality of life for implant users,” he said.

Top photo: ASU College of Health Solutions Assistant Professor Xin Luo in his cochlear implant research lab on the Tempe campus. Photo by Charlie Leight/ASU Now
