Tuning in to emotion

ASU professor finds correlation between cochlear implant users' vocal emotional recognition, quality of life

January 15, 2019

In 1982, after Melbourne man Graham Carrick had experienced 17 years of silence, the device implanted in his inner ear was switched on and sound miraculously flooded in. Carrick was the first, and to date roughly 200,000 people worldwide have had the same life-changing experience; cochlear implant surgery has become a relatively routine procedure to restore the sense of sound to the profoundly deaf.

The part of the story not often told is that after the initial astonishment wears off, many cochlear implant users find they still have trouble inferring a speaker’s emotion based on tone of voice, something that comes naturally to people with normal hearing.

While it may seem inconsequential, in a study recently published in the Journal of the Acoustical Society of America, Arizona State University's Xin Luo found that the ability to detect emotional nuance in speech is actually strongly correlated to adult cochlear implant users’ overall quality of life.

“Vocal emotional recognition is so important for social communication,” said Luo, an assistant professor in the College of Health Solutions.

Not being able to tell if someone is joking, angry or sad is not only frustrating but, over time, can lead to feelings of isolation and even depression.

Although deafness has many causes, the underlying issue is damage to the inner ear. Cochlear implants use a surgically implanted array of electrodes to bypass that damage and directly stimulate surviving auditory neurons, allowing patients to hear.

Currently, the gold standard for testing the efficacy of the implant is sentence recognition, focusing only on whether the user can hear what words are being spoken — not necessarily how they are spoken. Luo and his team decided to focus on the how.

“The reason we are interested in emotion recognition is that we know the cochlear implant can restore hearing sensation to profoundly deaf people but the device is limited in its spectral resolution and temporal resolution,” said Luo, whose background is in electrical engineering, studying how speech can be produced and recognized by computers.

A cochlear implant device for deaf or hearing-impaired people. Photo by Elizabeth Hoffmann/Getty Images/iStockphoto

The limitations of the cochlear implant mean that implant users don’t have the same ability as people with normal hearing to differentiate tone of voice.

To find out how that affects quality of life, Luo and his team tested 12 adult cochlear implant users’ ability to detect emotional nuance in speech at his Tempe campus lab during spring 2018. In soundproof booths, implant users listened to recordings of a male and female actor speaking neutral sentences such as, “It’s snowing outside,” in 100 different emotional utterances ranging from happy to sad to anxious. After each sentence, the listener was asked what emotion the speaker was conveying.

Often, implant users confused happy and angry, something Luo said could be due to the fact that both emotions can cause a speaker to increase their volume. Implant users were usually good at detecting sadness, however, which could be due to an associated slower speaking pace and lower, even pitch.

Implant users also filled out a quality-of-life survey with questions about social interaction and self-esteem, among other topics. Those who correctly identified more emotional utterances also reported better quality of life.

“The correlation was quite strong despite the small number of participants,” Luo said. “So it’s definitely an area that needs more research so that we can improve emotional recognition for implant users.”

The first step is a simple one: Amend efficacy tests of cochlear implants to include emotional recognition in addition to sentence recognition. While hearing and understanding speech is the implant's primary purpose, Luo and his team found that sentence recognition alone does not correlate with implant users' perceived quality of life.

“Vocal emotional recognition has a much stronger predictive power for the subjective quality of life for implant users,” he said.

Top photo: ASU College of Health Solutions Assistant Professor Xin Luo in his cochlear implant research lab on the Tempe campus. Photo by Charlie Leight/ASU Now
