Effortful Listening to Speech Leads to Decreased Eye Movements

Approximately 40% of people over the age of 65 have some degree of hearing loss. Although it is most common in this age group, age-related hearing loss can begin much earlier, when people are in their 40s or 50s.

Despite being widely used, current diagnostic methods can miss earlier signs of hearing loss, such as a reduced ability to hear speech in noisy or crowded settings. Some researchers have therefore been working to develop approaches that catch milder forms of hearing loss early, before they become irreversible.

To this end, recent research by two neuroscientists at the Rotman Research Institute in Canada has examined the connection between effortful listening and eye movements. Their latest study, published in The Journal of Neuroscience, reveals that young people’s eye movements tend to slow down as they concentrate more intently on hearing speech.

“We typically diagnose hearing loss using pure-tone audiometry, involving a person listening to pure tones at different sound levels to determine the sound level at which the person can just hear a tone before it is too quiet; we call this point the hearing threshold,” Björn Herrmann, co-author of the paper, told Medical Xpress.

“If the hearing threshold is too high, meaning that the level of a tone must be relatively intense for the tone to be heard, we would possibly prescribe hearing aids. However, age-related hearing loss develops gradually over time, often starting when people are in their 40s or 50s.”

In their 40s or 50s, many people begin to have trouble recognizing speech in noisy settings such as crowded restaurants, shopping malls, and other public places. These hearing issues frequently foreshadow the more serious hearing loss that develops later in life.

“Pure-tone audiometric thresholds are not very indicative of such speech-in-noise perception challenges at the early stages of hearing loss,” Herrmann explained. “As a result, we typically diagnose hearing loss (using pure tone audiometry) a decade or two after first signs of speech perception difficulties emerge.”

To detect hearing issues earlier, researchers have been working to create diagnostic tools that better capture the subtle features of a patient’s hearing. These include physiological signs that a person is straining to understand speech in noisy settings, which may indicate early hearing loss. If properly identified, such quantifiable physiological markers could enable doctors to diagnose hearing loss more accurately, both in new patients and in those already receiving treatment (i.e., to measure the treatment’s efficacy).

“Researchers and clinicians would like to measure listening effort objectively, which typically means using physiological responses, because asking a person how effortful they find listening can be influenced by their specific meaning of the word effort,” Herrmann said. “People may also find it hard to separate how much effort they exerted from how well they were able to comprehend speech. While it is certainly important to understand a person’s subjective experiences, objective measures are seen as advantageous in clinical and research contexts.”

Previous research has outlined a range of physiological responses that occur when someone is listening effortfully. One frequently cited in the existing literature is a change in pupil size, which can be assessed using pupillometry, a method that uses a camera to record the eyes and measure pupil diameter at various points in time.
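To make the measurement concrete, below is a minimal sketch of baseline-corrected pupil dilation, the kind of quantity a camera-based pupillometry pipeline yields. The sampling rate, baseline window, and simulated signal are illustrative assumptions, not values from the study.

```python
import numpy as np

def baseline_corrected_dilation(pupil_mm, sample_rate_hz=60.0, baseline_s=0.5):
    """Express each pupil-diameter sample relative to the mean diameter in a
    pre-stimulus baseline window, so later samples reflect task-evoked dilation."""
    pupil_mm = np.asarray(pupil_mm, dtype=float)
    n_baseline = int(baseline_s * sample_rate_hz)
    baseline = np.nanmean(pupil_mm[:n_baseline])  # nanmean tolerates blink gaps
    return pupil_mm - baseline

# Simulated 3-second recording: the pupil slowly dilates ~0.3 mm after 1 s.
t = np.arange(0, 3, 1 / 60.0)
pupil = 3.0 + 0.3 * (1 - np.exp(-2 * np.clip(t - 1.0, 0, None)))
print(f"peak dilation: {baseline_corrected_dilation(pupil).max():.2f} mm")
```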

“We have known for a long time that the pupil size increases when a person is investing cognitively, for example when they have to keep many numbers in memory,” Herrmann said. “A lot of research over the past decade also shows that the pupil size increases when listening to speech is effortful, for example, when speech is masked by background noise.

“The problem with measuring the pupil size is that it is very sensitive to changes in light (i.e., our pupil gets smaller when our environment gets brighter and vice versa). The measurement of the pupil is also affected by the angle of the pupil relative to the camera that measures it, such that pupil size appears to change without actually changing when a person looks to the left or right, which is why participants typically fixate on a fixation point in the middle of a screen while listening to speech.”
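To illustrate the gaze-angle problem Herrmann describes, here is a minimal sketch of a first-order cosine correction. It assumes the pupil is a flat disc whose imaged area shrinks with the cosine of its angle to the camera; corrections used in actual research are more elaborate, and the function and numbers here are hypothetical.

```python
import math

def corrected_pupil_area(apparent_area_mm2, gaze_angle_deg):
    """Undo first-order foreshortening: a disc viewed at an angle theta off
    the camera axis projects to an ellipse of area ~ true_area * cos(theta)."""
    return apparent_area_mm2 / math.cos(math.radians(gaze_angle_deg))

# Looking 30 degrees away from the camera makes an unchanged 9 mm^2 pupil
# appear ~13% smaller; the correction recovers the original area.
apparent = 9.0 * math.cos(math.radians(30))
print(corrected_pupil_area(apparent, gaze_angle_deg=30.0))  # -> 9.0
```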

In the end, measuring a patient’s pupils during a hearing test seems less than ideal, because several factors can skew the results. Herrmann and his colleague M. Eric Cui therefore set out to find a different way of detecting effortful listening.

“There has been a little bit of work in non-hearing related research areas that show that eye movements may indicate when a person is cognitively taxed, for example, keeping many numbers in memory,” Herrmann said. “People’s eye movements decrease under such cognitive challenges. We thus wondered whether eye movements may also indicate cognitive challenges during listening, that is, listening effort.

“Moreover, research investigating the auditory cortex in animals—that is, the brain region responding to sound—found that when animals reduce their movements, the auditory cortex becomes more sensitive to sound. We thus thought that reduced eye movements could also be associated with higher auditory sensitivity to speech.”

To test their hypothesis, Herrmann and Cui conducted a series of experiments with 26 young adults between the ages of 18 and 35, asking whether these participants’ eye movements slowed down as they concentrated harder on listening.

“Participants who came to our lab sat in a comfortable chair inside a sound booth,” Herrmann said. “They rested their head in a chin rest, which helps stabilize the person’s head, and faced a computer monitor. They also wore headphones over which we played spoken speech. We used an eye tracker, a camera-based device that can track a person’s eyes, to determine where participants looked on a computer screen.”

The researchers’ experiment comprised numerous trials. In each trial, participants watched the screen in front of them, which showed a stationary dot, a moving dot, many moving dots, or nothing at all, while listening to sentences and spoken stories through a pair of headphones. By varying what participants looked at, the researchers could test whether changes in eye movements occurred regardless of the visual input.

“Participants were told that they could look wherever they liked on the computer screen,” Herrmann said. “The critical manipulation was the degree of speech clarity. Sentences and stories were played either with very minimal background noise that would require little effort for the participants to understand what is said or with severe background noise for which speech comprehension required a lot of effort. While participants listened to the speech, we recorded their eye movements.”
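Taken together, the design crosses the visual display with the degree of speech clarity. The enumeration below is a hypothetical reconstruction of that condition grid; the labels are ours, not the authors’.

```python
from itertools import product

# Hypothetical labels for the conditions described in the article.
visual_displays = ["stationary dot", "moving dot", "many moving dots", "blank screen"]
speech_clarity = ["minimal background noise (easy)", "severe background noise (effortful)"]

for display, clarity in product(visual_displays, speech_clarity):
    print(f"{display:16s} | {clarity}")
```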

Herrmann and Cui’s analyses focused on two distinct features of eye movements: fixation duration and gaze dispersion. The first measures how long a person’s eyes stay locked on a particular object or point; the second measures how widely their gaze spreads across the screen.
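As a rough sketch of how these two measures can be computed from eye-tracker output (illustrative only, not the authors’ analysis code), assume fixations arrive as onset and offset times and gaze as (x, y) screen coordinates in pixels:

```python
import numpy as np

def mean_fixation_duration(onsets_s, offsets_s):
    """Average time, in seconds, that the eyes stay locked on one point."""
    return float(np.mean(np.asarray(offsets_s) - np.asarray(onsets_s)))

def gaze_dispersion(gaze_xy):
    """How widely gaze spreads over the screen: the mean Euclidean distance
    of each gaze sample from the centroid of all samples (in pixels)."""
    xy = np.asarray(gaze_xy, dtype=float)
    return float(np.mean(np.linalg.norm(xy - xy.mean(axis=0), axis=1)))

# Toy data: tightly clustered gaze yields a small dispersion value, the
# pattern the study associates with effortful listening.
print(mean_fixation_duration([0.0, 0.9], [0.6, 1.8]))         # 0.75 s
print(gaze_dispersion([(512, 384), (515, 380), (510, 386)]))  # a few pixels
```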

“We found that under the more effortful listening conditions, that is, when the degree of speech masking through background noise was high, individuals’ eye movements decreased as reflected in longer fixation durations and reduced gaze dispersion, compared to more favorable listening conditions,” Herrmann said.

“We show this for simple disconnected sentences, the type that is commonly used in clinical contexts, as well as for spoken stories, which reflect more naturalistic speech we encounter in everyday life. We also show the reduction in eye movements when listening is effortful for the different visual presentation conditions.”
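One simple way to quantify such a within-participant effect, sketched here under our own assumptions rather than taken from the published analysis, is a paired comparison of each participant’s gaze dispersion in the clear versus noisy conditions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical per-participant gaze-dispersion values (pixels), 26 listeners.
clear = rng.normal(loc=120.0, scale=15.0, size=26)
noisy = clear - rng.normal(loc=20.0, scale=10.0, size=26)  # reduced under noise

# Paired t-test: does dispersion reliably drop when listening is effortful?
t_stat, p_value = stats.ttest_rel(noisy, clear)
print(f"paired t-test: t = {t_stat:.2f}, p = {p_value:.4g}")
```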

Overall, the findings demonstrate the potential value of eye-movement recordings for gauging how hard someone is working to listen in a given situation. In the future, this marker of effortful listening could be used to develop new tests for diagnosing hearing loss in clinical settings.

“Our study and another study published around the same time as ours are the first to show that listening effort is associated with reduced eye movements,” Herrmann said.

“However, we still need to better understand how changes in eye movements relate to changes in pupil size under listening effort. Perhaps both measures capture different facets of listening effort, for example, a more automatic vs. a more voluntary physiological effort response. This would enable us to capture listening effort more exhaustively.”

Herrmann and Cui would also like to investigate the mechanisms underlying the reduction in eye movements, so as to better anticipate which challenges trigger it. They also intend to examine the relationship between eye movements and effortful listening more closely, in order to recognize and account for different sources of “listening effort.”

After all, effortful listening is not necessarily a sign of hearing loss. For instance, people may exert more effort when processing syntactically complex or ambiguous sentences, or when listening to speech in a language they do not speak fluently.

“In our initial work we only investigated eye movements in younger healthy adults,” Herrmann added. “From a clinical perspective, the next steps are certainly to investigate whether eye movements also indicate listening effort in older adults, because this is the population for which our new approach may be most useful. Moreover, we plan to investigate whether eye movements indicate reduced listening effort when individuals are treated with hearing aids, as this could help to assess how much a person benefits from their hearing-aid prescription.”
