summary: Researchers have discovered that our ears emit subtle sounds in response to eye movements, allowing us to pinpoint where someone is looking.
The study shows that these ear sounds, which may be generated by middle-ear muscle contractions or hair cell activation, can reveal where the eyes are pointed.
This finding challenges existing ideas about the function of the ear and suggests that sounds in the ear may help synchronize visual and auditory perception. The team’s innovative approach could lead to new clinical audiology tests and a deeper understanding of sensory integration.
Key facts:
- The study revealed that subtle sounds in the ear correspond to eye movements, providing insight into where a person is looking.
- This phenomenon may be caused by the brain coordinating ear muscle contractions and hair cell activation with eye movements.
- This discovery offers the potential for new clinical tests and a better understanding of how the brain integrates visual and auditory information.
Source: Duke University
Scientists can now determine exactly where a person’s eyes are looking just by listening to their ears.
“Just by putting a microphone in your ear canal and recording, you can actually estimate the movement of the eyes and the location of what the eyes are trying to look at,” said Dr. Jennifer Groh, senior author of the new report, who is also a professor of psychology, neuroscience, and neurobiology at Duke University.
In 2018, Groh’s team discovered that the ears emit a faint, imperceptible noise when the eyes move. In a new report appearing the week of November 20 in the journal Proceedings of the National Academy of Sciences, the Duke team now shows that these sounds can reveal where your eyes are looking.
The reverse also holds: just by knowing where someone was looking, Groh and her team could predict what the waveform of these subtle ear sounds would look like.
Groh thinks these sounds may arise when eye movements prompt the brain to contract either the middle-ear muscles, which normally help dampen loud sounds, or the hair cells, which help amplify quiet ones.
The exact purpose of these ear sounds is unclear, but Groh’s initial hunch is that they may help sharpen people’s perception.
“Even if the head and ears don’t move, the eyes move, and we think this is part of the brain’s system for aligning vision and hearing,” Groh says.
Understanding the subtle relationship between sound and vision in the ear may lead to the development of new clinical tests for hearing.
“If each part of the ear contributes a distinct rule to the eardrum signal, these signals could be used as a kind of clinical tool to assess which part of the ear’s anatomy is malfunctioning,” said Stephanie Lovich, one of the lead authors of the paper and a graduate student in psychology and neuroscience at Duke University.
Just as the pupil constricts and dilates like the aperture on a camera to regulate the amount of light let in, the ear has its own way of regulating hearing. Scientists have long believed that these acoustic adjustment mechanisms only serve to amplify soft sounds or dampen loud sounds.
However, in 2018, Groh and her team discovered that these same acoustic regulation mechanisms were also activated by eye movements, suggesting that the brain communicates information about eye movements to the ears.
In the latest study, the researchers followed up on their initial findings and investigated whether subtle auditory signals contain detailed information about eye movements.
To decipher the sounds in people’s ears, Groh’s team at Duke, together with Christopher Shera, Ph.D., and colleagues at the University of Southern California, brought 16 adults with no visual or hearing impairments to Groh’s laboratory in Durham and gave them a fairly simple eye test.
Participants stared at a static green dot on a computer screen and then, without moving their heads, followed the dot with their eyes as it disappeared and reappeared up, down, left, right, or diagonal to its starting point. This gave Groh’s team a wide range of auditory signals generated as the eyes moved horizontally, vertically, or diagonally.
An eye tracker recorded where the participants’ pupils darted around for comparison with ear sounds captured using earphones with built-in microphones.
The research team analyzed the ear sounds and found unique signatures for different directions of movement. This made it possible to crack the ear sounds’ code and calculate where people were looking just by scrutinizing the sound waves.
“Since diagonal eye movements are just a horizontal component and a vertical component, my labmate and co-author David Murphy realized you can take those two components and infer what they would be when put together,” Lovich said.
“Then you can go in the opposite direction, look at an oscillation, and predict that someone was looking 30 degrees to the left.”
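The idea in these quotes amounts to a simple linear model: if the ear sound evoked by a diagonal eye movement is roughly the sum of the sounds evoked by its horizontal and vertical components, the same model can run forward (gaze target to waveform) and backward (waveform to gaze target). The sketch below illustrates this with made-up template waveforms and hypothetical function names; it is not the authors’ actual analysis code, just a minimal illustration of the two-way inference.

```python
# A minimal sketch (not the study's analysis pipeline) of two-way linear
# decoding: predict an ear-sound waveform from a gaze direction, or estimate
# the gaze direction from an observed waveform. The templates are invented.
import numpy as np

rng = np.random.default_rng(0)
n_samples = 200  # samples per ear-sound waveform (hypothetical)

# Hypothetical "template" waveforms for purely horizontal / vertical movements.
t = np.linspace(0, 1, n_samples)
h_template = np.sin(2 * np.pi * 4 * t) * np.exp(-3 * t)
v_template = np.cos(2 * np.pi * 6 * t) * np.exp(-3 * t)
W = np.column_stack([h_template, v_template])  # shape (n_samples, 2)

def predict_waveform(gaze_deg):
    """Forward direction: gaze (horizontal, vertical) in degrees -> waveform."""
    return W @ np.asarray(gaze_deg, dtype=float)

def estimate_gaze(waveform):
    """Reverse direction: waveform -> least-squares gaze estimate in degrees."""
    gaze, *_ = np.linalg.lstsq(W, waveform, rcond=None)
    return gaze

# A "diagonal" movement: 30 degrees left, 15 degrees up, plus recording noise.
true_gaze = np.array([-30.0, 15.0])
recorded = predict_waveform(true_gaze) + rng.normal(0, 0.5, n_samples)

print("estimated gaze (deg):", np.round(estimate_gaze(recorded), 1))
# e.g. -> estimated gaze (deg): [-30.  15.]
```

Because the model is linear, estimating gaze from a recording is just a least-squares projection onto the two templates, which is why knowing the horizontal and vertical signatures is enough to decode diagonal movements as well.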
Groh is now beginning to investigate whether these ear sounds play a role in perception.
One set of projects focuses on how these eye-movement ear sounds differ in people with hearing or vision loss.
Groh is also testing whether people without hearing or vision impairments who produce especially predictable ear signals are better at sound localization tasks that rely on mapping auditory information onto a visual scene, such as locating an ambulance while driving.
“Some people have really reproducible signals day in and day out, and you can measure them quickly,” Groh says. “You might expect those people to be very good at visual-auditory tasks, compared to people whose signals are more variable.”
Funding: Groh’s research was supported by a grant from the National Institutes of Health (NIDCD DC017532).
About this visual and auditory neuroscience research news
Author: Dan Vahaba
Source: Duke University
Contact: Dan Vahaba – Duke University
Image: The image is credited to Neuroscience News
Original research: Open access.
“Parametric information about eye movements is sent to the ears” by Jennifer Groh et al. PNAS
Abstract
Parametric information about eye movements is sent to the ears
When the eyes move, the alignment between the visual and auditory scenes changes, yet we are not perceptually aware of these shifts. This suggests that the brain must incorporate precise information about eye movements into auditory and visual processing.
Here we show that small sounds generated by the brain within the ear contain precise information about contemporaneous eye movements in the spatial domain: the direction and amplitude of the eye movements can be inferred from these small sounds.
The underlying mechanism likely involves the ear’s various motor structures and may facilitate the translation of incoming auditory signals into a frame of reference anchored to the direction of the eyes and, thus, to the visual scene.