A new study reveals that even when you’re just listening, tiny involuntary eye movements, blinks, and changes in pupil size are intricately linked to the complex task of understanding word structure.
Think about the last conversation you had. You were likely focused on the sounds of the words, the speaker’s tone, and the meaning behind their message. It feels like a purely auditory and cognitive experience. But what if your eyes were also playing a crucial, hidden role? Neuroscientists are discovering that our eyes, far from being passive observers when we listen, are a dynamic window into the inner workings of our brain’s language-processing engine.
A recent study published in Scientific Reports delves into this fascinating connection, demonstrating that the way our brain deconstructs spoken words is reflected in a trio of involuntary ocular responses: tiny eye movements called microsaccades, the timing of our blinks, and the dilation of our pupils. The findings suggest that these subtle eye behaviors can betray the cognitive effort our brain exerts when it encounters linguistic ambiguity, even when we aren’t looking at anything related to the task.
The ‘Oculomotor Freeze’ and the Listening Brain
Even when you try to hold your gaze perfectly still, your eyes are never truly motionless. They perform constant, tiny, jittery movements called microsaccades. For years, scientists believed these were just random noise in the visual system. However, research has shown they are deeply connected to attention and perception.
When our brain detects a sudden or important event—a flash of light, an unexpected sound—a fascinating phenomenon occurs: Oculomotor Inhibition (OMI), or what you might call an “oculomotor freeze.” For a brief moment, all involuntary eye movements, including microsaccades and even blinks, temporarily stop. This pause is thought to be a mechanism that helps the brain focus its resources on processing the new sensory information without the “noise” of eye movement.
While this freeze is well-documented for simple sensory stimuli, researchers at Bar-Ilan University wanted to ask a more complex question: Is this oculomotor freeze sensitive to abstract, high-level cognitive processes, like understanding the structure of a spoken word?
A Linguistic Test for the Eyes
To find out, the researchers designed an experiment centered on a well-known psycholinguistic puzzle called the Morpheme Interference Effect (MIE). A morpheme is the smallest unit of meaning in a language—think of prefixes like ‘re-’ or suffixes like ‘-ment’. The MIE occurs when our brain gets tripped up by pseudowords (fake words) that are built from real morphemes.
For example, your brain easily recognizes "unhappiness" as a word and "glorpment" as a non-word. But what about a word like "shootment"? The root ‘shoot’ is real, and the suffix ‘-ment’ is real, but the combination is not a word in English. Processing "shootment" requires more cognitive effort than processing "glorpment" (where ‘glorp’ is an invented root), because your brain tries to make sense of the familiar parts, causing interference.
The study, conducted with native Hebrew speakers, used this principle. Participants listened to a stream of auditory stimuli while their eye movements were tracked with high precision. The stimuli included:
- Real Words: Common Hebrew words.
- Real-Root Pseudowords: Fake words created by combining a real Hebrew root with a valid word pattern (like "shootment").
- Invented-Root Pseudowords: Fake words created using an invented root (like "glorpment").
Their task was simple: maintain their gaze on a central point and press a button only when they heard a real word. This clever design ensured that any eye movements recorded during the pseudoword trials were not contaminated by the physical act of pressing a button.
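To make the three stimulus categories concrete, here is a minimal sketch using invented English examples. The actual study used spoken Hebrew roots and word patterns; the word lists and the suffix-based check below are purely illustrative:

```python
# Toy illustration of the three stimulus categories (hypothetical
# English examples; the real study used spoken Hebrew morphology).
REAL_ROOTS = {"shoot", "happy", "glory"}
REAL_SUFFIXES = {"ment", "ness"}
LEXICON = {"unhappiness", "shooter", "glorious"}

def categorize(word: str) -> str:
    """Label a stimulus as a real word or one of two pseudoword types."""
    if word in LEXICON:
        return "real word"
    for suffix in REAL_SUFFIXES:
        if word.endswith(suffix):
            root = word[: -len(suffix)]
            if root in REAL_ROOTS:
                # Real morphemes in an illegal combination -> interference
                return "real-root pseudoword"
    return "invented-root pseudoword"

print(categorize("unhappiness"))  # real word
print(categorize("shootment"))    # real-root pseudoword
print(categorize("glorpment"))    # invented-root pseudoword
```

The key design property mirrored here is that "shootment" and "glorpment" differ only in whether the root is real, so any extra processing cost can be attributed to morpheme interference.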
The Results: The Eyes Betray the Brain’s Struggle
The results were remarkably clear and consistent across three different ocular measures. When participants heard the tricky Real-Root pseudowords, their brains had to work harder to reject them, and their eyes told the story.
- Microsaccades: The period of inhibition—the “freeze”—was significantly longer for Real-Root pseudowords compared to Invented-Root ones. The brain held back the tiny eye movements for longer as it grappled with the familiar but incorrect word structure.
- Blinks: A similar pattern emerged for eye blinks. The latency, or the time it took to release from blink inhibition, was longer for the more confusing Real-Root pseudowords.
- Pupil Dilation: Our pupils dilate in response to cognitive load. The study found that the peak of pupil dilation was significantly delayed for Real-Root pseudowords, again indicating a more prolonged and effortful cognitive process.
Essentially, the extra mental gymnastics required to identify a word like "shootment" as a non-word were directly mirrored by a longer period of oculomotor suppression.
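These measures are straightforward to extract from an eye-tracking trace. The sketch below uses illustrative function names and synthetic data (not the study’s actual analysis pipeline) to show how an inhibition release latency and a peak pupil dilation latency might be computed:

```python
import numpy as np

def peak_dilation_latency(pupil: np.ndarray, fs: float) -> float:
    """Latency (s) of the pupil trace's peak, relative to trace onset."""
    return float(np.argmax(pupil) / fs)

def inhibition_duration(event_times: np.ndarray, onset: float) -> float:
    """Time from stimulus onset to the first microsaccade/blink after it."""
    after = event_times[event_times > onset]
    return float(after[0] - onset) if after.size else float("nan")

# Synthetic pupil trace sampled at 500 Hz, peaking at sample 600
fs = 500.0
pupil = np.concatenate([np.linspace(0, 1, 601), np.linspace(1, 0.2, 400)])
print(peak_dilation_latency(pupil, fs))  # 1.2 (seconds)

# Synthetic microsaccade onset times (seconds); stimulus onset at 0.4 s
saccade_times = np.array([0.10, 0.35, 0.95, 1.40])
print(inhibition_duration(saccade_times, onset=0.4))  # about 0.55 s
```

Under this framing, the study’s finding is that both latencies come out longer for Real-Root pseudowords than for Invented-Root ones.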

From Reading to Listening: A Cross-Modal Connection
What makes these findings particularly significant is that they generalize a phenomenon previously observed in reading to the domain of listening. While it might seem intuitive that eye movements are involved in reading, their connection to auditory processing is far more surprising.
Reading allows for non-linear processing; our eyes can jump around a word. Listening, however, is strictly sequential—the word unfolds over time, sound by sound. To account for this, the researchers anchored their analysis not to the start or end of the sound file, but to the "Uniqueness Point" (UP). The UP is the precise moment in a spoken word when it becomes distinct from all other possible words in the listener’s mental dictionary—the cognitive "aha!" moment of recognition or rejection.
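The uniqueness point has a simple operational definition: it is the earliest prefix of a word that no other entry in the lexicon shares. A toy letter-based sketch (the study’s UPs were defined over the phonemes of spoken Hebrew words, and the lexicon here is hypothetical):

```python
def uniqueness_point(word: str, lexicon: set[str]) -> int:
    """Return the 1-based position at which `word` diverges from every
    other lexicon entry (a simplified, letter-based stand-in for the
    phoneme-based uniqueness point used with spoken stimuli)."""
    others = lexicon - {word}
    for i in range(1, len(word) + 1):
        prefix = word[:i]
        if not any(w.startswith(prefix) for w in others):
            return i
    return len(word)

lexicon = {"candle", "candy", "canvas", "cap"}
print(uniqueness_point("candle", lexicon))  # 5: "candl" rules out the rest
print(uniqueness_point("cap", lexicon))     # 3: "cap" is already unique
```

Anchoring the eye-movement analysis to this moment, rather than to the start of the sound file, aligns trials on the instant when recognition or rejection can first occur.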
By showing that the oculomotor freeze aligns with this cognitive milestone, the study provides powerful evidence that these involuntary eye behaviors are not just a low-level reflex. Instead, they are tightly coupled with the brain’s sophisticated, real-time evaluation of abstract linguistic information.
A Window into the Mind
This research does more than just reveal a fascinating quirk of our neurobiology. It opens the door to new, non-invasive methods for studying language processing. Because these oculomotor measures are covert and do not require a manual response, they could be invaluable for understanding language development and disorders.
Imagine using this technique to study how infants who can’t yet speak or read are processing language structure. It could also offer new insights into the challenges faced by individuals with dyslexia or help assess cognitive and linguistic recovery in stroke patients. By simply tracking the eye’s subtle dance, we may gain a deeper understanding of the silent, complex processes that turn sound into meaning.
This study is a powerful reminder that the brain’s systems are deeply interconnected. The simple act of listening is a full-body experience, and by watching the eyes, we can catch a glimpse of the mind hard at work.
Reference
Kadosh, O., Menashe, B., Gera, Y., Ben-Shachar, M., & Bonneh, Y. S. (2025). Oculomotor chronometry of spoken word structure processing. Scientific Reports, 15, Article 37339. https://doi.org/10.1038/s41598-025-21869-8



