
How Your Brain Predicts Eye Movements with 94% Accuracy

New research reveals how the brain predicts visual stability during eye movements, with a consistent 6% undershoot in afterimage perception. Learn how this mechanism keeps our vision stable.

Health · By Dr. Priya Kapoor · March 14, 2026 · 8 min read

Last updated: April 1, 2026, 2:49 PM


Every second, your eyes make rapid, jerky movements called saccades—yet the world around you appears perfectly stable. New research published in Science Advances reveals how the brain achieves this illusion of stability, predicting eye movements with 94% accuracy using an internal mechanism called an efference copy. The study, led by researchers from the Cluster of Excellence Science of Intelligence in Berlin, used afterimages to decode this predictive process, uncovering a systematic 6% undershoot in perception that has broad implications for neuroscience, virtual reality, and robotics.

The Saccade Paradox: Why the World Doesn’t Look Like a Shaky Camera

If you’ve ever watched a video taken with a handheld camera, you know how disorienting it can be. Yet, despite our eyes making three to five saccades per second, the visual world remains stable. This paradox has puzzled neuroscientists for decades. The brain solves this problem by generating an internal prediction of where visual objects should be after an eye movement, allowing it to compensate for the saccade before it even happens.

The Role of Afterimages in Decoding Visual Prediction

Afterimages—those ghostly shapes that linger after staring at a bright light—were the key to this discovery. Researchers Richard Schweitzer, Thomas Seel, Jörg Raisch, and Martin Rolfs conducted experiments in complete darkness, where participants fixated on a bright flash to create an afterimage and then shifted their gaze to a second light. By tracking the perceived movement of the afterimage, the team could isolate the brain’s internal predictions of eye movement.
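
The logic of this decoding step fits in a few lines. Below is a minimal sketch in Python (all names and numbers are illustrative, not taken from the study) of why an afterimage is such a clean probe: perceived motion is roughly the brain’s eye-movement estimate plus whatever motion occurs on the retina, and an afterimage contributes zero retinal motion, so its perceived shift reads out the internal estimate directly.

```python
# Minimal sketch of the afterimage decoding logic (illustrative values only).
# Perceived motion in the world = internal eye-movement estimate + retinal slip.

def perceived_motion(eye_estimate_deg: float, retinal_slip_deg: float) -> float:
    """Perceived displacement of an object in world coordinates."""
    return eye_estimate_deg + retinal_slip_deg

saccade_deg = 10.0       # actual eye movement
internal_estimate = 9.4  # brain's estimate (hypothetical 94% gain)

# A stationary real object slips across the retina by the full saccade size,
# in the opposite direction, so it is perceived as (almost) stable:
real_object = perceived_motion(internal_estimate, -saccade_deg)  # -0.6 deg

# An afterimage is glued to the retina: zero retinal slip. Its perceived
# motion therefore exposes the internal estimate directly:
afterimage = perceived_motion(internal_estimate, 0.0)            # +9.4 deg

print(f"real object appears to move {real_object:+.1f} deg")
print(f"afterimage appears to move  {afterimage:+.1f} deg")
```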

The 94% Rule: How the Brain’s Predictions Fall Slightly Short

The study found that the brain’s internal estimate of eye movement reached about 94% of the actual distance—a phenomenon known as hypometria. This small but consistent undershoot suggests a systematic bias in the brain’s predictive mechanism. 'On average, the perceived shift of the afterimage reached about 94 percent of the actual eye movement,' says Richard Schweitzer, lead author of the study. 'Perception follows eye movements very closely, but not perfectly.'
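
To make the arithmetic concrete, here is a hedged sketch of how a perceptual gain like this could be estimated from trial data. The data are simulated with an assumed true gain of 0.94 plus noise, and the fit (least squares through the origin) is a generic choice for illustration, not necessarily the paper’s analysis.

```python
import numpy as np

# Illustrative only: recover a perceptual gain from simulated trials.
rng = np.random.default_rng(0)
amplitudes = rng.uniform(4.0, 12.0, size=200)            # saccade sizes (deg)
perceived = 0.94 * amplitudes + rng.normal(0, 0.3, 200)  # perceived shifts (deg)

# Least-squares slope through the origin: gain = sum(x*y) / sum(x*x)
gain = np.sum(amplitudes * perceived) / np.sum(amplitudes ** 2)
print(f"estimated gain = {gain:.2f}")  # about 0.94: a consistent ~6% undershoot
```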

Why a 6% Undershoot Isn’t a Flaw

The 6% discrepancy may not be a flaw but a reflection of natural eye movement patterns. Saccades themselves often fall slightly short of their targets, due to muscle fatigue or other factors, and the brain’s internal estimate mirrors this biological reality. Perception stays reliably calibrated to how the eyes actually move, even if it is never mathematically exact.

Efference Copy: The Brain’s Internal Feedback Loop

The brain doesn’t rely solely on visual feedback to update its internal map. Instead, it uses an efference copy—a 'carbon copy' of the motor signal sent to the eye muscles—to predict how the visual scene should shift. This mechanism allows the brain to anticipate the consequences of eye movements before new visual input arrives, ensuring visual stability even in the absence of external cues.
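
In computational terms, the efference copy feeds a forward model that predicts the sensory consequences of a motor command before any new visual input arrives. The sketch below assumes a simple linear gain; the function names and the way the 94% figure enters are illustrative, not the authors’ implementation.

```python
# Sketch of an efference-copy forward model (names and gain are illustrative).

EFFERENCE_GAIN = 0.94  # internal estimate reaches ~94% of the motor command

def forward_model(motor_command_deg: float) -> float:
    """Predict the eye movement from a copy of the outgoing motor command."""
    return EFFERENCE_GAIN * motor_command_deg

def predicted_retinal_shift(motor_command_deg: float) -> float:
    """A stationary scene should slide across the retina by the predicted
    eye movement, in the opposite direction."""
    return -forward_model(motor_command_deg)

# Before the saccade even lands, the brain can cancel the expected image
# motion, so a stationary world is perceived as stable:
command = 10.0
expected_slip = predicted_retinal_shift(command)  # -9.4 deg
actual_slip = -command                            # -10.0 deg for a real scene
residual = actual_slip - expected_slip            # -0.6 deg, usually unnoticed
print(f"expected slip {expected_slip:+.1f} deg, residual {residual:+.1f} deg")
```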

How the Brain Adjusts to Changing Eye Movements

Eye movements aren’t fixed; they adapt over time. In the lab, researchers induced saccadic adaptation by shifting the target of an eye movement with each saccade. As participants’ saccades became shorter, the perceived shift of the afterimage also shortened, demonstrating that the brain’s predictive mechanism adjusts dynamically to changes in eye movement patterns.
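
A common textbook way to model this kind of adaptation, offered here as a simplified sketch rather than the study’s actual model, is an error-driven gain update: each post-saccadic landing error nudges the motor gain, and the perceptual prediction shortens along with it.

```python
# Simplified error-driven model of saccadic adaptation (not the study's model).
# On each trial the target is displaced backward during the saccade, creating
# a landing error that gradually shrinks the saccadic gain.

motor_gain = 1.0           # saccade amplitude as a fraction of target distance
LEARNING_RATE = 0.05
TARGET_DEG = 10.0
INTRASACCADIC_STEP = -2.0  # target jumps 2 deg backward during each saccade

for trial in range(1, 101):
    landing = motor_gain * TARGET_DEG
    displaced_target = TARGET_DEG + INTRASACCADIC_STEP
    error = displaced_target - landing   # post-saccadic visual error (deg)
    motor_gain += LEARNING_RATE * error / TARGET_DEG
    if trial % 25 == 0:
        print(f"trial {trial:3d}: gain {motor_gain:.2f}, "
              f"saccade {landing:.1f} deg")

# As the motor gain settles near 0.8, the predicted (and perceived) shift of
# the afterimage shortens with it, mirroring the experimental finding.
```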

Broader Implications: From Neuroscience to Virtual Reality

Understanding how the brain predicts eye movements has applications beyond basic vision science. In virtual reality, for example, motion sickness often occurs when there is a mismatch between what the eyes see and what the brain predicts. By fine-tuning VR environments to align with the brain’s natural predictive mechanisms, developers could reduce discomfort and improve user experience. Similarly, this research could inform robotics, where reliable sensory-motor integration is essential for autonomous systems.

Key Takeaways

  • The brain predicts eye movements with 94% accuracy, ensuring visual stability despite rapid saccades.
  • Afterimages reveal a systematic 6% undershoot in the brain’s predictive mechanism, likely reflecting natural eye movement patterns.
  • The efference copy allows the brain to anticipate visual changes before new input arrives, maintaining stability in the absence of external cues.
  • Saccadic adaptation demonstrates that the brain’s predictive mechanism adjusts dynamically to changes in eye movement patterns.
  • This research has implications for virtual reality, robotics, and clinical studies of eye-movement disorders.

Frequently Asked Questions

Why do afterimages 'follow' my eyes if the real world stays still?
Afterimages appear to move with your gaze because they are fixed to the retina. When the eyes move, a stationary object normally slides across the retina, and the brain’s eye-movement estimate cancels that slide out. An afterimage produces no slide at all, so the brain concludes that the image itself must be moving through space at the same speed as your eyes.
Why is the brain’s prediction only 94% accurate? Isn’t that a flaw?
Not necessarily. Natural eye movements often fall slightly short of their targets. The brain’s 6% undershoot likely reflects this biological reality. It’s better for the brain to be reliably aligned with how our muscles actually behave than to be mathematically perfect but biologically disconnected.
Can this help with things like VR motion sickness?
Absolutely. Motion sickness often happens when there is a mismatch between what your eyes see and what your brain’s 'efference copy' predicts. Understanding that the brain naturally expects a 94% shift could help developers create virtual reality environments that feel more stable and natural to the human eye.
Dr. Priya Kapoor

Health Reporter

Dr. Priya Kapoor reports on wellness, mental health, and medical research developments. She holds a doctorate in Public Health from Harvard and has spent a decade covering the intersection of medical research and public policy. Her reporting on mental health access and health equity has driven national conversations.
