New Research Reveals How People With Vision Loss Judge Approaching Vehicles

Approximate first frame of the approaching car (top) and last frame before occlusion (bottom). Credit: PLOS One (2025).

A new scientific study has taken a close look at a question that directly affects everyday safety: how people with vision loss judge when an approaching vehicle will reach them. The research focuses on individuals with age-related macular degeneration (AMD) and compares their performance with that of people who have normal vision. The findings challenge several common assumptions about visual impairment, especially when it comes to tasks like crossing streets and navigating traffic.

The study was published in PLOS One in 2025 and was conducted by an international, multidisciplinary team of researchers from the United States and Europe. Using a carefully designed virtual reality (VR) system, the researchers explored how visual, auditory, and combined audiovisual cues are used to estimate time-to-collision (TTC): the perceived moment when a moving vehicle would reach a pedestrian.
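
To make the term concrete: under the simplifying assumption that the vehicle approaches at a constant speed, time-to-collision is simply the remaining distance divided by that speed. The short Python sketch below illustrates this textbook relationship; the function and the numbers in it are illustrative assumptions and are not taken from the study.

    def time_to_collision(distance_m: float, speed_m_s: float) -> float:
        """Seconds until an object moving at constant speed covers the
        remaining distance. Illustrative only; a real approaching vehicle
        (or the study's VR vehicle) need not keep a constant speed."""
        if speed_m_s <= 0:
            raise ValueError("speed must be positive")
        return distance_m / speed_m_s

    # Example: a car 30 m away, approaching at 10 m/s (36 km/h),
    # would reach the pedestrian in 3.0 seconds.
    print(time_to_collision(30.0, 10.0))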


Why Collision Judgment Matters

Collision judgment is a critical perceptual skill. It plays a role in everyday activities such as crossing a road, avoiding oncoming traffic, judging the speed of a bicycle, or navigating crowded environments. Despite its importance, very few studies have examined how people with visual impairments perform these judgments, even though vision loss affects millions of older adults worldwide.

AMD, one of the most common causes of vision loss among older adults, primarily damages central vision while often leaving peripheral vision intact. This type of vision loss can interfere with recognizing fine details, faces, and text, but how it affects motion perception and collision timing has remained unclear.


How the Virtual Reality Study Worked

To study this safely and precisely, the researchers used an advanced VR setup originally developed at Johannes Gutenberg University Mainz. Participants stood in a simulated road environment and experienced a virtual vehicle approaching them.

The experiment included three different sensory conditions:

  • Visual-only, where participants could see the approaching vehicle
  • Auditory-only, where participants could hear the vehicle but not see it
  • Audiovisual, where both sight and sound were available

At a certain point, the vehicle was occluded: it disappeared from the simulation before reaching the participant. Participants then pressed a button to indicate the moment they believed the vehicle would have reached their position.
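
As a rough sketch of how such a response can be scored, the Python snippet below compares the participant's button press with the moment the hidden vehicle would actually have arrived; the variable names and numbers are illustrative assumptions, not the authors' analysis code.

    def ttc_error_s(actual_arrival_s: float, button_press_s: float) -> float:
        """Signed estimation error, with both times measured from the moment
        the vehicle disappeared. Negative: the button was pressed too early,
        i.e. the vehicle was judged to arrive sooner than it would have.
        Positive: pressed too late. Illustrative scoring only."""
        return button_press_s - actual_arrival_s

    # Example: the hidden vehicle would have arrived 2.5 s after it
    # disappeared, but the participant pressed the button after 2.1 s,
    # an estimate 0.4 s too early.
    print(ttc_error_s(2.5, 2.1))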

The sounds used in the experiment were designed to be realistic and varied in loudness, while the visuals varied in vehicle size; this allowed the researchers to examine how each of these factors influenced judgment.

The study included adults with AMD affecting both eyes as well as a control group with normal vision.


A Surprising Core Finding

One of the most striking results of the study was that participants with AMD performed very similarly to participants with normal vision when estimating time-to-collision.

Despite impaired central vision, individuals with AMD were able to achieve nearly the same level of accuracy as those without vision loss. This outcome surprised the research team, who had expected greater reliance on sound or reduced overall performance among participants with AMD.

Instead, the findings suggest that people with AMD can adapt their perceptual strategies and make effective use of the information still available to them.


How Vision and Sound Were Used

Another key takeaway from the study is that people with vision loss did not rely solely on sound. Even with reduced central vision, participants with AMD continued to use visual information alongside auditory cues.

When both vision and sound were available, participants in both groups naturally integrated information from both senses. However, the researchers found something unexpected: having both vision and sound did not improve accuracy compared to having vision alone.

This lack of a "multimodal advantage" suggests that, for this specific task, vision remains the dominant sense for judging time-to-collision, even when vision is impaired.


Perceptual Biases Were Still Present

The study also confirmed the presence of well-known perceptual shortcuts, often called heuristics, in both groups.

When participants relied on a single sensory cue, certain biases consistently appeared:

  • Louder vehicles were judged to arrive sooner than quieter ones
  • Larger vehicles were judged to arrive sooner than smaller ones

These biases occurred in both the AMD group and the normal-vision group. However, they appeared slightly more often among participants with AMD, likely because reduced visual detail encourages reliance on less precise cues. Importantly, the difference between the groups was small, indicating that these biases are a common feature of human perception rather than a major deficit caused by vision loss.


What This Means for Real-World Safety

The findings highlight an important point: clinical measures like visual acuity do not always predict real-world functioning. Someone may have significant retinal damage yet still perform reasonably well on tasks involving motion and timing.

That said, the researchers stress caution. The experiment involved a simplified scenario: a single vehicle approaching on a straight, empty road. Real-world traffic environments are far more complex, often involving multiple vehicles, unpredictable movements, varying speeds, and distractions.

The study does not suggest that people with AMD can navigate traffic with the same level of safety as people without vision loss in all situations. Instead, it shows that under controlled conditions, their perceptual abilities may be more robust than commonly assumed.


Why This Research Matters

This work has important implications for mobility training, rehabilitation, and pedestrian safety design. Understanding how people with visual impairments judge motion and timing can inform better assistive technologies, safer urban planning, and more realistic assessments of functional vision.

The researchers also point to the growing relevance of quiet vehicles, such as electric cars, which may reduce the usefulness of auditory cues for pedestrians with or without vision loss. Future studies will need to explore how these vehicles affect collision judgment.


Additional Context: Age-Related Macular Degeneration

AMD affects millions of people globally and is a leading cause of vision loss among older adults. While it primarily damages central vision, many individuals retain functional peripheral vision and motion perception. This study adds to a growing body of evidence that vision loss does not automatically translate to complete functional impairment, especially when it comes to dynamic tasks.

However, AMD can still significantly impact daily life, particularly in complex environments. Understanding the specific strengths and limitations associated with the condition is essential for promoting independence while maintaining safety.


Looking Ahead

The research team plans to investigate more complex traffic scenarios in future studies. These may include multiple vehicles, changing speeds, intersections, and varied sound environments. Such work will help determine how well these findings generalize beyond the lab.

Ultimately, the goal is to gain a deeper understanding of how people with visual impairments interact with the world so that mobility, independence, and safety can be improved through evidence-based solutions.


Research Paper Reference:
https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0337549
