Understanding how visual cues help improve speech perception

Characterizing the recovery of spectral, temporal, and phonemic speech information from visual cues

R01 · UNIVERSITY OF MICHIGAN AT ANN ARBOR · NIH-11009943

This study examines how watching lip movements and facial expressions helps people understand speech in noisy settings, with a focus on those whose hearing is impaired by aging or brain injury. The goal is to improve communication and social interaction for these groups.

Quick facts

Phase: R01
Study type: NIH funding
Sex: All
Sponsor: UNIVERSITY OF MICHIGAN AT ANN ARBOR (NIH funded)
Locations: 1 site (ANN ARBOR, UNITED STATES)
Trial ID: NIH-11009943

What this research studies

This research investigates how visual information, such as lip movements and facial expressions, can enhance the ability to understand speech, especially in challenging listening environments. It focuses on individuals who may struggle with hearing due to age-related decline, brain injuries, or other auditory impairments. By exploring the interactions between auditory and visual signals, the study aims to develop a better understanding of how these cues can be used to improve communication and social interactions for those affected. The research employs a dual-route perceptual model to analyze how visual signals can integrate with auditory information to support speech perception.

Who could benefit from this research

Good fit: Ideal candidates include individuals with age-related hearing loss, acquired brain injuries, or other auditory deficits that make speech perception difficult.

Not a fit: People with normal hearing, or those whose speech perception difficulties do not involve auditory-visual interactions, are unlikely to benefit from this research.

Why it matters

Potential benefit: If successful, this research could lead to improved communication strategies and tools for individuals with hearing impairments, enhancing their social and emotional well-being.

How similar studies have performed: Previous research has shown promising results in using visual cues to aid speech perception, indicating that this approach has potential for further exploration.

Where this research is happening

ANN ARBOR, UNITED STATES

About this research

  1. This is an active NIH-funded research project — typically early-stage science, not a clinical trial accepting patient enrollment.
  2. Some NIH-funded labs run parallel clinical studies or seek volunteers for related work. To check, contact the principal investigator or institution listed above.
  3. For full project details, budget, and progress reports, visit the official NIH RePORTER page below.

View on NIH RePORTER →

Conditions: Acquired brain injury

Last reviewed 2026-05-15 by the Find a Trial editorial team. Information on this page is for educational purposes and is not medical advice. Always consult qualified healthcare professionals about clinical trial participation.