Using home video and AI to track babies' emotions and caregiver interactions

Validation of a Virtual Still Face Procedure and Deep Learning Algorithms to Assess Infant Emotion Regulation and Infant-Caregiver Interactions in the Wild

R01 · UNIVERSITY OF ILLINOIS AT URBANA-CHAMPAIGN · NIH-11145933

This project uses smartphone video, wearable sensors, and AI to measure how infants manage emotions and interact with caregivers in everyday home settings, with attention to babies at risk from prenatal substance exposure.

Quick facts

Phase: R01
Study type: NIH-funded research project
Sex: All
Sponsor: UNIVERSITY OF ILLINOIS AT URBANA-CHAMPAIGN (NIH funded)
Locations: 1 site (CHAMPAIGN, UNITED STATES)
Trial ID: NIH-11145933

What this research studies

We will ask caregivers to record short home videos and use small wearable sensors during a brief, guided interaction with their infant that adapts the classic 'still-face' task for everyday life. Researchers will apply deep-learning models to the videos and sensor data to identify moment-to-moment emotional reactions and interaction patterns. The approach aims to make data collection easier for families and more reflective of real home life than brief lab visits, and ultimately to create tools that could flag early signs of emotion-regulation or relationship difficulties without heavy clinic burdens.
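As a rough illustration of the "moment-to-moment" idea, a sketch is shown below of how per-frame emotion scores might be smoothed into a continuous time series. This is a hypothetical example, not the project's actual pipeline: the scores are made up, and a real system would obtain them from a trained model applied to video frames and sensor streams.

```python
# Hypothetical sketch: smoothing per-frame emotion scores (e.g. from a
# face classifier run on home-video frames) into a moment-to-moment
# time series. Scores here are invented for illustration only.
from statistics import mean

def smooth_scores(frame_scores, window=5):
    """Centered moving average over per-frame scores (0 = calm, 1 = distressed)."""
    half = window // 2
    smoothed = []
    for i in range(len(frame_scores)):
        lo = max(0, i - half)                      # clamp window at the start
        hi = min(len(frame_scores), i + half + 1)  # clamp window at the end
        smoothed.append(mean(frame_scores[lo:hi]))
    return smoothed

# Example: a brief spike in distress becomes a gradual rise and fall.
raw = [0.1, 0.1, 0.9, 0.9, 0.1, 0.1]
print([round(s, 2) for s in smooth_scores(raw, window=3)])
# → [0.1, 0.37, 0.63, 0.63, 0.37, 0.1]
```

Smoothing like this is one common way to turn noisy frame-level predictions into interpretable trajectories; the actual study may use very different methods.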

Who could benefit from this research

Good fit: Caregivers (adults) with infants or young children (0–11 years), especially families affected by prenatal substance exposure or with concerns about emotion regulation or caregiver–child interactions.

Not a fit: Families without access to a smartphone, those unwilling or unable to record home interactions, and people seeking immediate medical treatment for addiction rather than developmental monitoring are unlikely to benefit directly from participation.

Why it matters

Potential benefit: Could make it easier to detect babies who need extra support and connect families to early interventions sooner, especially after prenatal substance exposure.

How similar studies have performed: Lab-based 'still-face' tests and wearable-sensor studies have shown useful signals, but applying virtual tasks and deep-learning to natural home videos is a newer and less-tested approach.

Where this research is happening

CHAMPAIGN, UNITED STATES

About this research

  1. This is an active NIH-funded research project, which typically means early-stage science rather than a clinical trial accepting patient enrollment.
  2. Some NIH-funded labs run parallel clinical studies or seek volunteers for related work. To check, contact the principal investigator or institution listed above.
  3. For full project details, budget, and progress reports, visit the official NIH RePORTER page below.

View on NIH RePORTER →

Last reviewed 2026-05-15 by the Find a Trial editorial team. Information on this page is for educational purposes and is not medical advice. Always consult qualified healthcare professionals about clinical trial participation.