AI Struggles with Human Social Cues: Study Exposes Gaps in Predicting Dynamic Interactions
April 24, 2025
A new study suggests that deploying AI in settings such as manufacturing and healthcare, where systems must interpret dynamic visual information, could be problematic.
The challenges identified in the study could have serious implications for the development of AI technologies, particularly for self-driving cars and robots designed to interact with humans.
As companies like Figure AI and Boston Dynamics develop AI-enabled humanoid robots to work alongside humans, accurate interpretation of social cues becomes increasingly critical to prevent accidents.
Leyla Isik, the study's lead author, stressed that AI must be able to recognize human intentions and actions, for example predicting pedestrian movements and following conversations.
Isik also noted that integrating insights from neuroscience and cognitive science into AI development is essential to enhance its effectiveness in real-world social contexts.
This research serves as a reminder that despite substantial investments in autonomous technologies, the complexity of human interaction remains a significant hurdle for AI.
Kathy Garcia, a co-first author and doctoral student, emphasized that understanding the unfolding story in a scene is crucial for advancing AI capabilities.
Current limitations in AI's understanding of complex social settings have already been linked to erratic behavior in driverless cars, prompting federal safety investigations into companies such as Waymo and Zoox.
In the study, human participants rated three-second video clips of people interacting, and AI models struggled to accurately predict those human judgments.
The findings point to a significant gap in AI's ability to process dynamic scenes, in contrast with its relative success at analyzing static images, and underscore the need for improved AI model development.
The study, which compared human perception with AI model performance in a series of experiments, was presented at the International Conference on Learning Representations.
This research received support from grants provided by the National Science Foundation and the National Institutes of Health.
Summary based on 12 sources
Sources

ScienceDaily • Apr 24, 2025
Awkward. Humans are still better than AI at reading the room
Popular Science • Apr 24, 2025
AI still can’t beat humans at reading social cues
The Hub • Apr 24, 2025
Humans are still better than AI at reading the room
Bloomberg Law • Apr 24, 2025
AI Falls Short in Interpreting Social Interactions, Study Shows