Summary: New research shows that our own physical movements can alter the way we perceive emotions in others’ faces. In a virtual reality experiment, participants were more likely to judge a face as angry when they were actively moving away from it, compared to when the face was moving away from them.
The results reveal a bidirectional relationship between movement and emotion recognition, where avoidance behavior increases threat perception. These insights could help improve the design of social interaction in virtual communication and emotional artificial intelligence systems.
Key facts:
Behavior shapes perception: Actively avoiding a face made participants more likely to perceive anger, suggesting that actions influence emotion recognition.
Bidirectional link: Findings highlight a reciprocal relationship between body movement and emotional perception.
Practical implications: Could improve social realism and empathy in communication based on virtual reality and artificial intelligence.
Source: TUT
A research team from the Cognitive Neurotechnology Unit and the Visual Perception and Cognition Laboratory at Toyohashi University of Technology has found that approach-avoidance behavior in a virtual reality (VR) environment modulates the way individuals recognize facial expressions.
In particular, the study showed that participants were more likely to perceive a facial expression as “angry” when they were actively moving away from the facial stimulus than when the face was moving away from them.
These findings contribute to a better understanding of the reciprocal relationship between perception and action in social contexts.
The study was published online July 31, 2025 in the International Journal of Affective Engineering.
Facial expressions play a fundamental role in social communication. While it is well established that the expressions of others influence our behavior (such as approaching a smiling person or avoiding an angry one), the reverse effect—that is, whether our own behavior affects how we recognize others’ expressions—has been less explored.
To address this question, the research team conducted three psychophysical experiments using virtual reality. Participants wore a head-mounted display and viewed 3D facial models (avatars) under four different approach and avoidance conditions:
Active approach: The participant approached the avatar.
Active avoidance: The participant moved away from the avatar.
Passive approach: The avatar approached the participant.
Passive avoidance: The avatar moved away from the participant.
Facial expressions were generated by morphing between happy and angry (or happy and fearful) expressions at seven levels. Participants judged each expression as “happy” or “angry” (or “happy” or “fearful”), depending on the experimental condition.
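The seven-level morph-and-judge procedure lends itself to a standard psychometric summary. The sketch below is a hypothetical illustration of that kind of analysis, not the study's actual data or code: the response proportions, condition names, and the interpolation-based `pse` helper are all illustrative assumptions.

```python
# Hypothetical sketch of a psychometric summary for a 7-level
# happy-to-angry morph task. All numbers are illustrative, not
# the study's data.
import numpy as np

# Morph levels: 0.0 = fully happy, 1.0 = fully angry (seven steps).
levels = np.linspace(0.0, 1.0, 7)

# Illustrative proportion of "angry" responses at each morph level
# for two hypothetical conditions. A shift toward threat perception
# under active avoidance would appear as a lower point of subjective
# equality (PSE): "angry" is reported at less intense morphs.
p_active_avoid = np.array([0.05, 0.10, 0.30, 0.60, 0.85, 0.95, 0.98])
p_passive_avoid = np.array([0.02, 0.05, 0.15, 0.40, 0.70, 0.90, 0.97])

def pse(levels, proportions):
    """Estimate the morph level at which "angry" is chosen 50% of
    the time, by linear interpolation between bracketing levels."""
    return float(np.interp(0.5, proportions, levels))

# A positive shift means "angry" is perceived at weaker morphs when
# the participant is the one moving away.
shift = pse(levels, p_passive_avoid) - pse(levels, p_active_avoid)
print(f"Illustrative PSE shift under active avoidance: {shift:.3f}")
```

In this toy setup, the active-avoidance curve crosses 50% at a lower morph level than the passive-avoidance curve, which is the direction of effect the article describes for Experiment 1.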
The results of Experiment 1 showed that participants were more likely to recognize the avatar’s expression as “angry” when they were actively avoiding the face, compared to when the avatar was moving away from them.
This suggests that one’s own avoidance behavior can enhance the perception of threat in others’ facial expressions. The pattern supports the hypothesis that behavior and perception are linked bidirectionally.
Yugo Kobayashi, first author and doctoral student in the Department of Computer Science and Engineering, commented: “In current communication environments, such as video conferencing, opportunities for physical movement are limited. These findings suggest that face-to-face communication involving bodily actions may facilitate more natural recognition of facial expressions.”
The study provides evidence that approach and avoidance behavior itself can modulate the recognition of facial expressions. Future work will examine which aspects of these behaviors (such as motor intention, visual movement, or proprioceptive feedback) are critical for this modulation.
Funds:
This work was supported by JSPS KAKENHI (grant numbers JP21K21315, JP22K17987, JP20H05956, and JP20H04273), the Nitto Foundation, and research financial support for students of doctoral courses at Toyohashi University of Technology in fiscal year 2024.
Key questions answered:
Q: How did participants’ own movement affect their perception of facial expressions?
A: People were more likely to perceive a face as angry when they themselves were moving away from it, compared to when the face was moving away from them.
Q: Why is this finding significant?
A: It shows that our physical behavior (approach or avoidance) directly shapes how we interpret the emotions of others, revealing a feedback loop between movement and perception.
Q: What are the practical applications?
A: The findings could improve emotional AI, telepresence, and virtual environments by integrating body-based perceptual signals.
About this social neuroscience research news
Author: Shino Okazaki
Source: TUT
Contact: Shino Okazaki – TUT
Image: Image is credited to Neuroscience News.
Original research: Open access.
“Facial expression recognition is modulated by approach and avoidance behavior” by Yugo Kobayashi et al. International Journal of Affective Engineering
Abstract
Facial expression recognition is modulated by approach and avoidance behavior
Facial expression recognition influences approach and avoidance behaviors, but can these behaviors affect facial expression recognition?
We conducted psychophysical experiments using virtual reality to investigate this reverse causal relationship.
Participants responded to static 3D facial stimuli generated by morphing expressions between happy and angry in Experiments 1 and 3.
In Experiment 2, stimuli morphed between happy and fearful expressions were used. Participants approached, avoided, or were approached or avoided by the face.
The results showed that participants recognized the face as angrier when they avoided it than when it avoided them (Experiment 1); as happier during approach and more fearful during avoidance, regardless of which party moved (Experiment 2); and as angrier when the face approached them than when they approached it, when the two were physically close (Experiment 3).
These findings suggest that approach and avoidance behavior influence the recognition of facial expressions. We postulate that unconscious learning rooted in biological instincts creates this connection.