Presentation Information
[3O2-IS-3-06]Exploring Implicit Gestures in Egocentric Daily Activities for Robot Motion Design
〇Liliana Villamar Gomez1, Chenfei Xu2, Huthaifa Ahmad1, Yutaka Nakamura1 (1. Guardian Robot Project, RIKEN, 2. Graduate School of Engineering Science, Osaka University)
work-in-progress
Keywords:
Implicit gestures, Egocentric vision, Human–robot interaction
This work explores the role of implicit body motion, distinct from explicit communicative gestures, in supporting perceptually coherent behavior for simple social robots. The objective is to characterize implicit human motion from egocentric video and examine its relevance to scalable robot control.
We analyze the amplitude and temporal continuity of head, torso, and arm motion during daily activities and use these observations to formulate a two-layer gesture model, in which continuous implicit motion is persistently present and modulated by behavioral state, while explicit gestures are event-driven. Our exploratory analysis suggests that continuous implicit motion contributes to perceived aliveness and attentional coherence even in the absence of manipulation capabilities. These findings motivate future work on dataset expansion and perceptual evaluation in real robot systems.
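The two-layer model described above can be sketched in code. The snippet below is a minimal, hypothetical illustration of the idea, not the authors' implementation: a continuous implicit layer that is always active and modulated by a behavioral state, with an event-driven explicit gesture superimposed on top. All function names, states, and parameter values are illustrative assumptions.

```python
import math

# Illustrative behavioral states modulating the implicit layer
# (amplitude in rad, frequency in Hz; values are assumptions).
IMPLICIT_PARAMS = {
    "idle":      {"amplitude": 0.02, "frequency": 0.2},
    "attending": {"amplitude": 0.05, "frequency": 0.5},
}

def implicit_motion(t, state):
    """Continuous implicit layer: persistently present, state-modulated sway."""
    p = IMPLICIT_PARAMS[state]
    return p["amplitude"] * math.sin(2 * math.pi * p["frequency"] * t)

def explicit_gesture(t, onset, duration=1.0, peak=0.4):
    """Event-driven explicit layer: a transient nod-like pulse."""
    if onset is None or not (onset <= t < onset + duration):
        return 0.0
    phase = (t - onset) / duration
    return peak * math.sin(math.pi * phase)  # smooth rise and fall

def head_pitch(t, state="idle", gesture_onset=None):
    """Commanded motion = implicit layer + explicit layer."""
    return implicit_motion(t, state) + explicit_gesture(t, gesture_onset)
```

The key property of the model is visible in `head_pitch`: the implicit layer never switches off, so the robot exhibits continuous low-amplitude motion between explicit gestures rather than freezing, which is what the abstract links to perceived aliveness.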
