Human-Robot Interaction


Unobtrusive sensing of multi-modal sensory input (video, audio, and autonomic physiology). The raw data are pre-processed with computer-vision, audio, and physiology-analysis techniques to extract the child's behavioral cues. These cues are then fed into the perception module of the robot.
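The per-modality pre-processing step above can be sketched as follows. This is a minimal illustration, not the project's actual feature extractors; the class and function names, and the choice of features named in the comments, are assumptions for the example.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class BehavioralCues:
    """Per-frame behavioral cues extracted from the three raw sensor streams.
    Feature choices in the comments are illustrative assumptions."""
    face_features: List[float]    # e.g. facial action-unit intensities (video)
    audio_features: List[float]   # e.g. pitch and energy statistics (audio)
    physio_features: List[float]  # e.g. heart rate, skin conductance (autonomic physiology)

def fuse_cues(cues: BehavioralCues) -> List[float]:
    """Concatenate the modality features into one vector for the perception module."""
    return cues.face_features + cues.audio_features + cues.physio_features

# Usage: one fused cue vector per video frame
frame_cues = BehavioralCues([0.2, 0.8], [0.5], [0.1, 0.3])
cue_vector = fuse_cues(frame_cues)
```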


The robot deploys pre-trained machine-learning models to automatically estimate the child's affect and engagement from the extracted behavioral cues. For this, the proposed personalized perception network infers the child's levels of valence, arousal, and engagement continuously in time.
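The idea of personalizing a group-level model to an individual child can be sketched with a toy linear model. This is only an illustration of the general pattern (group weights adapted per child), not the network from the paper; all names, shapes, and the tanh squashing are assumptions.

```python
import math

class PerceptionModel:
    """Toy linear stand-in for a pre-trained perception network (illustrative only)."""
    def __init__(self, weights):
        # one weight vector per output dimension: valence, arousal, engagement
        self.weights = weights

    def predict(self, cue_vector):
        # linear score per output, squashed into (-1, 1) with tanh
        return {name: math.tanh(sum(w * x for w, x in zip(ws, cue_vector)))
                for name, ws in self.weights.items()}

def personalize(group_model, child_offsets):
    """Personalization sketch: shift the group-level weights by per-child offsets."""
    adapted = {name: [w + o for w, o in zip(ws, child_offsets[name])]
               for name, ws in group_model.weights.items()}
    return PerceptionModel(adapted)

# Usage: adapt the group model to one child, then run per-frame inference
group = PerceptionModel({"valence": [0.5, -0.2],
                         "arousal": [0.1, 0.4],
                         "engagement": [0.3, 0.3]})
child = personalize(group, {"valence": [0.1, 0.0],
                            "arousal": [0.0, 0.0],
                            "engagement": [-0.1, 0.1]})
estimates = child.predict([0.8, 0.6])  # fused cue vector for one frame
```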


The result of the robot's perception is used to modulate the child-robot interaction in the subsequent steps of the therapy process. For example, the robot chooses to perform one of several pre-defined behaviors based on the estimated levels of affect and engagement. To sustain the child's engagement, these behaviors are accompanied by various prompts from the robot.
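A minimal sketch of this behavior-selection step, assuming a simple threshold policy; the thresholds and behavior names are hypothetical and not taken from the paper.

```python
def choose_behavior(valence: float, arousal: float, engagement: float) -> str:
    """Map estimated affect/engagement to one of a few pre-defined robot behaviors.
    Thresholds and behavior names are illustrative assumptions."""
    if engagement < 0.3:
        return "re-engagement prompt"    # e.g. wave, call the child's name
    if valence < 0.0:
        return "calming behavior"        # negative affect: slow down, soothe
    if arousal > 0.7:
        return "slow-paced activity"     # over-aroused: reduce stimulation
    return "continue current activity"   # engaged and comfortable

# Usage: one decision per perception update
action = choose_behavior(valence=0.4, arousal=0.5, engagement=0.8)
```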

Rudovic O., Lee J., Dai M., Schuller B., Picard R. Personalized Machine Learning for Robot Perception of Affect and Engagement in Autism Therapy. arXiv:1802.01186, 2018.

More details coming soon ...