Affective Body Language of Humanoid Robots – PhD Thesis of Junchao Xu

Junchao’s PhD thesis on emotional expression through body language is available here!

Here’s a summary of his work:

Robots will be increasingly integrated into humans' daily activities. They will cooperate with us, assist us, and accompany us. Social abilities are important for such robots to interact with us harmoniously and to be accepted by us. The expression of affect is one such ability: it facilitates human understanding of a robot's behavior, rationale, and motives, and increases the perception of the robot as trustworthy, reliable, and life-like. Most current approaches focus on categorical emotional expressions, often facial expressions. A few studies have addressed bodily emotion expression, but typically as discrete body actions performed separately from functional behavior. For enduring human-robot interaction, models and methods are lacking for bodily mood expressions that the robot can show while executing functional behaviors. In this thesis, we develop body language for humanoid robots to express mood at an arbitrary time, even while executing a task, with mood represented on dimensional scales. We create a model for robot mood expression, validate the model, and investigate users' perception of the robot's mood and the effects of the mood expression on users in dyadic and group settings.
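To give a flavor of the idea of expressing mood by modulating functional behavior rather than by separate body actions, here is a minimal sketch. It assumes mood is represented on two dimensions (valence and arousal) and overlaid on a gesture by scaling style parameters such as amplitude, speed, and head position; all class and parameter names are hypothetical illustrations, not the thesis' actual model or API.

```python
# Minimal sketch: overlaying a dimensional mood on a functional gesture.
# All names below are hypothetical illustrations, not the thesis API.

from dataclasses import dataclass


@dataclass
class Mood:
    valence: float  # -1.0 (negative) .. 1.0 (positive)
    arousal: float  # -1.0 (calm)     .. 1.0 (excited)


@dataclass
class GestureParameters:
    amplitude: float  # how wide the movement is (0..1)
    speed: float      # how fast the movement is executed (0..1)
    head_up: float    # vertical head position (0..1)


def clamp(x: float, lo: float = 0.0, hi: float = 1.0) -> float:
    return max(lo, min(hi, x))


def modulate(base: GestureParameters, mood: Mood) -> GestureParameters:
    """Scale the style parameters of a functional gesture by the current mood,
    leaving the gesture's task-related goal (e.g., a pointing target) untouched."""
    return GestureParameters(
        amplitude=clamp(base.amplitude + 0.3 * mood.valence),
        speed=clamp(base.speed + 0.3 * mood.arousal),
        head_up=clamp(base.head_up + 0.4 * mood.valence),
    )


if __name__ == "__main__":
    waving = GestureParameters(amplitude=0.5, speed=0.5, head_up=0.5)
    print(modulate(waving, Mood(valence=0.8, arousal=0.6)))    # energetic, open wave
    print(modulate(waving, Mood(valence=-0.7, arousal=-0.5)))  # slow, subdued wave
```

Because the mood only modulates how a behavior is performed, the robot can express it continuously during task execution, which is the setting the thesis targets.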
