Non-Verbal Communication of Emotions in Social Robots
Aryel Beck
Supervisor: Prof. Nadia Thalmann
BeingThere Centre, Institute for Media Innovation, Nanyang Technological University, Singapore
INTRODUCTION
In order for companion robots to be socially accepted, they need to express emotions. Body language presents an ideal way of expressing emotions for humanoid robots such as:
- Nao: body with 25 degrees of freedom (0 in the face).
- Nadine: body with 27 degrees of freedom (7 in the face).
It is possible to correctly identify emotions expressed through body language alone.
Context of the Work
- ALIZ-E European Project: long-term child-robot interaction in the context of healthcare.
- FEELIX GROWING European Project: FEEL, Interact, eXpress: a Global approach to development With INterdisciplinary Grounding.
- Autonomous Virtual Humans and Social Robots for Telepresence: replace a real participant with its virtual/robotic counterpart.
MAIN QUESTIONS
Is it possible for a robot to display emotional body language without stopping its ongoing activities?
- The relationship between the positions of the different body parts and the interpretation of the expression is not known. This makes it difficult to display emotions without disturbing ongoing tasks.
- The effect of moving one joint may depend on the position of the other parts of the body.
Can a robot express emotions in a continuous space?
Affect Space of Expression
Is it possible to adapt Breazeal's affect space* to emotional body language?
*C. Breazeal, Designing Sociable Robots. MIT Press, 2002.
Nao Bodily Expression of Emotions: Expressive Poses
Is it possible to correctly identify the emotions displayed by Nao? What is the effect of moving the head on the interpretation of an emotion?
(A: Anger, B: Sadness, C: Fear, D: Pride, E: Happiness, F: Excitement)
Beck, A.; Cañamero, L.; Bard, K.A. Towards an affect space for robots to display emotional body language. IEEE RO-MAN, pp. 464-469, Sept. 2010.
Beck, A.; Stevens, B.; Bard, K.A.; Cañamero, L. 2012. Emotional body language displayed by artificial agents. ACM Trans. Interact. Intell. Syst. 2(1).
Nao Bodily Expression of Emotions
The results show that:
- Body language can be successfully used by Nao to express emotions.
- Head up was always rated as more highly aroused than head straight or head down.
- Valence and stance depended on both head position and the emotion displayed, but shifted in similar directions.
- Head position can be successfully used to change emotional expressions.
Nao Bodily Expression of Emotions: Affect Space for Body Language
An Affect Space was generated using the results of Experiment 1 and was tested empirically.
Example of key poses generated by the system: 100% Sadness; 70% Sadness / 30% Fear; 50% Sadness / 50% Fear; 30% Sadness / 70% Fear; 100% Fear.
Beck, A.; Hiolle, A.; Mazel, A.; Cañamero, L. 2010. Interpretation of emotional body language displayed by robots. In Proceedings of AFFINE '10. ACM, New York, NY, USA, 37-42.
Beck, A.; Stevens, B.; Bard, K.A.; Cañamero, L. 2012. Emotional body language displayed by artificial agents. ACM Trans. Interact. Intell. Syst. 2(1).
Nao Bodily Expression of Emotions: Experiment 2
The interpretations of the key poses suggest that the Affect Space can be used to greatly enrich the expressiveness of the robot:
- It avoids always displaying the exact same expression for a given emotion while remaining understandable.
- The system can generate expressive animations on the fly, as sketched below.
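As a concrete illustration of the blending idea, here is a minimal Python sketch. It assumes each key pose is a dictionary of joint angles and blends two poses by a weighted average; the joint names and angle values are invented for the example and are not Nao's actual calibrated poses.

```python
# Minimal sketch of blending between emotion key poses. Each key pose is a
# dict of joint angles in radians; the names and values here are
# illustrative, not Nao's actual calibrated poses.

SADNESS = {"HeadPitch": 0.40, "LShoulderPitch": 1.60, "RShoulderPitch": 1.60}
FEAR    = {"HeadPitch": 0.10, "LShoulderPitch": 1.20, "RShoulderPitch": 1.20}

def blend_poses(pose_a, pose_b, weight_b):
    """Linear blend: weight_b = 0.0 gives pose_a, 1.0 gives pose_b."""
    return {joint: (1.0 - weight_b) * pose_a[joint] + weight_b * pose_b[joint]
            for joint in pose_a}

# 70% Sadness / 30% Fear, one of the key-pose mixes listed above.
print(blend_poses(SADNESS, FEAR, 0.3))
```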
Adding Dynamic Elements to the Expressions
Dynamic elements are added to the static key poses using Perlin noise. Perlin noise (Perlin, 1995) is a well-established tool in animation and has also been used, to a much lesser degree, in robotics.
Contribution to knowledge: dynamic properties of movement that have been shown to express emotions are used to set the Perlin noise parameters (see the sketch below).
Beck, A.; Hiolle, A.; Cañamero, L. 2013. Using Perlin noise to generate emotional expressions in a robot. In Proceedings of CogSci 2013.
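To make the approach concrete, here is a minimal 1-D Perlin-style gradient noise sketch in Python, overlaid on a static joint angle. The mapping from emotion to amplitude and frequency is an assumption for illustration, not the parameter settings used in the study.

```python
import math
import random

random.seed(0)
# Pseudo-random gradients at integer lattice points (classic 1-D Perlin setup).
_GRADS = [random.uniform(-1.0, 1.0) for _ in range(256)]

def _fade(t):
    # Perlin's quintic smoothstep: 6t^5 - 15t^4 + 10t^3.
    return t * t * t * (t * (t * 6 - 15) + 10)

def perlin1d(x):
    """Smooth, band-limited noise, roughly in [-1, 1]."""
    x0 = math.floor(x)
    t = x - x0
    n0 = _GRADS[x0 % 256] * t              # contribution of the left lattice point
    n1 = _GRADS[(x0 + 1) % 256] * (t - 1)  # contribution of the right one
    return n0 + _fade(t) * (n1 - n0)

def noisy_angle(base_angle, t, amplitude, frequency):
    """Overlay noise on a static pose angle. The mapping is an assumption:
    e.g. higher arousal -> larger amplitude and higher frequency."""
    return base_angle + amplitude * perlin1d(frequency * t)
```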
Expressing Emotions
A perceptual study was conducted to test the effect of adding dynamic elements generated using Perlin noise on the perception of the displayed emotion. The study looked at the speed and jerkiness of the generated movement:
- Velocity: time taken by the robot to move, i.e. the shorter the time, the higher the velocity.
- Jerkiness: random variations applied to the duration parameter (see the sketch below).
[Video]
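A hedged sketch of how these two factors might be parameterised, assuming a movement is executed as a sequence of timed segments; the function and the numeric values are hypothetical, not the study's implementation.

```python
import random

def segment_durations(total_duration, n_segments, jerkiness):
    """Velocity is controlled by total_duration (shorter = faster);
    jerkiness adds random variation to each segment's duration
    (0.0 = smooth, evenly timed motion)."""
    seg = total_duration / n_segments
    return [seg * (1.0 + random.uniform(-jerkiness, jerkiness))
            for _ in range(n_segments)]

# Fast, jerky movement vs. a slow, smooth one (illustrative values).
fast_jerky = segment_durations(total_duration=0.8, n_segments=8, jerkiness=0.5)
slow_smooth = segment_durations(total_duration=2.5, n_segments=8, jerkiness=0.0)
```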
Results of the Perceptual Study
- Key pose had a significant effect on perceived Valence (F(4,72) = 33.26, p < .01) and on perceived Arousal (F(4,72) = 13.29, p < .01).
- Velocity had a significant effect on perceived Arousal (F(2,36) = 93.60, p < .01).
- Jerkiness had a significant effect on perceived Arousal (F(1,18) = 27.51, p < .01).
Combining Facial and Bodily Expressions
In comparison to Nao, the Nadine robot can use a combination of bodily and facial expressions to display emotions. Nadine has no SDK; everything is developed within IMI/BTC.
Robot Controller
The software to control the robot is developed within IMI/BTC. It must:
- Allow for the synchronized display of body movements, expressions, and idle movements along with speech.
- Respond in real time.
- Produce believable movements.
- Express emotions.
Main Classes of the Controller:
- i2p Agent Control Server: i2p interface that receives instructions from the network.
- Nadine Controller: executes the commands, synchronizes the output, and sends one frame to the checker every 30 ms.
- Text to Speech: synthesizes the speech and produces the lip animation.
- Joint: stores the trajectory and state of each joint.
- XML Library of Animations: loads and stores the pre-defined animations (XML).
- Online Movement Generation: inverse kinematics and gaze.
A minimal sketch of the fixed-rate frame loop follows below.
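As a hedged illustration of the 30 ms frame loop, here is a minimal self-contained Python sketch. The JointTrajectory class and send_frame hook are hypothetical stand-ins for the actual Joint class and frame checker; only the 30 ms frame period comes from the description above.

```python
import time

FRAME_PERIOD = 0.030  # one frame every 30 ms, as in the controller above

class JointTrajectory:
    """Hypothetical stand-in for the Joint class: linearly interpolates
    between a start and a target angle over a fixed duration."""
    def __init__(self, start, target, duration):
        self.start, self.target, self.duration = start, target, duration

    def sample(self, t):
        alpha = min(t / self.duration, 1.0)
        return self.start + alpha * (self.target - self.start)

def run_controller(joints, send_frame, n_frames):
    """Fixed-rate loop: each tick, sample every joint and ship one
    synchronized frame downstream (e.g. to the frame checker)."""
    t = 0.0
    next_tick = time.monotonic()
    for _ in range(n_frames):
        send_frame({name: traj.sample(t) for name, traj in joints.items()})
        t += FRAME_PERIOD
        next_tick += FRAME_PERIOD
        time.sleep(max(0.0, next_tick - time.monotonic()))

# Example: move the head pitch from 0.0 to 0.4 rad over 1.5 s.
run_controller({"HeadPitch": JointTrajectory(0.0, 0.4, 1.5)}, print, n_frames=5)
```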
Nadine Robot Controller
Thank you for your attention! Any questions?