Non Verbal Communication of Emotions in Social Robots

Aryel Beck
Supervisor: Prof. Nadia Thalmann
BeingThere Centre, Institute for Media Innovation, Nanyang Technological University, Singapore

INTRODUCTION
For companion robots to be socially accepted, they need to express emotions. Body language is an ideal channel for expressing emotions with humanoid robots such as:
Nao: body with 25 degrees of freedom (0 in the face).
Nadine: body with 20 degrees of freedom (7 in the face).
It is possible to correctly identify emotions expressed through body language alone.

Context of the work
ALIZ-E European Project: long-term child-robot interaction in the context of healthcare.
Feelix Growing European Project: FEEL, Interact, eXpress: a Global approach to development With INterdisciplinary Grounding.
Autonomous Virtual Humans and Social Robots for Telepresence: replace a real participant with its virtual/robotic counterpart.

MAIN QUESTIONS
Is it possible for a robot to display emotional body language without interrupting its ongoing activities? The relationship between the positions of the different body parts and the interpretation of the expression is not known, which makes it difficult to display emotions without disturbing ongoing tasks. The effect of moving one joint may depend on the position of the other parts of the body.
Can a robot express emotion in a continuous space?

Affect Space of Expression
Is it possible to adapt Breazeal's affect space* to emotional body language?
*C. Breazeal, Designing Sociable Robots. MIT Press, 2002.
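To make the idea concrete, here is a minimal sketch, assuming a Breazeal-style affect space with three dimensions (valence, arousal, stance) and the six emotions studied below placed at illustrative coordinates; the coordinate values are assumptions for illustration, not measurements from this work.

```python
# A minimal sketch, not the authors' implementation: the six emotions
# studied here placed as points in a Breazeal-style 3D affect space.
# All coordinate values are illustrative assumptions.
from dataclasses import dataclass

@dataclass(frozen=True)
class AffectPoint:
    valence: float  # negative .. positive, here in [-1, 1]
    arousal: float  # calm .. excited, here in [-1, 1]
    stance: float   # closed .. open, here in [-1, 1]

EMOTIONS = {
    "anger":      AffectPoint(valence=-0.8, arousal=0.8,  stance=-0.5),
    "sadness":    AffectPoint(valence=-0.7, arousal=-0.6, stance=-0.6),
    "fear":       AffectPoint(valence=-0.8, arousal=0.6,  stance=-0.8),
    "pride":      AffectPoint(valence=0.6,  arousal=0.4,  stance=0.7),
    "happiness":  AffectPoint(valence=0.8,  arousal=0.5,  stance=0.6),
    "excitement": AffectPoint(valence=0.7,  arousal=0.9,  stance=0.5),
}
```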

Nao Bodily Expression of Emotions: Expressive Poses
Is it possible to correctly identify the emotions displayed by Nao? What is the effect of moving the head on the interpretation of an emotion? (A: Anger, B: Sadness, C: Fear, D: Pride, E: Happiness, F: Excitement)
Beck, A.; Cañamero, L.; Bard, K.A. 2010. Towards an Affect Space for robots to display emotional body language. In Proc. IEEE RO-MAN, pp. 464-469.
Beck, A.; Stevens, B.; Bard, K.; Cañamero, L. 2012. Emotional body language displayed by artificial agents. ACM Trans. Interact. Intell. Syst. 2, 1.

Nao Bodily Expression of Emotions
The results show that:
Body language can be successfully used by Nao to express emotions.
Head up was always rated as more highly aroused than head straight or head down.
Valence and stance depended on both head position and the emotion displayed, but varied in similar directions.
Head position can be successfully used to change emotional expressions.

Nao Bodily Expression of Emotions: Affect Space for Body Language
An Affect Space was generated using the results of Experiment 1 and was tested empirically. Example of key poses generated by the system: 100% Sadness; 70% Sadness, 30% Fear; 50% Sadness, 50% Fear; 30% Sadness, 70% Fear; 100% Fear.
Beck, A.; Hiolle, A.; Mazel, A.; Cañamero, L. 2010. Interpretation of emotional body language displayed by robots. In Proc. AFFINE '10. ACM, New York, NY, USA, pp. 37-42.
Beck, A.; Stevens, B.; Bard, K.; Cañamero, L. 2012. Emotional body language displayed by artificial agents. ACM Trans. Interact. Intell. Syst. 2, 1.
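A hypothetical sketch of how such blended key poses could be produced, assuming each key pose is stored as a dict of joint angles in radians (the joint names follow Nao's naming convention; the angle values are invented for illustration):

```python
# Illustrative key poses: joint angles in radians. The values here are
# made up; the actual poses come from Experiment 1.
SADNESS_POSE = {"HeadPitch": 0.45, "LShoulderPitch": 1.6, "RShoulderPitch": 1.6}
FEAR_POSE    = {"HeadPitch": 0.30, "LShoulderPitch": 1.2, "RShoulderPitch": 1.2}

def blend_poses(weighted_poses):
    """Linearly interpolate joint angles across weighted key poses."""
    total = sum(weight for weight, _ in weighted_poses)
    blended = {}
    for weight, pose in weighted_poses:
        for joint, angle in pose.items():
            blended[joint] = blended.get(joint, 0.0) + (weight / total) * angle
    return blended

# 70% Sadness, 30% Fear: one intermediate point in the Affect Space.
pose = blend_poses([(0.7, SADNESS_POSE), (0.3, FEAR_POSE)])
```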

Nao Bodily Expression of Emotions: Experiment 2
The interpretations of the key poses suggest that the Affect Space can be used to greatly enrich the expressiveness of the robot. It can be used to avoid always displaying exactly the same expression for an emotion while remaining understandable. The system can generate expressive animations on the fly.

Adding Dynamic Elements to the Expressions
Dynamic elements are added to the static key poses using Perlin Noise. Perlin Noise (Perlin, 1995) is a well-established tool in animation and has also been used, to a much lesser degree, in robotics.
Contribution to knowledge: dynamic properties of movement that have been shown to express emotions are used to set the Perlin Noise parameters.
Beck, A.; Hiolle, A.; Cañamero, L. 2013. Using Perlin Noise to Generate Emotional Expressions in a Robot. In Proc. Cog Sci 2013.
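As an illustration of the technique, the following sketch layers 1D Perlin-style gradient noise onto a static key-pose angle. The amplitude and frequency parameters stand in for the emotion-dependent settings described in the paper; their names and values are assumptions.

```python
import math
import random

# 1D Perlin-style gradient noise: random gradients at integer lattice
# points, smoothly interpolated in between (zero at every lattice point).
random.seed(42)
_GRADS = [random.uniform(-1.0, 1.0) for _ in range(256)]

def _fade(t):
    # Perlin's fade curve: 6t^5 - 15t^4 + 10t^3
    return t * t * t * (t * (t * 6 - 15) + 10)

def perlin1d(x):
    i0 = math.floor(x)
    t = x - i0
    g0 = _GRADS[i0 % 256]
    g1 = _GRADS[(i0 + 1) % 256]
    f = _fade(t)
    # Blend the contributions of the two neighbouring gradients.
    return (1 - f) * g0 * t + f * g1 * (t - 1)

def noisy_angle(base_angle, time_s, amplitude=0.05, frequency=1.5):
    """Add smooth pseudo-random variation to a static key-pose angle."""
    return base_angle + amplitude * perlin1d(time_s * frequency)
```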

Expressing Emotions
A perceptual study was conducted to test the effect of adding dynamic elements generated using Perlin Noise on the perception of the displayed emotion. The study looked at the speed and jerkiness of the generated movement:
Velocity: the time taken by the robot to move, i.e. the shorter the time, the higher the velocity.
Jerkiness: random variations of the duration parameter.
[Video demonstration]
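A hedged sketch of how these two factors could parameterize a movement, with velocity controlled by the total movement time and jerkiness by random jitter on per-segment durations; names and ranges are assumptions, not the study's actual code.

```python
import random

def segment_durations(n_segments, total_time_s, jerkiness=0.0):
    """Split a movement into segments; jitter each duration for jerkiness."""
    base = total_time_s / n_segments
    durations = [base * (1.0 + random.uniform(-jerkiness, jerkiness))
                 for _ in range(n_segments)]
    # Rescale so the overall time (and hence velocity) is unchanged.
    scale = total_time_s / sum(durations)
    return [d * scale for d in durations]

# Higher velocity = shorter total time; jerkiness in [0, 1).
fast_smooth = segment_durations(10, total_time_s=0.8, jerkiness=0.0)
slow_jerky  = segment_durations(10, total_time_s=2.5, jerkiness=0.5)
```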

Results of the Perceptual Study
Key pose had a significant effect on perceived Valence (F(4,72)=33.26, p<0.01) and on perceived Arousal (F(4,72)=13.29, p<0.01).
Velocity had a significant effect on perceived Arousal (F(2,36)=93.60, p<0.01).
Jerkiness had a significant effect on perceived Arousal (F(1,18)=27.51, p<0.01).

Combining Facial and Bodily Expressions
In contrast to Nao, the Nadine robot can use a combination of body and facial expressions to display emotions. Nadine has no SDK; everything is developed within IMI/BTC.

Robot Controller
The software controlling the robot is developed within IMI/BTC. It must:
Allow the synchronized display of body movements, facial expressions, and idle movements along with speech.
Respond in real time.
Produce believable movements.
Express emotions.

Main classes of the controller:
I2p Agent Control Server: i2p interface that receives instructions from the network.
Nadine Controller: executes the commands, synchronizes the output, and sends one frame to the checker every 30 ms.
Text to Speech: synthesizes the speech and produces the lip animation.
Joint: stores the trajectory and state of each joint.
XML Library of Animations: loads and stores the pre-defined animations (XML).
Online Movement Generation: inverse kinematics and gaze.
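A minimal sketch of the 30 ms frame loop described above; class, function, and parameter names are hypothetical, since the actual IMI/BTC controller code is not public.

```python
import time

FRAME_PERIOD_S = 0.030  # one frame to the checker every 30 ms

def control_loop(channels, send_frame, running):
    """Merge active animation channels and emit one frame per period.

    channels: callables returning {joint_name: angle} for the current time,
              e.g. speech/lip sync, idle movements, expressive gestures.
    send_frame: callable forwarding the merged frame to the checker.
    running: callable returning False when the loop should stop.
    """
    next_tick = time.monotonic()
    while running():
        frame = {}
        for channel in channels:
            # Later channels override earlier ones for shared joints.
            frame.update(channel(time.monotonic()))
        send_frame(frame)
        next_tick += FRAME_PERIOD_S
        time.sleep(max(0.0, next_tick - time.monotonic()))
```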

Nadine Robot Controller

Thanks a lot for your attention! Any questions?