Associated Emotion and its Expression in an Entertainment Robot QRIO


Fumihide Tanaka (1), Kuniaki Noda (1), Tsutomu Sawada (2), Masahiro Fujita (1,2)

(1) Life Dynamics Laboratory Preparatory Office, Sony Corporation, Tokyo, Japan
{boom, noda, mfujita}@pdp.crl.sony.co.jp
(2) Information Technologies Laboratories, Sony Corporation, Tokyo, Japan
tsawada@pdp.crl.sony.co.jp

Abstract. We humans associate and memorize situations together with the emotional feelings we have at the time, and these experiences affect our daily behavior. In this paper, we present our attempt to design this capability into the entertainment robot QRIO, aiming for more genuine human-robot interaction.

1. Introduction

We consider that entertainment robots have broad potential for making our lives more enjoyable, and we have developed several products and applications in this area. After releasing AIBO [1], a pet-type quadruped robot, we have also been developing a biped humanoid robot, QRIO [2,3,4], steadily widening its range of abilities. Among these manifold directions, in this paper we focus on the interaction between a human and a robot, especially on topics around emotion [5,6], which we consider an important factor in realizing entertainment applications for humans. We address three issues: realizing personalized interaction, dependency on experience records, and clear expression of robotic emotions.

First, to realize personalized interaction between a person and a robot, the robot should be able to recognize the person as an individual and alter its behavior accordingly. QRIO can recognize and identify humans using sensory information such as vision and audio [2], and this ability plays an important role here as well. Secondly, the interaction process continues throughout daily life, and therefore the robot should be able to accumulate experiences in its memory, affecting not only its behaviors but also its emotions. This is a central point of this paper. As will be presented in Sections 3 and 4, QRIO can associate variations of its internal state variables with the corresponding situation, and update its memory by accumulating many such experiences. This mechanism makes it possible to create trauma-like behavior: merely seeing an object scares QRIO, because there were dangerous experiences with it before. Finally, besides these abilities, the robot should be able to exhibit or express its emotions in clear ways. By exploiting its rich motion control system, we implemented the above ideas on QRIO with vivid behaviors, as described in Section 5.

2. QRIO: a Small Biped Entertainment Robot [2,3,4]

Fig. 1 illustrates the appearance of QRIO. It is a stand-alone autonomous robot that interacts with people and the environment through various abilities: walking, running, dancing, singing songs, playing soccer, throwing a ball, making conversation, and so on. Please refer to [2,3,4] for more details about its specification and abilities.

Fig. 1. Appearance of QRIO: (a) front, (b) side, (c) back. Labeled components: A. Stereo Camera, B. Speaker, C. Shoulder Switch, D. Distance Sensor (Head), E. Expression Lamp (Eye), F. Expression Lamp (Chest) / Power Switch, G. Mode Lamp, H. Distance Sensor (Hand), I. Expression Lamp (Ear), J. Multi Microphone, K. Grip Switch.

3. Software Components in the EGO Architecture [7]

In this section, a brief overview of the software components inside QRIO is presented (Fig. 2). As the basic behavior and motion control architecture, we adopt the EGO (Emotionally GrOunded) architecture [7]. Its main strategy follows an ethological model [8,9]: behavior control is based on homeostasis, in which the robot selects behaviors that regulate and maintain its internal state within a certain acceptable range.

Fig. 2. Overview of the EGO Architecture

The perception part collects all input information from the outside environment and from inside QRIO using various sensors. Parts of this information are processed by several recognition engines, such as face recognition (FR) and general object recognition (OBJR). The memory part consists of the short-term memory (STM) and the long-term memory (LTM). STM integrates the results of perception in many ways; for example, it receives not only speech recognition results but also the sound source direction. The LTM and the internal model part (DIA, ISM, and EMG) will be explained in the next section. The behavior and motion control part consists of the situated behavior layer (SBL) and the motion controller (MC). The SBL [10] contains multiple behavior modules and determines QRIO's actions, which are executed by the MC. The details of the above are outside the scope of this paper; please refer to other literature [10] for a fuller description.
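The homeostatic behavior selection described above can be sketched in a few lines. This is a hypothetical illustration, not Sony's implementation: the internal variables, their acceptable ranges, and the behavior effects below are invented for the example.

```python
# Toy homeostasis-based behavior selection: pick the behavior whose
# predicted outcome best returns the internal state to its acceptable range.
# All names and numbers are illustrative assumptions.

ACCEPTABLE = {"HUNGER": (0, 60), "FATIGUE": (0, 50)}   # target ranges
EFFECTS = {                                            # assumed effect of each behavior
    "recharge": {"HUNGER": -40},
    "rest":     {"FATIGUE": -30},
    "play":     {"HUNGER": +10, "FATIGUE": +15},
}

def deviation(state):
    """Total distance of internal variables from their acceptable ranges."""
    d = 0.0
    for var, (lo, hi) in ACCEPTABLE.items():
        v = state[var]
        d += max(lo - v, 0) + max(v - hi, 0)
    return d

def select_behavior(state):
    """Choose the behavior minimizing the predicted post-behavior deviation."""
    def predicted(behavior):
        new = dict(state)
        for var, delta in EFFECTS[behavior].items():
            new[var] = new[var] + delta
        return deviation(new)
    return min(EFFECTS, key=predicted)

print(select_behavior({"HUNGER": 80, "FATIGUE": 20}))  # deviated HUNGER -> "recharge"
```

The point of the homeostatic loop is that no behavior is intrinsically "best": the same repertoire yields different choices as the internal state drifts.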

4. Emotion-Grounding in the Internal Model Part

The internal model part (Figs. 2, 3) takes charge of QRIO's internal states and emotions. The internal state model (ISM) maintains the internal state variables: HUNGER, FULLNESS, PAIN, COMFORT, FATIGUE, and SLEEPINESS [11]. They vary with the passage of time and with external stimuli such as face detection. Some of the variables also relate to internal information such as battery level or temperature.

Fig. 3. Overview of the Internal Model Part

The Delta-Internal value associator (DIA) associates variations of the internal state variables with the current situation, which is composed of outputs from the perception and recognition parts. When a variation above a threshold occurs in the ISM, the ISM sends the variation vector of the internal state variables to the DIA. The DIA then associates this vector with the current outputs from FR, OBJR, and any other perceptual data. The association can be implemented by any function approximation system, such as an artificial neural network, or by a simple rule base stored in the long-term memory (LTM). After several learning (association) experiences, the DIA can recall the variation of the internal state variables just by seeing the corresponding person, object, or both.

The emotion generator (EMG) contains an emotion model composed of 6+1 emotions: JOY, SADNESS, ANGER, SURPRISE, DISGUST, FEAR, and NEUTRAL. Each emotion has an associated value determined from the self-preservation values SELF_CRISIS and SELF_CRISIS_EXPECTATION, which in turn are calculated from the ISM values (and their variation vector from the DIA, when one is available).
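The DIA's learn-and-recall cycle can be illustrated with the simple rule-base variant mentioned above, using a dictionary standing in for the LTM. The threshold value, the situation keys, and the running-average rule are assumptions for illustration; the actual system may use a neural network instead.

```python
# Minimal sketch of the Delta-Internal value associator (DIA) idea:
# associate the variation vector of internal state variables with the
# currently perceived situation, and recall it later from perception alone.
# Threshold, keys, and averaging are illustrative assumptions.

THRESHOLD = 10.0   # minimum variation magnitude that triggers learning
LTM = {}           # situation key -> learned variation vector

def learn(situation, delta):
    """Associate the current situation (e.g. a recognized face and/or
    object) with the internal-state variation vector `delta`."""
    if max(abs(v) for v in delta.values()) < THRESHOLD:
        return     # below threshold: ISM does not notify the DIA
    key = frozenset(situation)
    old = LTM.get(key, {})
    # keep a running average of the observed variations
    LTM[key] = {k: (old.get(k, 0.0) + delta[k]) / (2 if k in old else 1)
                for k in delta}

def recall(situation):
    """Recall the associated variation just from perceiving the situation."""
    return LTM.get(frozenset(situation), {})

learn({"face:A"}, {"PAIN": +40.0})   # a painful experience with person A
print(recall({"face:A"}))            # -> {'PAIN': 40.0}
```

A function approximator such as a neural network would replace the exact-match dictionary lookup and add generalization across similar situations, which is exploited in Section 5.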

5. Expression of Associated Emotion

We applied the above ideas to QRIO interacting with a human. The far left of Fig. 4 plots a typical transition of emotion values over time (only NEUTRAL and FEAR are shown). A person approaches and twists QRIO's hand (the 2nd picture). While QRIO observes his face, the ISM value PAIN rises, which leads to an increase in the value of FEAR at t=31. At the same time, the DIA learns the association between the face and the PAIN variation, updating the LTM. After the person goes away, the value of FEAR decreases gradually. Then he comes again in front of QRIO. The DIA computes the associated variation by consulting the LTM, and the value of FEAR rises again at t=145, even though the hand is not twisted. This change propagates down to the behavior and motion control part, and QRIO expresses the gesture and voice of FEAR (the 3rd picture). We can also test continuous facial change in CG (the far right picture).

Fig. 4. Association of PAIN and expression of FEAR (left: emotion values NEUTRAL and FEAR, 0-100, plotted over time [s])

Another person comes (the far left of Fig. 5) and strokes QRIO's head while holding chocolate in his hand. This time, the COMFORT value increases (because of the stroking) while QRIO recognizes his face and the chocolate. This association is learned by the DIA using a neural network stored in the LTM. Thanks to its generalization property, QRIO can recall the COMFORT variation and express the motion and voice of JOY not only when shown both the face and the chocolate (the 2nd picture) but also when shown the face alone (the 3rd picture).

Fig. 5. Association of COMFORT and expression of JOY
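The qualitative shape of the FEAR curve in Fig. 4, a jump on direct PAIN, a gradual decay, then a second jump on mere recognition of the person, can be mimicked with a toy update rule. The decay factor, time scale, and the direct mapping from (observed or recalled) PAIN to FEAR are invented for this sketch and do not reflect the actual EMG computation.

```python
# Toy dynamics of the FEAR emotion value (range 0..100): it is raised by
# PAIN, whether directly observed or recalled via the DIA association,
# and decays geometrically once the stimulus is gone.
# Decay rate and mapping are illustrative assumptions.

DECAY = 0.9

def step(fear, recalled_pain):
    """One time step of the FEAR value."""
    fear = fear * DECAY                           # gradual decrease over time
    fear = max(fear, min(100.0, recalled_pain))   # crisis (expectation) raises it
    return fear

fear = 0.0
trace = []
for t in range(10):
    # person twists QRIO's hand at t=1; reappears (recognition only) at t=7
    pain = 80.0 if t in (1, 7) else 0.0
    fear = step(fear, pain)
    trace.append(round(fear, 1))
print(trace)
```

The second peak is the trauma-like behavior described in Section 1: the recalled variation alone is enough to drive the expression of FEAR.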

6. Conclusions

We are developing entertainment robots for humans in daily life. To this end, it is important for the robot to be able to identify with humans, and vice versa. Associated emotion, that is, emotion associated with the robot's situation at the time, can be implemented using the internal model presented in this paper. Emotion expression is another crucial ability, and we let QRIO express emotions by exploiting its powerful behavior control system. Regarding facial expression [12], we will carefully evaluate the available technologies in order to realize even more genuine interaction.

References

1. Fujita, M., Kitano, H.: Development of a Quadruped Robot for Robot Entertainment. Autonomous Robots, Vol. 5, Kluwer Academic (1998) 7-18
2. Fujita, M., et al.: SDR-4X II: A Small Humanoid as an Entertainer in Home Environment. Int. Symposium of Robotics Research (2003)
3. Ishida, T., et al.: Development of Mechanical System for a Small Biped Entertainment Robot. IEEE Int. Workshop on Robot and Human Interactive Communication (2003)
4. Kuroki, Y., et al.: A Small Biped Entertainment Robot Exploring Human-Robot Interactive Applications. IEEE Int. Workshop on Robot and Human Interactive Communication (2003)
5. Ogata, T., Sugano, S.: Consideration of Emotion Model and Primitive Language of Robots. In: Kitamura, T. (ed.): What Should be Computed to Understand and Model Brain Function?, World Scientific (2001)
6. Breazeal, C.: Designing Sociable Robots. MIT Press (2002)
7. Fujita, M., et al.: Autonomous Behavior Control Architecture of Entertainment Humanoid Robot SDR-4X. IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (2003)
8. Lorenz, K.: The Foundations of Ethology. Springer-Verlag (1981)
9. Arkin, R., et al.: Ethological and Emotional Basis for Human-Robot Interaction. Robotics and Autonomous Systems, Vol. 42, Elsevier (2003) 191-201
10. Sawada, T., Takagi, T., Fujita, M.: Behavior Selection and Motion Modulation in Emotionally Grounded Architecture for QRIO SDR-4X II. IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (2004, submitted)
11. Ekman, P., Davidson, R.J.: The Nature of Emotion. Oxford University Press (1994)
12. Takanishi, A.: An Anthropomorphic Robot Head having Autonomous Facial Expression Function for Natural Communication with Human. Int. Symposium of Robotics Research (1999)