Intent Expression Using Eye Robot for Mascot Robot System


Yoichi Yamazaki, Fangyan Dong, Yuta Masuda, Yukiko Uehara, Petar Kormushev, Hai An Vu, Phuc Quang Le, and Kaoru Hirota
Department of Computational Intelligence and Systems Science, Tokyo Institute of Technology, Japan
{yama, tou, hirota}@hrt.dis.titech.ac.jp

Abstract: An intent expression system using eye robots is proposed for a mascot robot system from the viewpoint of humatronics. The eye robot aims at providing a basic interface method for an information terminal robot system. To achieve better understanding of the displayed information, the importance and the degree of certainty of the information should be communicated along with the main content. The proposed intent expression system conveys this additional information using the eye robot system. Eye motions are represented as states in a pleasure-arousal space model. Changes in the model state are calculated by fuzzy inference according to the importance and degree of certainty of the displayed information. These changes influence the arousal-sleep coordinate of the space, which corresponds to the level of liveliness during communication. The eye robot provides a basic, easily understood interface for the mascot robot system as an information terminal for home environments in a humatronics society.

Keywords: intent expression, eye robot, fuzzy inference, humatronics

I. INTRODUCTION

As technology advances, robots are expected to become widespread in home environments. Robots can provide an easily understandable interface for an information terminal, one that doesn't require special input devices such as a keyboard or a mouse [1][2][3][4]. From a humatronics point of view, the necessary elements for a domestic information appliance robot are IT (Information Technology), recognition technology, and emotion expression technology. Several new results have already been achieved regarding IT and recognition technology.
Emotion expression technology is necessary to allow humans to understand robots [5][6][7][8][9]. It makes humans and robots more comfortable and friendly when communicating with each other. Mentality expression is an essential component of friendly robot communication, and eye motions are suitable for the subtle expression of emotions [10]. A mentality expression system using an eye robot has already been proposed [11][12] for communication between human beings and robots. The expression of emotions is based on an affinity pleasure-arousal space, where the mentality status is calculated by fuzzy inference from a speech understanding module. The constructed eye robot expresses mentality through easily comprehensible eye gestures and constitutes an interface for a new type of user-friendly information terminal, the so-called mascot robot system [11]. The mascot robot system consists of speech recognition modules, information recommendation modules, and the eye robots, which are integrated with RT (Robot Technology) Middleware developed by AIST Japan. Its development is part of the ongoing project "Development Project for a Common Basis of Next-Generation Robots" led by NEDO (New Energy and Industrial Technology Development Organization).

In addition to the mentality expression system, an intent expression system using an eye robot is proposed for the mascot robot system, again from a humatronics point of view. The eye robot aims at providing a basic interface for an information terminal robot system. To achieve better understanding of the displayed information, the importance and the certainty of the information should be communicated along with the main content. The proposed intent expression system conveys this additional information using the eye robot. Eye motions are represented as states in a pleasure-arousal space model. Changes in the model state are calculated by fuzzy inference according to the importance and degree of certainty of the displayed information. These changes influence the arousal-sleep coordinate of the space, which corresponds to the level of liveliness during communication. The eye robot provides a basic interface for the mascot robot system that is easily understandable as an information terminal for home environments in a humatronics-based society.

An overview of the mascot robot system is given in II. A fuzzy inference method for intent expression is presented in III. Experimental results using the eye robot are presented in IV.

This work was supported by the Development Project for a Common Basis of Next-Generation Robots (sponsored by NEDO, Japanese government).

II. MASCOT ROBOT SYSTEM

The IT society is a society in which everyone can enjoy the benefits of IT at any time and in any place. Home robots are expected to function as a convenient and friendly interface to IT systems in the home environment. In this paper, a mascot robot system is proposed as an easily accessible information terminal. Three components, i.e., an information proposal module, which offers information and choices that users may select from, a speech recognition module, which can easily be used by anyone, and five friendly eye robots, are integrated into the mascot robot system with the aid of RT Middleware developed by AIST Japan. The system's architecture is shown in Fig. 1.

Fig. 1 Architecture of the mascot robot system

The information recommendation module collects information about users from their speech and proposes information of interest to them [11]. The speech recognition module (Fig. 2) was developed for general robot systems by NEC Corporation within the Development Project for a Common Basis of Next-Generation Robots. It is compact and has high performance, i.e., small size, low cost, low power consumption, and audio input.

Fig. 2 Speech recognition module

The eye robot with mentality expression has already been proposed for communication between humans and robots. The expression of emotions is based on an affinity pleasure-arousal space, where the mentality status is calculated by fuzzy inference from a speech understanding module. The developed eye robot expresses mentality through easily comprehensible eye gestures and constitutes the friendly mascot robot system [11]. Four eye robots are stationary, and one is mounted on a mobile robot (Figs. 3, 4, 5). The ability to move allows the robot to approach people, since the speech recognition module must be close to the sound source to function properly.

Fig. 3 Structure of the eye robot (eyelid part: 2 D.O.F, ocular part: 3 D.O.F, total: 5 D.O.F)
Fig. 4 Five eye robots for intent expression
Fig. 5 Mobile type eye robot

All modules are integrated with RT Middleware as shown in Fig. 6, which has been developed by the National Institute of Advanced Industrial Science and Technology (AIST) with the aim of building a common basic framework for the robot industry. Its development is part of the ongoing project "Development Project for a Common Basis of Next-Generation Robots" led by NEDO (New Energy and Industrial Technology Development Organization).

Fig. 6 System construction of the five robots based on RTM

III. EYE ROBOT BASED INTENT EXPRESSION SYSTEM

A. Eye Robot

The eye robot, which can express eye motions, is developed based on the mechanism of human eyes. Eye motion consists of the combination of eyelid motion and ocular motion. The structures of the eyelid part and the ocular part are shown in Fig. 3. The eye robot has 2 degrees of freedom (D.O.F) for the eyelid part and 3 D.O.F for the ocular part. Each eyelid has a palpebra superior and a palpebra inferior; the two palpebras are linked to each other and share 1 D.O.F for opening, closing, and blinking, so the eyelid part as a whole has 2 D.O.F. The remaining 3 D.O.F are given to the ocular part. With only 5 degrees of freedom in total, the eye robot needs fewer actuators than a facial expression system would require, so this design is appropriate for multi-robot arrangements, especially from a cost point of view.

B. Intent Expression by the Eye Robot System

A mentality expression system using the eye robot was proposed in [11]. Its input is language category information generated by a speech recognition system, and its output is the expression of mentality using motions of the robot's eyes. The mechanical part of the eye robot that produces the eye motions is based on the mechanisms of the human eye for reasons of familiarity. The eye robot has an eyelid part and an ocular part.

The modality of expression with the eyelids and the ocular part is based on a two-dimensional pleasure-arousal plane, which has a pleasure-displeasure axis and an arousal-sleep axis. The pleasure-displeasure axis depends on the approval of the interlocutor, which is determined from the speech recognition module. The arousal-sleep axis relates to liveliness during communication. To take into consideration the continuous transition of mentality during interlocution, the three-dimensional affinity pleasure-arousal space is proposed as an extension of the pleasure-arousal plane. The motivation for expression is determined by fuzzy inference in the affinity pleasure-arousal space (Fig. 7).

Fig. 7 Affinity pleasure-arousal space (input: language category information from the speech recognition system, i.e., CAT1: positive to robot, CAT2: negative to robot, CAT3: positive of interlocutor, CAT4: negative of interlocutor, CAT5: big word, CAT6: greeting; fuzzy inference maps these to mentality expression with the eyelid and eyeball)

However, it is not enough for a robot application communicating with humans to merely show the user the necessary information through the display unit, and the speech recognition part cannot recognize inputs using only one feature. Information display parts have an order of precedence when showing information, so the system must be able to present information comprehensibly in addition to the main content. Human beings use nonverbal messages to support verbal information and thereby communicate with each other effectively. There are mainly four types of nonverbal message: emotional messages that show the mentality state of the sender, emotional messages that show agreement or disagreement with verbal information, meta-communication messages that alter speech timing, and specific gestures. Robots need to use these methods just as people do in order to carry out friendly communication. Eyes are important for communication, and their mechanisms are simple to build. The mentality expression system using an eye robot has already been proposed for communication between human beings and robots, and it has been successful at offering a friendly atmosphere. There has been, however, no method to show emotional messages that express agreement or disagreement between verbal information and eye motion.

C. Intent Expression for the Mascot Robot System

In this paper, an intent expression system using the eye robot is proposed. It makes it possible for the robot agent to communicate non-verbally and to send emotional messages such as agreement, recommendation, or sympathy (cf. Fig. 8).
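The affinity pleasure-arousal state described above can be pictured as a small data structure whose coordinates are nudged by speech-category events. The following is a minimal sketch under stated assumptions: the coordinate range of ±200 follows Eq. (1) of this paper, but the `MentalityState` class name, the `apply` method, and the numeric values in `CATEGORY_UPDATES` are hypothetical illustrations (the paper derives the actual updates by fuzzy inference, not a fixed lookup table).

```python
# Illustrative sketch of the affinity pleasure-arousal state.
# The +/-200 coordinate bounds follow the paper; the category-to-update
# mapping below is an invented placeholder, not the paper's fuzzy rules.
from dataclasses import dataclass


@dataclass
class MentalityState:
    pleasure: float = 0.0   # pleasure(+) / displeasure(-), clamped to [-200, 200]
    arousal: float = 0.0    # arousal(+) / sleep(-), clamped to [-200, 200]
    affinity: float = 0.0   # third axis added for continuous mentality transitions

    def apply(self, d_pleasure: float, d_arousal: float, d_affinity: float = 0.0) -> None:
        """Shift the state by the given deltas, keeping each axis in range."""
        clamp = lambda v: max(-200.0, min(200.0, v))
        self.pleasure = clamp(self.pleasure + d_pleasure)
        self.arousal = clamp(self.arousal + d_arousal)
        self.affinity = clamp(self.affinity + d_affinity)


# Hypothetical mapping from the speech categories of Fig. 7 to state updates.
CATEGORY_UPDATES = {
    "CAT1_positive_to_robot": (40.0, 20.0, 30.0),
    "CAT2_negative_to_robot": (-40.0, 20.0, -30.0),
    "CAT3_positive_of_interlocutor": (20.0, 10.0, 10.0),
    "CAT4_negative_of_interlocutor": (-20.0, 10.0, -10.0),
    "CAT5_big_word": (0.0, 30.0, 0.0),
    "CAT6_greeting": (10.0, 10.0, 20.0),
}

state = MentalityState()
state.apply(*CATEGORY_UPDATES["CAT1_positive_to_robot"])
```

The clamping keeps the state inside the plane of Eq. (1) no matter how many events arrive, which mirrors the bounded coordinates the eye motions are read from.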

Fig. 8 The home environment for the mascot robot system

The intent expression system is designed for the mascot robot system. It gives the mascot robot the ability to express the priority of recommendations produced by the information recommendation module, and thus the degree of recommendation of the various items the robot proposes. Since the speech recognition module of the mascot robot system doesn't output a level of confidence, the proposed intent expression system uses information from the recommendation module but not potential information from the speech recognition module.

Mentality expression with the eyes is independently assigned in the affinity pleasure-arousal space. The pleasure-arousal space consists of an arousal-sleep axis for the display of liveliness and a pleasure-displeasure axis for the display of real-time friendliness. To take into consideration the variation of the mentality state during communication, an affinity axis is added, creating the affinity pleasure-arousal space. The input of the intent expression system is the priority level of the information recommendation coming from the recommendation module, together with the mentality state given by the current position within the pleasure-arousal space. The output of the intent expression system is a movement of intent expression determined from the mentality state. Fuzzy inference is used to generate the intent-expressing movement from the recommendation priority level and the mentality state.

The input is represented by coordinates in the pleasure-arousal plane S and a priority of recommendation r. S is expressed as

S = (x_pl, x_ar), −200 ≤ x_pl ≤ 200, −200 ≤ x_ar ≤ 200.  (1)

The output ΔS is a variation of S, expressed as

ΔS = (Δx_pl, Δx_ar), −50 ≤ Δx_pl ≤ 50, −50 ≤ Δx_ar ≤ 50.  (2)

For the fuzzy quantization, membership functions are defined for each element of the input. To obtain the output, the center-of-area defuzzification method is used.

Fig. 9 Pleasure-arousal plane (25 mentality states (a)-(y) arranged in a 5x5 grid between the sleep/displeasure and arousal/pleasure corners)

IV. EXPERIMENTS ON INTENT EXPRESSION WITH THE EYE ROBOT

A. Experimental Environment for Intent Expression

In the intent expression system using the eye robot, intent is expressed based on the level of recommendation generated by the information recommendation module from the input information. To determine the relationship between the expression movement and the recommendation level, evaluation experiments were performed. The experiments were based on the results of human evaluations collected via questionnaire. Recommending or not recommending a book is used as a benchmark to determine 6 evaluation levels of recommendation degree (Fig. 10).

Fig. 10 Benchmark test (user facing the eye robot and a laptop display)
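The fuzzy update step above (input S and r, output ΔS bounded as in Eq. (2), center-of-area defuzzification) can be sketched as follows. This is a hedged illustration, not the paper's actual rule base: the membership function shapes, the rule set in `intent_delta`, and all numeric breakpoints are assumptions chosen only to show the mechanism; for simplicity it computes just the arousal component Δx_ar.

```python
# Sketch of Mamdani-style fuzzy inference with center-of-area (centroid)
# defuzzification. Only the +/-200 input range and +/-50 output range come
# from the paper (Eqs. (1), (2)); the memberships and rules are invented.

def tri(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peak 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def centroid(rules, lo=-50.0, hi=50.0, steps=200):
    """Center-of-area defuzzification over [lo, hi].

    rules: list of (firing_strength, peak) pairs; each consequent is a
    triangle of half-width 25 centered at `peak`, clipped at the strength."""
    num = den = 0.0
    for i in range(steps + 1):
        y = lo + (hi - lo) * i / steps
        mu = max((min(w, tri(y, p - 25, p, p + 25)) for w, p in rules),
                 default=0.0)
        num += y * mu
        den += mu
    return num / den if den > 0 else 0.0

def intent_delta(x_ar, r):
    """Change of the arousal coordinate for priority r in [0, 1]."""
    low = tri(r, -0.5, 0.0, 0.6)         # low-priority recommendation
    high = tri(r, 0.4, 1.0, 1.5)         # high-priority recommendation
    sleepy = tri(x_ar, -400, -200, 100)  # robot currently calm
    lively = tri(x_ar, -100, 200, 400)   # robot already lively
    rules = [
        (min(high, sleepy), 40.0),   # high priority while calm -> wake up
        (min(high, lively), 10.0),   # already lively -> small boost
        (min(low, lively), -30.0),   # low priority while lively -> settle
        (min(low, sleepy), 0.0),     # low priority while calm -> stay calm
    ]
    return max(-50.0, min(50.0, centroid(rules)))
```

With these example rules, a high-priority recommendation delivered while the robot is near the sleep end of the axis yields a positive Δx_ar (a livelier eye motion), while a low-priority one delivered in a lively state yields a negative Δx_ar, always within the ±50 bound of Eq. (2).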

The experimental procedure is as follows.
1) The experiment's context (recommending a book) and the questionnaire are explained to the subject.
2) A recommended book is displayed, and the robot then expresses, through movement, 1 out of 20 possible mentality states determined from the pleasure-arousal space. The order of the movements is random.
3) The subject evaluates the expression movements and their variations using 6 grades.
Steps 2) and 3) are repeated for all 20 possible mentality states.

Fig. 11 Evaluation test 1 (evaluation values, grades 1-6, per area of the pleasure-arousal plane)
Fig. 12 Evaluation test 2 (evaluation values, grades 1-6, per area of the pleasure-arousal plane)

V. CONCLUSIONS

A mascot robot system is proposed for functioning in home environments and coexisting with humans, and an intent expression system is demonstrated. The mascot robot system consists of 5 eye robots, a speech recognition module, and an information recommendation module, all integrated using RT Middleware technology. The eye robot is modeled on the eyes of a five-year-old boy and has eyelids with 2 degrees of freedom and eyeballs with 3 degrees of freedom. Four of the eye robots are fixed and one is mounted on a mobile robot. For better human-robot communication, an intent expression system is proposed. This system permits the communication of non-verbal information along with the main content of the desired message. The system takes the degree of confidence as input, outputs a degree of trust, and calculates importance levels. These are combined in a pleasure-arousal space, which, besides providing a method for expressing the mentality state, also makes it possible to express intent.

The proposed system provides a user-friendly interface so that humans and robots can communicate in a natural fashion without needing to use keyboard-type input devices. This is accomplished through a speech recognition module. Compared to current information terminals, the proposed interface is user-friendly and appealing to humans, so the mascot robot system using eye robots is suitable for widespread family use.

REFERENCES
[1] C. Breazeal: Robot in Society: Friend or Appliance, Agents99 Workshop on Emotion-Based Agent Architectures, pp. 18-26, 1999.
[2] M. Senda, H. Kobayashi, T. Shiba, Y. Yamazaki: Impression Evaluation of the Realistic Face Robot, in Proc. Human Interface Symposium, pp. 507-510, 2003.
[3] H. Miwa, A. Takanishi, and H. Takanobu: Experimental Study on Robot Personality for Humanoid Head Robot, in Proc. IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, pp. 1183-1188, 2001.
[4] T. Fukuda: Human-Robot Mutual Communication System, IEEE International Workshop on Robot and Human Communication (RO-MAN), pp. 14-19, 2001.
[5] P. Ekman and W. V. Friesen: The Facial Action Coding System, Consulting Psychologists Press Inc., San Francisco, CA, 1978.
[6] A. L. Yarbus: Eye Movements and Vision, Plenum Press, New York, 1967.
[7] H. Saito and A. Mori: Visual Perception and Auditory Perception, Ohmsha, Ltd., 1999.
[8] R. A. Hinde: Non-verbal Communication, Cambridge Univ. Press, 1972.
[9] M. F. Vargas: Louder than Words, Iowa State University Press, Japanese translation, pp. 15-17, 78-107, 1987.
[10] E. R. Kandel, J. H. Schwartz, T. M. Jessell: Principles of Neural Science, Plenum Press, New York, pp. 782-800, 2000.
[11] Y. Yamazaki, F. Dong, Y. Uehara, Y. Hatakeyama, H. Nobuhara, Y. Takama, K. Hirota: Mentality Expression in Affinity Pleasure-Arousal Space Using Ocular and Eyelid Motion of Eye Robot, SCIS&ISIS 2006, 2006.
[12] H. Tada, F. Yamada, K. Fukuda: Psychology of Blinking, Kitaoji Publishing, pp. 158-206, 1991.
[13] J. A. Russell: Reading Emotions from and into Faces, in The Psychology of Facial Expression, Cambridge University Press, New York, pp. 295-320, 1997.