Robot Personality based on the Equations of Emotion defined in the 3D Mental Space

Proceedings of the 2001 IEEE International Conference on Robotics & Automation, Seoul, Korea, May 21-26, 2001

Robot Personality based on the Equations of Emotion defined in the 3D Mental Space

Hiroyasu Miwa*, Tomohiko Umetsu*, Atsuo Takanishi**,***, Hideaki Takanobu**
* Graduate School of Science and Engineering, Waseda University
** Department of Mechanical Engineering, Waseda University
*** Humanoid Robotics Institute, Waseda University
#59-308, 3-4-1 Ookubo, Shinjuku-ku, Tokyo, 169-8555 Japan
Tel: +81-3-5286-3257, Fax: +81-3-5273-2209
takanisi@mn.waseda.ac.jp
http://www.takanishi.mech.waseda.ac.jp/

Abstract

The authors have been developing human-like head robots in order to develop new head mechanisms and functions for a humanoid robot that can communicate naturally with humans by expressing human-like emotion. When humans communicate through auditory or visual sensations, they estimate the other's mental state not only from the information carried by each sensation but also through their mental model. The robot's mental model consists of a 3D mental space with a pleasantness axis, an activation axis and a certainty axis. When the robot senses a stimulus, a mental vector moves in this mental space based on the Equations of Emotion. We also propose a Robot Personality, which consists of the Sensing Personality and the Expression Personality, and we assigned and loaded various Sensing Personalities.

1. Introduction

At present, most practical robots are industrial robots used in manufacturing. When humans use industrial robots, they must define the robot's behavior through a complex process and method, and whenever the robot needs to perform new actions or tasks, the operator must reprogram it, which requires knowing the robot programming language. People will not adopt such robots for wide use as personal robots in the future, because the complexity of robot programming languages is beyond the ordinary person. A personal robot must therefore be able to communicate naturally with humans, reprogramming itself on the basis of that communication.

In communication among humans, personality and affinity matching are extremely important in attaining smooth and effective communication (Fig. 1.1). Each personal robot of the future will need a personality and an affinity that allow it to communicate naturally with humans, as humans do with each other. The authors have been developing human-like head robots in order to develop new head mechanisms and functions for a humanoid robot able to express emotions like a human, working toward personality matching between a human and a robot.

[Fig. 1.1 Personality matching between human and robot]

Human personality has long been a subject of research. It consists of genetic factors and environmental factors: the former are inborn, while the latter are acquired through experience and learning throughout one's life [1][2][3]. In psychology, personality consists of many components and cannot be cleanly separated into parts; in this study, however, the authors define personality narrowly and consider the Robot Personality to consist of the Sensing Personality and the Expression Personality.

The head robot is an area of active research in robotics. Brooks developed a head robot that produces facial expressions using its eyes, eyelids, eyebrows and mouth, and can communicate with humans using visual information from CCD cameras [4]. Hara developed a head robot based on fourteen of Ekman's Action Units [5][6]; it can recognize a human's facial expression and reproduce the same expression [7][8].
The authors have previously realized coordinated head-eye motion with the V.O.R. (Vestibulo-Ocular Reflex) [9][10], perception of the depth direction using the angle of convergence between the two eyes [11], adjustment to the brightness of an object with the eyelids [12], and human-like expression by the eyebrows, lips and jaw [13].
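As an aside on the depth-from-convergence idea of [11], the geometry reduces to a one-line formula under a symmetric-fixation assumption. The sketch below is illustrative only; the formula's symmetry assumption and the baseline value are not taken from the paper.

```python
import math

def depth_from_vergence(baseline_m: float, vergence_rad: float) -> float:
    """Estimate target depth from the convergence angle between two eyes.

    Assumes symmetric fixation: each eye rotates inward by half the
    vergence angle, so depth = (baseline / 2) / tan(vergence / 2).
    """
    return (baseline_m / 2.0) / math.tan(vergence_rad / 2.0)

# Example: eyes 12 cm apart (illustrative), converging by 7 degrees.
print(f"{depth_from_vergence(0.12, math.radians(7.0)):.2f} m")  # ~0.98 m
```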

Moreover, we realized auditory sensation and cutaneous sensation as human-like sensations [14], and we introduced the Equations of Emotion and a mental model with three independent parameters for the robot [13][14]. In this paper, we describe a new robot personality that expresses more human-like emotional behavior than our previous robots.

2. Robot Personality

In communication among humans, personality and affinity matching are extremely important for producing smooth and effective communication. The personal robots of the future will need personalities and affinities in order to communicate naturally with humans. When different people sense the same stimuli, the changes in their mental states differ; and even when their mental states are the same, their expressions differ. Therefore, we introduced a Robot Personality, consisting of a Sensing Personality and an Expression Personality located before and after the dynamic mental model, respectively. When the robot senses stimuli, it expresses a facial expression and a neck motion in accordance with the flow shown in Fig. 2.1. We installed the dynamic mental model and the Robot Personality in the robot, which is controlled by this flow.

[Fig. 2.1 Basic Information Processing Structure: stimulus → recognition → Sensing Personality → dynamic mental model → Expression Personality → facial/neck motion expression]

We defined the mental space shown in Fig. 2.2 as the mental model of the robot. We consider that the structure of the mental space is not changed by the robot's mechanism or personality; it has an invariable structure.

[Fig. 2.2 3D Mental Space: pleasantness (pleasant-unpleasant), activation (arousal-asleep) and certainty (certain-uncertain) axes]

2.1 Stimulus Preprocessing

WE-3RIV senses stimuli from the environment with the sensors shown in Table 3.2. The robot converts the information of the external stimuli into the information of a concrete sensation that affects its mental state.

2.2 Dynamic Mental Model

A 3D mental space, consisting of a pleasantness axis, an activation axis and a certainty axis, is defined in WE-3RIV. The mental space of WE-3RIII was a two-plane space whose certainty axis took only the two values +1 or -1 [14]; in WE-3RIV, the certainty value is continuous, ranging from -1 to +1. WE-3RIV's 3D mental space is illustrated in Fig. 2.2. The vector M, named the Mental Vector, expresses the mental state of WE-3RIV and evolves according to the Equations of Emotion (2.1) [14]:

    M_{t+Δt} = M_t + ΔM
    M_t = (a_t, p_t, c_t)
    ΔM = (Δa, Δp, Δc)    (2.1)

We defined the emotions of WE-3RIV in Equation (2.2) and mapped seven emotions onto regions of the 3D mental space, as shown in Fig. 2.3. Since Surprise can occur when the activation level rises sharply, we defined Surprise as the large region of high activation. WE-3RIV determines its emotion from the region through which the Mental Vector M_t is passing.

    E = {Happiness, Anger, Disgust, Fear, Sadness, Surprise, Neutral},  emotion ∈ E    (2.2)

[Fig. 2.3 Emotional Space and Region Mapping of Emotions]
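To make the model concrete, here is a minimal Python sketch (not from the paper) of a mental vector updated per Equation (2.1) and mapped to an emotion region as in Fig. 2.3. The region boundaries are illustrative assumptions; the paper defines the regions only graphically.

```python
from dataclasses import dataclass

@dataclass
class MentalVector:
    a: float  # activation
    p: float  # pleasantness
    c: float  # certainty

    def update(self, da: float, dp: float, dc: float) -> None:
        # Equation (2.1): M_{t+dt} = M_t + dM, each axis clamped to [-1, 1].
        self.a = max(-1.0, min(1.0, self.a + da))
        self.p = max(-1.0, min(1.0, self.p + dp))
        self.c = max(-1.0, min(1.0, self.c + dc))

def emotion_of(m: MentalVector) -> str:
    """Map the mental vector to one of the seven emotions of Equation (2.2).

    The thresholds are illustrative placeholders; the paper gives the
    regions only in Fig. 2.3. Surprise occupies the high-activation band.
    """
    if m.a > 0.8:
        return "Surprise"
    if abs(m.p) < 0.2 and abs(m.a) < 0.2:
        return "Neutral"
    if m.p > 0.2:
        return "Happiness"
    # Unpleasant half: certainty separates anger/disgust from fear/sadness.
    if m.c > 0.0:
        return "Anger" if m.a > 0.0 else "Disgust"
    return "Fear" if m.a > 0.0 else "Sadness"
```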
2.3 Sensing Personality

The Sensing Personality determines how a stimulus acts on the components of ΔM in the Equations of Emotion. The differences Δa, Δp and Δc are given by:

    Δa = f_a(S_t, a_t, p_t, c_t)
    Δp = f_p(S_t, a_t, p_t, c_t)
    Δc = f_c(S_t, a_t, p_t, c_t)
    S_t = (S_Vt, S_At, S_Tt, S_Ht, S_Ot)    (2.3)

where S_Vt is the visual sensation, S_At the auditory sensation, S_Tt the tactile sensation, S_Ht the temperature sensation, and S_Ot the olfactory sensation.

Equations (2.3) are the mapping functions of the Sensing Personality. We assigned several specific Sensing Personalities to WE-3RIV, and WE-3RIV was able to show the expressions of these various robot personalities. Table 2.1 shows the basic Sensing Personality we assigned, indicating whether each stimulus acts as a negative or a positive influence on the components of ΔM.

[Table 2.1 Stimulus and Sensing Personality with mapping functions for WE-3RIV. Stimuli: lose sight of the target, discover the target, dazzling light, target is near, peripheral vision (visual); pushed, pushed strongly, hit (tactile); loud sound (auditory); heat (temperature); ammonia, smoke (olfactory); no stimulus. The table gives the sign of each stimulus's effect on Δa, Δp and Δc; "0" means the component converges to 0. The sign entries were lost in this transcription.]

By changing the mapping functions between the stimuli and ΔM, a wide variety of Sensing Personalities can be obtained. The authors consider the Sensing Personality central to the relationship between the stimuli and the emotion, because the mental vector determines the robot's emotion. A minimal sketch of such a mapping follows below.
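The sketch assumes a linear, state-independent form: each Sensing Personality becomes a 3x5 gain matrix over the sensation vector S_t. Table 2.1 records only the sign of each influence, so the magnitudes here are invented for illustration, and the dependence of f_a, f_p, f_c on (a_t, p_t, c_t) is omitted.

```python
import numpy as np

# Sensation vector S_t = (S_V, S_A, S_T, S_H, S_O):
# visual, auditory, tactile, temperature, olfactory, each in [0, 1].
CHANNELS = ("visual", "auditory", "tactile", "temperature", "olfactory")

# One Sensing Personality as a 3x5 gain matrix; rows are (da, dp, dc).
# Signs follow the spirit of Table 2.1 (a hit is arousing and unpleasant);
# the magnitudes are illustrative assumptions.
IRRITABLE = np.array([
    [0.4, 0.3, 0.8, 0.5, 0.4],      # da: every stimulus raises activation
    [-0.1, -0.2, -0.9, -0.6, -0.5],  # dp: most stimuli are unpleasant
    [0.3, 0.1, 0.2, 0.1, 0.1],      # dc: sensing anything raises certainty
])

def sensing_personality(gains: np.ndarray, s_t: np.ndarray) -> tuple:
    """Equation (2.3): (da, dp, dc) = f(S_t) under one Sensing Personality."""
    da, dp, dc = gains @ s_t
    return da, dp, dc

# A strong hit arrives on the tactile channel only.
print(sensing_personality(IRRITABLE, np.array([0.0, 0.0, 1.0, 0.0, 0.0])))
```

Swapping in a different gain matrix is exactly the "change of mapping functions" the paper describes: the same stimulus then drives the mental vector, and hence the emotion, along a different path.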

2.4 Expression Personality

The Expression Personality is paired with the Sensing Personality within the Robot Personality, and it affects the facial expression and the neck motion. Its basic structure is similar to that of the Sensing Personality. Even when human beings share the same mental state, their expressions differ: when humans become angry, some remain calm while others express the anger clearly. A wide variety of Expression Personalities can therefore be obtained by changing the mapping functions of the Expression Personality, even while the mental state of the robot stays the same.

2.5 Facial/Neck Motion Control for Expressing Emotion

The robot outputs facial expressions and neck motions based on its emotion and the Expression Personality. We used Ekman's Six Basic Facial Expressions [5][6] for the robot's facial control and defined seven facial patterns: Happiness, Anger, Disgust, Fear, Sadness, Surprise and Neutral. The strength of each facial expression is varied by a fifty-grade proportional interpolation of the differences in actuator positions from the Neutral facial expression; a sketch of this interpolation follows below. WE-3RIV produces the facial patterns shown in Fig. 3.3. Further, WE-3RIV can use neck motions to express certain emotions: although a neck motion is generated by the stimulus the robot senses, it is modified by the emotion [14].
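The fifty-grade proportional interpolation can be sketched as linear blending between the Neutral pose and a target facial pattern in actuator space. The joint names and pose values below are hypothetical; only the fifty-grade blending scheme comes from the paper.

```python
# Each facial pattern is a set of actuator positions (values hypothetical).
NEUTRAL = {"brow_l": 0.0, "brow_r": 0.0, "lip_corner": 0.0, "jaw": 0.0}
HAPPINESS = {"brow_l": 0.2, "brow_r": 0.2, "lip_corner": 0.9, "jaw": 0.3}

def interpolate_expression(target: dict, grade: int) -> dict:
    """Blend Neutral toward a target pattern in 50 proportional grades.

    grade = 0 gives the Neutral face, grade = 50 the full expression.
    """
    t = max(0, min(50, grade)) / 50.0
    return {joint: NEUTRAL[joint] + t * (target[joint] - NEUTRAL[joint])
            for joint in NEUTRAL}

print(interpolate_expression(HAPPINESS, 25))  # half-strength Happiness
```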

3. Hardware Configuration

Fig. 3.1 and Fig. 3.2 present the hardware overview of WE-3RIV. WE-3RIV has 26 DOF (degrees of freedom), as shown in Table 3.1, and the sensors shown in Table 3.2, which serve as sensory organs for extrinsic stimuli. The following are descriptions of each part.

[Fig. 3.1 WE-3RIV (whole view)]  [Fig. 3.2 WE-3RIV (head part)]

Table 3.1 DOF Configuration
  Part      DOF
  Neck       4
  Eyes       4
  Eyelids    4
  Eyebrows   8
  Lips       4
  Jaw        1
  Lung       1
  Total     26

3.1 The Neck, Eyeballs and Eyelids

The maximum angular velocity of each axis is similar to that of a human: 160 [deg/s] for the neck, 600 [deg/s] for the eyeballs and 900 [deg/s] for the eyelids. Furthermore, the robot can blink within 0.3 [s], which is as fast as a human blink [14].

3.2 Facial Expression Mechanism with Skin Color Expression

We used the same mechanism as WE-3RIII for the lips and the eyebrows [14]. We recently added a facial color expression function to the skin, using a red EL (Electro Luminescent) sheet. The EL sheet is thin and light and does not interfere with the other devices on the skin, such as the FSR (Force Sensitive Resistor) used for tactile sensation to detect external forces such as push, hit and stroke [14]. Fig. 3.3 shows the six basic facial expressions and the Neutral facial expression performed by WE-3RIV. Moreover, we added drunken and shame facial expressions to WE-3RIV, as shown in Fig. 3.4.

[Fig. 3.3 Seven Basic Facial Expressions: (a) Happiness, (b) Anger, (c) Surprise, (d) Sadness, (e) Fear, (f) Disgust, (g) Neutral]
[Fig. 3.4 New Facial Expressions: (a) Drunken, (b) Shame]

3.3 Sensors

We added olfactory sensation to WE-3RIV, consisting of four semiconductor gas sensors; WE-3RIV can recognize the smells of alcohol, ammonia and cigarette smoke. As its visual sensation, WE-3RIV recognizes the target position in 3D space, pursues the target, and adapts to brightness; as its auditory sensation, it localizes sound in 3D space from loudness and phase differences. WE-3RIV has tactile sensation and temperature sensation as the human cutaneous sensations, and recognizes not only the magnitude of a force but also differences in touching behavior such as push, hit and stroke.

Table 3.2 Sensors
  Sensation                Device                     Quantity
  Visual                   CCD camera                    2
  Auditory                 Microphone                    2
  Cutaneous (tactile)      FSR                          28
  Cutaneous (temperature)  Temperature sensor IC         4
  Olfactory                Semiconductor gas sensor      4

3.4 Total System

Fig. 3.5 shows the total system configuration of WE-3RIV. We used four computers (PC/AT compatible); an Ethernet network connects PC1 to the other three computers. Table 3.3 shows the functions of each PC. PC1 obtains the outputs of the semiconductor gas sensors through a 12 [bit] A/D board to recognize smells, and determines WE-3RIV's mental state from the visual, auditory, cutaneous and olfactory information. PC2 controls the DC motors of the eyeballs, neck, eyelids, jaw and lung according to the visual and mental information sent from PC1; in addition, PC2 obtains information from the temperature sensors through a 12 [bit] A/D board and transmits it to PC1. PC3 controls the eyebrows, lips and facial color; to control the facial color, PC3 sends the reference input to the EL sheets through a 12 [bit] D/A board and an inverter circuit. PC4 calculates the direction of a sound from the loudness and the sound pressure difference between the right and left microphones, and transmits this information to PC1.

[Fig. 3.5 System Configuration of WE-3RIV: microphones, CCD cameras (L/R), FSR sensors, thermo sensors and a gyro feed the PCs; the eyelids, eyeballs, neck, jaw and lung are driven through servo modules, the eyebrows and lips through stepping motor drivers, and the facial color through a D/A board and inverter circuit; PC1-PC4 are connected by Ethernet]

Table 3.3 Functions of Each PC (layout reconstructed from the text)
  PC    OS          Function
  PC1   Windows 95  Visual, tactile and olfactory sensation; mental state
  PC2   MS-DOS      Neck, eyeball, eyelid, jaw and lung motion; temperature sensation
  PC3   MS-DOS      Eyebrow and lip motion; facial expression; face color
  PC4   Windows 98  Auditory sensation
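PC4's localization from the left/right sound pressure difference can be illustrated with a simple interaural-level-difference estimate. The mapping constant below is an assumption, not the paper's calibration, and the phase-difference cue WE-3RIV also uses is omitted.

```python
import math

def sound_direction_deg(p_left: float, p_right: float, k: float = 30.0) -> float:
    """Rough azimuth estimate from the left/right sound pressure ratio.

    Positive angles mean the source is to the robot's right. The constant
    k (degrees per 10 dB of level difference) is an illustrative value.
    """
    ild = 20.0 * math.log10(p_right / p_left)  # interaural level difference [dB]
    return max(-90.0, min(90.0, k * ild / 10.0))

print(sound_direction_deg(1.0, 0.8))  # louder on the left -> negative angle
```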

4. Experimental Evaluation

4.1 Sensing Personalities

The authors evaluated the effect of the Sensing Personalities. We prepared four Sensing Personalities and switched among them, assigning them by changing the tactile and olfactory entries of the Sensing Personality shown in Table 2.1. Table 4.1 shows the Sensing Personality patterns used in the experiments. The experimental procedure was as follows; a toy replay of this protocol is sketched after this section.

(1) WE-3RIV starts.
(2) Pursuing the target.
(3) Smelling the alcohol.
(4) Hitting the head.
(5) Stroking the head.
(6) Turning off the light.
(7) WE-3RIV goes to sleep.

[Table 4.1 Sensing Personality patterns No.1-No.4: the sign of each stimulus's influence on Δa, Δp and Δc; the sign entries were lost in this transcription]

We obtained the robot's mental states through these experiments. Fig. 4.1 shows the resulting trajectories of the mental vector. Because the robot's reaction changed even though it was affected by the same stimuli, we took this to mean that the robot has various personalities. Therefore, we confirmed that a wide variety of Robot Personalities can be obtained using the various Sensing Personalities.

[Fig. 4.1 Results of the Sensing Personalities Experiment: mental-vector trajectories for (a) Pattern No.1, (b) Pattern No.2, (c) Pattern No.3, (d) Pattern No.4]

4.2 Expressing a Mental State

We conducted an experiment in which WE-3RIV was affected by various stimuli (visual, auditory, cutaneous and olfactory), using the Sensing Personality shown in Table 2.1. Fig. 4.2 shows the results. We confirmed that WE-3RIV can dynamically express its mental states, which are changed by the visual, auditory, cutaneous and olfactory stimuli, through facial expressions, facial color and neck motions. Fig. 4.3 shows an example trajectory of the mental vector obtained through the experiment. Because of the added olfactory sensation, WE-3RIV can respond to more stimuli than our previous robot WE-3RIII; and by adding the facial color expression function as part of the emotional expression, the expressive ability of WE-3RIV has increased. Therefore, we confirmed that WE-3RIV is more human-friendly than WE-3RIII.

[Fig. 4.2 Experimental Evaluation: (a) Normal, (b) Surprise, (c) Anger, (d) Happiness, (e) Sadness, (f) Fear, (g) Smelling, (h) Reaction to Alcohol, (i) Reaction to Ammonia, (j) Reaction to Cigarette]
[Fig. 4.3 Resulting trajectory of the mental vector in the experiment]
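The Section 4.1 protocol can be replayed against a toy mental model to trace a mental-vector trajectory. The per-stimulus (Δa, Δp, Δc) values below are illustrative stand-ins; in the real system they come from the active Sensing Personality of Table 4.1.

```python
def clamp(x: float) -> float:
    """Keep each mental-space coordinate in [-1, 1]."""
    return max(-1.0, min(1.0, x))

# Illustrative (da, dp, dc) per protocol step; values are invented.
PROTOCOL = [
    ("discover the target", (0.3, 0.2, 0.4)),
    ("smell alcohol",       (0.2, -0.3, -0.1)),
    ("hit on the head",     (0.6, -0.8, 0.2)),
    ("stroke on the head",  (-0.1, 0.5, 0.1)),
    ("lights turned off",   (-0.5, -0.1, -0.4)),
]

a = p = c = 0.0  # start from the Neutral origin
for stimulus, (da, dp, dc) in PROTOCOL:
    a, p, c = clamp(a + da), clamp(p + dp), clamp(c + dc)
    print(f"{stimulus:20s} -> M = ({a:+.2f}, {p:+.2f}, {c:+.2f})")
```

Running the same protocol with a different delta table reproduces, in miniature, what Fig. 4.1 shows: the same stimuli trace different trajectories under different Sensing Personalities.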

5. Conclusions and Future Work

(1) We introduced the Robot Personality, consisting of the Sensing Personality and the Expression Personality, for expressing robot emotions.
(2) The human-like head robot WE-3RIV realized the expression of various robot personalities using the various Sensing Personalities which we assigned and loaded.
(3) WE-3RIV changed its mental state in response to visual, auditory, cutaneous and olfactory stimuli, and expressed it through facial expressions, facial color and neck motions.

The Expression Personality is of a single type at present. In the future, we will experiment with various Expression Personalities.

Acknowledgment

A part of this research was done at the Humanoid Robotics Institute (HRI), Waseda University. The authors would like to thank ATR (Advanced Telecommunications Research Institute International), HITACHI, Ltd., MINOLTA Co., Ltd., OKINO Industries, Ltd., SANYO ELECTRIC Co., Ltd., SHARP Corp., SMC Corp. and SONY Corp. for their financial support of HRI.

References

[1] G. W. Allport: Pattern and Growth in Personality, New York: Holt, Rinehart and Winston, 1961.
[2] Richard S. Lazarus, Alan Monat: Personality, 3rd edition, Prentice-Hall, Inc., 1979.
[3] Saichi Kurachi: Jinkaku Keisei no Shinrigaku [The Psychology of Personality Formation] (in Japanese), Kitaohji Shobo, 1986.
[4] Cynthia Breazeal, Brian Scassellati: How to Build Robots that Make Friends and Influence People, Proceedings of IROS '99, 1999.
[5] Paul Ekman, Wallace V. Friesen: Facial Action Coding System, Consulting Psychologists Press Inc., 1978.
[6] Tsutomu Kudo, P. Ekman, W. V. Friesen: Hyojo Bunseki Nyumon - Hyojo ni Kakusareta Imi wo Saguru [Introduction to Facial Expression Analysis: Exploring the Meaning Hidden in Facial Expressions] (in Japanese), Seishin Shobo, 1987.
[7] Hiroshi Kobayashi, Fumio Hara, et al.: Study on Face Robot for Active Human Interface - Mechanisms of Face Robot and Facial Expressions of 6 Basic Emotions, Journal of the Robotics Society of Japan, Vol. 12, No. 1, pp. 155-163, 1994.
[8] Hiroshi Kobayashi, Fumio Hara: Real Time Dynamic Control of 6 Basic Facial Expressions on Face Robot, Journal of the Robotics Society of Japan, Vol. 14, No. 5, pp. 677-685, 1996.
[9] Kazutaka Mitobe, et al.: Consideration of Associated Movements of Head and Eyes to Optic and Acoustic Stimulation, The Institute of Electronics, Information and Communication Engineers (IEICE), Vol. 91, pp. 81-87, 1992.
[10] Laurutis, V. P. and Robinson, D. A.: The Vestibulo-Ocular Reflex during Human Saccadic Eye Movements, J. Physiol., 373, pp. 209-233, 1986.
[11] Atsuo Takanishi, et al.: Development of an Anthropomorphic Head-Eye Robot with Two Eyes, Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 799-804, 1997.
[12] Atsuo Takanishi, et al.: Development of an Anthropomorphic Head-Eye System for a Humanoid Robot - Realization of Human-like Head-Eye Motion Using Eyelids Adjusting to Brightness, Proceedings of the IEEE International Conference on Robotics and Automation, 1998.
[13] Atsuo Takanishi, et al.: Development of an Anthropomorphic Head-Eye Robot WE-3RII with an Autonomous Facial Expression Mechanism, Proceedings of the IEEE International Conference on Robotics and Automation, pp. 3255-3260, 1999.
[14] Atsuo Takanishi, et al.: An Anthropomorphic Head-Eye Robot Expressing Emotions based on Equations of Emotion, Proceedings of the IEEE International Conference on Robotics and Automation, pp. 2243-2249, 2000.