Physical and Affective Interaction between Human and Mental Commit Robot


Proceedings of the 2001 IEEE International Conference on Robotics & Automation, Seoul, Korea, May 21-26, 2001

Takanori Shibata, Kazuo Tanie
Institute of Intelligent Systems, National Institute of Advanced Industrial Science and Technology, Ministry of Economy, Trade and Industry

Abstract: Recent advances in robotics have been applied to automation in industrial manufacturing, with the primary purpose of optimizing practical systems in terms of objective measures such as accuracy, speed, and cost. This paper introduces research on mental commit robots, which pursues a different direction that does not depend so rigidly on such objective measures. The main goal of this research is to explore a new area in robotics, with an emphasis on human-robot interaction. In previous research, we introduced a cat robot and evaluated it by interviewing many people. The results showed that physical interaction improved subjective evaluation. However, some subjects commented severely on the structure of the cat robot when they interacted with it physically. Because of the cat robot's appearance, subjects associated it with a real cat, drawing on their own experiences and knowledge, and then compared the robot with a real cat. This paper investigates the influence of a priori knowledge on the subjective interpretation and evaluation of mental commit robots. We developed a new seal robot with the appearance of a baby harp seal. Most people neither know harp seals in detail nor have experience of interacting with them. Subjects then evaluated the seal robot while interacting with it.

1. Introduction

A human understands people or objects through interaction. The more, and the longer, they interact, the more deeply the human understands the other. Long interaction can result in attachment and a desire for further interaction. Interaction stimulates humans and generates motivation for behavior.
Objects with which humans interact include natural objects, animals, and artifacts. Studies on interaction between human beings and animals show positive effects on psychology, the development of children, and so on [1]. Artifacts that affect people mentally can be called "aesthetic objects". Such effects are subjective and cannot be measured simply in terms of objective measures. Machines are also artifacts. Unlike aesthetic objects, machines have been designed and developed as tools for human beings and evaluated in terms of objective measures [2]. However, machines that exist among and interact with humans should also be evaluated by humans in terms of their subjective measures. There are many studies on human-machine interaction. Here, we do not discuss studies on human factors in controlling machines used as tools. In other studies, machines recognize human gestures or emotions from sensory information and then act on or provide information to the human. However, modeling gestures or emotions is very difficult because these depend on the situation, the context, and the cultural background of each person. Concerning action by a machine toward a human, an artificial creature in cyberspace can give only visual and auditory information to a human. A machine with a physical body is more influential on the human mind than a virtual creature. Considerable research on autonomous robots has been carried out, with various purposes such as navigation, exploration, and delivery in structured or unstructured environments, the robots adapting to those environments. In addition, some robots have been developed to show emotional expressions through faces or gestures [9]. However, although such robots have physical bodies, most of them are not intended to interact physically with a human. We have been building animal-type robots in order to investigate human-machine interaction for designing human-friendly robots [1, 3-7].
Animal-type robots have physical bodies and behave autonomously while generating motivations by themselves. They interact with human beings physically. When we engage physically with an animal-type robot, it stimulates our affection. We then have positive emotions, such as happiness and love, or negative emotions, such as anger and fear. Through physical interaction, we develop attachment to the animal-type robot while evaluating it as intelligent or stupid by our own subjective measures. In this paper, animal-type robots that give mental value to human beings are referred to as mental commit robots.

[Fig. 1 Subjective Interpretation and Evaluation of Artifact through Interaction: diagram relating a human, an artifact (machines, gardens, statues, etc.), and the human designer through interaction, interpretation, meaning, and value, via verbal/explicit and nonverbal/implicit, dynamic and static, channels of embodiment.]

Section 2 discusses subjectivity and objectivity. Section 3 discusses subjective interpretation and evaluation of a robot through physical interaction. Section 4 explains previous research and development of mental commit robots, where we categorize robot appearance into four categories. Section 5 introduces a new seal robot, evaluates it, and discusses the influence of a priori knowledge on the models used in subjective interpretation and evaluation. Finally, Section 6 concludes this paper.

2. Objectivity and Subjectivity

Science and technology have been developed through objectivism. Because of objectivism, people can share and use scientific and technological knowledge in common. When we design machines, we need to use such objective knowledge. A machine that has high value in terms of objective measures is useful as a tool for human beings, especially for automation. However, a machine that interacts with a human is not always evaluated by such objective measures; people evaluate such a machine subjectively. Even if a machine is useless in terms of objective evaluation, some people place high subjective value on it. When we design robots that interact with human beings, we have to consider how people think of the robots subjectively. This paper deals with mental commit robots in order to investigate subjectivity for designing robots friendly to human beings.

3. Subjective Interpretation and Evaluation of Robot through Physical Interaction

When a human interacts with a robot, he perceives it through his sense organs: vision, audition, touch, taste, olfaction, and so on.
He interprets the meaning of the robot's behavior depending on his senses and using his memory and knowledge (Fig. 1). Depending on his subjective interpretation, he evaluates the robot. In the case of a robot in computer graphics (simulation), a human perceives the robot through vision and audition. Even if a precise rendering of the robot were presented, only two of the subject's modalities would be stimulated. In order to improve subjective evaluation, the number of modalities, as well as their quality, should be increased. A real robot, by contrast, has a physical body. When a human interacts with a robot physically, the human senses the robot through multiple modalities. In previous research, we investigated subjective interpretation of a robot's behaviors through psychological experiments. In the experiments, a picture of a dog was equipped with a tail with one degree of freedom (DOF), and subjects were asked to interpret the emotions of the dog by watching the wagging tail [4]. Then a simple tactile sensor was added to the system, and the tail wagged depending on how subjects stroked the sensor. In the first experiment, subjects interpreted the meaning of the wagging from visual and auditory information. In the second, subjects had tactile information in addition to vision and audition. The second experiment was much more impressive for most subjects because of the tangible physical interaction. In addition, interpretations of the emotions varied with subjects' knowledge of dogs; for example, some had experience of owning dogs. Therefore, multiple modalities are important in human-robot interaction, and a priori knowledge influences subjective interpretation.

4. Previous Research and Development of Mental Commit Robot

There are four categories in terms of the appearance of mental commit robots:

Category 1: Human
Category 2: Familiar animal kept as a pet (e.g., cat and dog)
Category 3: Non-familiar animal as a pet (e.g., seal, penguin, and whale)
Category 4: New character (artificially designed character, e.g., AIBO [8], R)

We had developed three types of mental commit robots in previous research: the first in category 4, the second in category 3, and the third in category 2.

4.1 Dog Robot in Category 4

The first was a dog robot that had visual, auditory, and tactile sensors, a tail with one DOF, and mobility by three wheels (Fig. 2) [4]. Unlike other research, we emphasized tangibility in the physical interaction between a human and a robot. For this purpose, we developed a new tactile sensor consisting of a pressure sensor and a balloon covered with artificial skin. The robot was able to sense touches such as pushing, stroking, and patting. It behaved depending on its internal state, which consisted of current input from its sensors and recurrent input from itself. A human interacting with the robot by touching or stroking obtained visual, auditory, and tactile information, and felt a softness and texture like that of real creatures. Depending on this information, the human changed his behavior. This loop can be considered a coupling between the human and the robot. Although the robot did not have an explicit emotion model, people interacting with it interpreted its behaviors as emotional.

4.2 Seal Robot (Version 1) in Category 3

The second was a seal robot (Fig. 3). The seal robot in the previous research had a simple structure in order to investigate emergent emotions through physical interaction. The robot had two legs, each driven by a servomotor and fitted with a clutch bearing at its contact point with the floor. At the front and back of its body, the robot had two supports: the front support was a caster, and the back support had a clutch bearing at its contact point with the floor.
When the robot moved both legs back and forth at the same time, it moved forward, as if crawling. When it moved the two legs back and forth alternately, it did not move forward but shook its body. When one leg was fixed at the back and the other moved back and forth, the robot turned toward the side of the fixed leg. As for the sensory system, the robot had two whiskers that sensed contact with its environment, and two pressure sensors with balloons that sensed pushing, patting, and stroking on its body. The robot had a 6811 CPU inside to control itself. Its weight was about 1. [kg]. The robot had an internal state depending on sensory information and recurrent information: the inputs to the state were weighted values of the pressure sensors, a value based on the whisker sensors, and the values of the previous state. The value of the internal state was input to a neural oscillator with two neurons, which generated the motion pattern of a leg. The phase of the two-neuron oscillator was controlled by sensory information. Though it had no explicit model of emotions, the robot had some rules to generate its motivation, to change its attention, and to control the movement of its legs. When people interacted with the robot, they interpreted the robot's behavior in different ways, using words of emotion to express what the robot was doing. As the movement of the robot depended on context, the robot's behaviors were interpreted as more complex than the number of implemented functions would suggest. This was the effect of emergent emotions. The complexity of interpretation varied between subjects because it was a subjective view.

Fig. 2 Dog Robot; Fig. 3 Seal Robot (Version 1); Fig. 4 Cat Robot

4.3 Cat Robot in Category 2

The third was a cat robot (Fig. 4) [5-7]. The cat robot had a more complex structure than the seal robot. It was built by OMRON Corp. The robot had tactile, auditory, and postural sensors to perceive human action and its environment.
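Returning briefly to Seal Robot (Version 1): its control scheme, a recurrent internal state built from weighted sensor values that drives a two-neuron neural oscillator generating the leg motion pattern, can be sketched as follows. This is only a minimal illustration using a Matsuoka-style mutually inhibiting oscillator; the class names, weights, and time constants are assumptions, since the paper does not give the actual parameters or equations.

```python
class InternalState:
    """Leaky recurrent state driven by weighted sensor inputs.
    The weights are illustrative assumptions, not the paper's values."""
    def __init__(self, w_pressure=0.5, w_whisker=0.3, w_recurrent=0.8):
        self.w_p, self.w_w, self.w_r = w_pressure, w_whisker, w_recurrent
        self.value = 0.0

    def update(self, pressure, whisker):
        # current sensory input plus recurrent input from the previous state
        self.value = self.w_p * pressure + self.w_w * whisker + self.w_r * self.value
        return self.value


class TwoNeuronOscillator:
    """Minimal two-neuron oscillator with mutual inhibition and adaptation
    (Matsuoka-style); its output stands in for the motion pattern of one leg."""
    def __init__(self, tau=0.1, tau_adapt=0.2, beta=2.5, w_inhibit=2.0, dt=0.01):
        self.tau, self.tau_a = tau, tau_adapt
        self.beta, self.w, self.dt = beta, w_inhibit, dt
        self.u = [0.1, 0.0]   # membrane states (asymmetric start breaks symmetry)
        self.v = [0.0, 0.0]   # adaptation (fatigue) states

    def step(self, drive):
        y = [max(0.0, ui) for ui in self.u]        # firing rates
        for i in range(2):
            j = 1 - i
            du = (-self.u[i] - self.w * y[j] - self.beta * self.v[i] + drive) / self.tau
            dv = (-self.v[i] + y[i]) / self.tau_a
            self.u[i] += du * self.dt
            self.v[i] += dv * self.dt
        y = [max(0.0, ui) for ui in self.u]
        return y[0] - y[1]                         # signed leg command


state = InternalState()
osc = TwoNeuronOscillator()
# stroke the robot for a while, then drive the oscillator with the result
for _ in range(20):
    drive = state.update(pressure=1.0, whisker=0.0)
leg = [osc.step(drive) for _ in range(2000)]
print(min(leg), max(leg))  # the leg command alternates in sign around zero
```

With parameters in the usual oscillatory regime for this kind of network, a constant positive drive from the internal state yields sustained alternation between the two neurons, i.e., rhythmic leg motion, and the drive level (set by how the robot is touched) modulates the resulting pattern.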
For tactile sensing, it had piezoelectric force sensors on its back and one on its head, and microswitches at its chin and cheek. The robot could recognize stroking, hitting, and touching. For audition, the robot had three microphones in its head for sound localization and speech recognition. For posture, the robot had a 3D gyro-moment sensor. Information was processed by one 32-bit RISC chip, two DSPs for sensors and motors, and one IC for speech recognition. Its weight was 1.5 [kg], including the battery. As for actuation, the cat robot had eight actuators: one for the eyelids, two for the neck, two for each front leg, and one for the tail. There was one passive joint at each front leg's ankle and two passive joints in each rear leg. The cat robot was expected to behave like a living creature. The robot had some basic behavioral patterns that mimicked those of a real cat. In addition, it had some internal states. From the viewpoint of the robot's designer, the patterns and states could be named with words of emotion when designing the software. We evaluated the cat robot by asking questions of eighty-eight people, all Japanese women who liked animals or stuffed animals, aged from their twenties to their sixties. We conducted the experiment with one subject at a time. The interview had four purposes: first, to investigate whether the cat robot could generate value in subjects' minds; second, whether physical interaction improved subjective value; third, what factors were important for subjects to regard the cat robot as lifelike; and fourth, whether subjects acknowledged the value of the cat robot's existence. First, we explained the cat robot to a subject and showed it without movement, then asked for a first impression. Next, we asked the subject to interact with the cat robot freely for about twenty minutes and then asked some questions. 66% of the subjects answered that the cat robot was very cute or cute, 91% wanted to touch it, and 81% wanted to talk to it. Some were suspicious of whether an artificial cat would behave like a real cat.
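The cat robot's design, in which internal states "named with words of emotion" select among basic behavioral patterns, can be illustrated with a small sketch. Everything here (the state names, stimuli, decay rate, thresholds, and pattern names) is a hypothetical illustration; the paper does not publish OMRON's actual implementation.

```python
# Emotion-named internal states selecting basic behavioral patterns,
# in the spirit of the cat robot's software design. All names and
# numbers below are illustrative assumptions, not the paper's values.

DECAY = 0.9  # each state decays toward zero as time passes

# how each recognized touch changes each internal state
STIMULUS_EFFECTS = {
    "stroke": {"pleasure": +0.4, "anger": -0.2},
    "hit":    {"anger": +0.6, "pleasure": -0.3},
    "touch":  {"curiosity": +0.3},
}

# basic behavioral pattern associated with each dominant state
PATTERNS = {
    "pleasure": "purr_and_close_eyes",
    "anger": "flatten_ears_and_mew",
    "curiosity": "turn_head_toward_touch",
    "neutral": "idle_breathing_motion",
}

def step(states, stimulus=None):
    """Decay all states, apply one stimulus, and pick a behavior pattern."""
    for name in states:
        states[name] *= DECAY
    if stimulus is not None:
        for name, delta in STIMULUS_EFFECTS[stimulus].items():
            states[name] = max(0.0, states[name] + delta)
    dominant = max(states, key=states.get)
    if states[dominant] < 0.2:          # nothing strong enough: stay neutral
        dominant = "neutral"
    return PATTERNS[dominant]

states = {"pleasure": 0.0, "anger": 0.0, "curiosity": 0.0}
print(step(states, "stroke"))   # -> purr_and_close_eyes
print(step(states, "hit"))      # anger now dominates
print(step(states, None))       # no stimulus: the states keep decaying
```

Because the selected pattern depends on the accumulated, decaying state rather than on the last stimulus alone, the same touch can yield different behaviors in different contexts, which is consistent with the paper's observation that behaviors are interpreted as more complex than the number of implemented functions.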
In addition, some subjects said that the cat robot's appearance was not cute at all and that it looked like just a toy. These results show that appearance is very important for the first impression. Each subject interacted with the cat robot for about thirty minutes. As the first question, we asked each subject whether the cat robot was cute. On first impression, 66% of the subjects answered positively that the cat robot was very cute or cute. After interacting with the cat robot, the proportion of positive answers increased from 66% to 74%, meaning that physical interaction improved the subjective value of the cat robot. However, the proportion who answered that the cat robot was very cute decreased from 38% to 32%, while the proportion who answered that it was cute increased from 28% to 42%. To investigate these results, we asked subjects why they changed their evaluations. The reasons for a decreased evaluation were as follows:
- The texture of the fur was not good to the touch.
- The body was harder than expected.
- The motors were noisy.
- The voice of the cat robot was not cute.
- The mouth did not open.
- Etc.
On the other hand, the reasons for an increased evaluation were as follows:
- The cat robot reacted as if it were real.
- Attachment arose after physical interaction.
- The subject played with it without getting bored.
- The cat robot made the subject want to take care of it.
- Etc.
In sum, the reasons for negative change concerned the quality of the cat robot's structure and hardware. Because subjects had extensive knowledge of real cats, they compared the robot with their memories and evaluated it severely. This means that a priori knowledge has a strong influence on subjective evaluation.

5. New Seal Robot (Version 3) in Category 3

In order to investigate the influence of a priori knowledge, we developed and evaluated a new seal robot belonging to category 3 (Fig. 5).
5.1 Specifications

Its appearance is modeled on a baby harp seal, which has white fur for the first three weeks after birth. The new seal robot was built by Sankyo Aluminum Industry. For perception, the seal robot has tactile, vision, audition, and posture sensors beneath its soft, white artificial fur; for tactile sensing, ten balloon sensors were used. For actuation, it has six actuators: one for the eyelids, two for the neck, one for each front fin, and one for the two rear fins. The weight of the seal robot is 3.4 [kg]. The robot has a behavior generation system that consists of two hierarchical layers of processes, proactive and reactive. These two layers generate three kinds of behaviors, described below: proactive, reactive, and physiological.

Fig. 5 New Seal Robot (Version 3): (a) active; (b) winking eyes

(1) Proactive behaviors: The robot has two layers for generating its proactive behaviors: a behavior-planning layer and a behavior-generation layer. Considering internal states, stimuli, desires, rhythm, and so on, the robot generates proactive behaviors.

(a) Behavior-planning layer: This layer has a state transition network based on the internal states of the robot and on the robot's desires, produced by its internal rhythm. The robot has internal states that can be named with words of emotion. Each state has a numerical level, is changed by stimulation, and decays over time. Interaction changes the internal states and creates the character of the robot. The behavior-planning layer sends basic behavioral patterns to the behavior-generation layer; these patterns include some poses and some motions. Although we call these behaviors proactive, they are very primitive compared with those of human beings; we implemented behaviors similar to those of a real seal.

(b) Behavior-generation layer: This layer generates control references for each actuator to perform the determined behavior. The control references depend on the strength of the internal states and their variation; for example, parameters change the speed of a movement and the number of repetitions of the same behavior. Therefore, although the number of basic patterns is countable, the number of emerging behaviors is uncountable, because the numerical parameters vary. This creates lifelike behavior. In addition, for attention, the behavior-generation layer adjusts the priority of reactive and proactive behaviors based on the strength of the internal states. This function contributes to situated behavior and makes it difficult for a subject to predict the robot's actions.

(c) Long-term memory: The robot has a reinforcement learning function. It puts a positive value on preferable stimulation, such as being stroked.
It also puts a negative value on undesirable stimulation, such as being beaten. The robot assigns values to the relationships between stimulation and behaviors, so it can gradually be shaped toward the behaviors its owner prefers.

(2) Reactive behaviors: The seal robot reacts to sudden stimulation. For example, when it suddenly hears a loud sound, the robot pays attention and looks in that direction. There are several patterns of stimulation-reaction pairs, assumed to correspond to conditioned and unconscious behaviors.

(3) Physiological behaviors: The robot has a daily rhythm and some spontaneous desires based on it, such as sleep.

5.2 Evaluation

Forty people were interviewed (Fig. 6). Their ages ranged from their twenties to their fifties. Nobody had experience of interacting with a real baby harp seal, though one subject knew them very well. Before contact with the seal robot, 75% said the seal robot was cute and 90% wanted to touch it. They interacted with the seal robot for about ten minutes. Afterwards, more than 90% said the seal robot was cute, and 80% said that they felt relaxed with it. Among the positive comments were that the body of the seal robot was soft and its texture comfortable, and that the winking eyes were very adorable. Among the negative comments were that the seal robot was a little heavy, and that it would be better if it moved more actively. Since people did not have a priori knowledge of a real baby harp seal, they did not compare the seal robot with a real seal, nor did they make severe comments. Fig. 7 shows interaction between a senior woman and a seal robot. She loved the seal robot. After a while, however, she cried; the seal robot had reminded her of her dead dog.

5.3 Discussions

Comparing the evaluations of the cat robot in category 2 and the seal robot in category 3, a priori knowledge has a strong influence on subjective

interpretation and evaluation. When we design a robot with the appearance of a real animal, we have to consider the comparison between the sensed impression of the robot and the image the subject's memory associates with it. At this point, we have investigated this result only by interviewing subjects after short-term interaction with the robot; we therefore do not take into account the effects of learning in either subjects or robots. We will investigate this by monitoring long-term interaction between subjects and robots.

Fig. 6 Graphs of questionnaire results: "Do you know the seal?" (precisely: 1, yes: 33, a little: 5, no: 1); "Do you want to touch?" (Yes/No); "Is the seal robot cute?" (Yes/No); "Are you comfortable with the seal robot?" (Yes/No).

6. Conclusions

We categorized the models of robots into four categories and have developed mental commit robots in categories 2, 3, and 4. This paper introduced a new seal robot in category 3 and evaluated it. Comparing this with the results of the evaluation of robots in previous research, we showed that a priori knowledge of a robot's model has a strong influence on the subjective interpretation and evaluation of mental commit robots in short-term interaction.

References
[1] T. Shibata, et al., Emotional Robot for Intelligent System - Artificial Emotional Creature Project, Proc. of 5th IEEE Int'l Workshop on ROMAN, pp. 466-471 (1996)
[2] H. Petroski, Invention by Design, Harvard University Press (1996)
[3] T. Shibata and R. Irie, Artificial Emotional Creature for Human-Robot Interaction - A New Direction for Intelligent System, Proc. of the IEEE/ASME Int'l Conf. on AIM'97 (Jun. 1997), paper no. 47, 6 pages in CD-ROM Proc.
[4] T. Shibata, et al., Artificial Emotional Creature for Human-Machine Interaction, Proc. of the IEEE Int'l Conf. on SMC, pp. 2269-2274 (1997)
[5] T. Tashima, S. Saito, M. Osumi, T. Kudo and T. Shibata, Interactive Pet Robot with Emotion Model, Proc. of the 16th Annual Conf. of the RSJ, Vol. 1, pp. 11-12 (1998)
[6] T. Shibata, T. Tashima, and K. Tanie, Emergence of Emotional Behavior through Physical Interaction between Human and Robot, Proc. of the 1999 IEEE Int'l Conf. on Robotics and Automation (ICRA'99)
[7] T. Shibata, T. Tashima (OMRON Corp.), and K. Tanie, Subjective Interpretation of Emotional Behavior through Physical Interaction between Human and Robot, Proc. of the IEEE Int'l Conf. on Systems, Man, and Cybernetics, pp. 24-29 (1999)
[8] M. Fujita and K. Kageyama, An Open Architecture for Robot Entertainment, Proc. of Agents '97 (1997)
[9] J. M. Hollerbach, Entertainment Robots, Australian Conf. on Robotics & Automation, Brisbane (1999)

Fig. 7 Interaction between Seal Robot and Senior People